COMM_STRUCTURE is unknown when migrating data flow from BW 3.x to 7.4

Dear ALL,
While migrating the 2LIS_13_VDHDR data flow from 3.x to 7.x, we get the ABAP syntax error "COMM_STRUCTURE is unknown" at the InfoSource transformation level. We are currently on 7.4 SP5. The ABAP code after migration:
TYPES:
      BEGIN OF _ty_s_TG_1_full,
*      InfoObject: 0CHNGID Change Run ID.
        CHNGID           TYPE /BI0/OICHNGID,
*      InfoObject: 0RECORDTP Record type.
        RECORDTP           TYPE /BI0/OIRECORDTP,
*      InfoObject: 0REQUID Request ID.
        REQUID           TYPE /BI0/OIREQUID,
*      InfoObject: 0CALDAY Calendar Day.
        CALDAY           TYPE /BI0/OICALDAY,
*      InfoObject: 0CALMONTH Calendar Year/Month.
        CALMONTH           TYPE /BI0/OICALMONTH,
*      InfoObject: 0CALWEEK Calendar year / week.
        CALWEEK           TYPE /BI0/OICALWEEK,
*      InfoObject: 0FISCPER Fiscal year / period.
        FISCPER           TYPE /BI0/OIFISCPER,
*      InfoObject: 0FISCVARNT Fiscal year variant.
        FISCVARNT           TYPE /BI0/OIFISCVARNT,
*      InfoObject: 0BILLTOPRTY Bill-to party.
        BILLTOPRTY           TYPE /BI0/OIBILLTOPRTY,
*      InfoObject: 0COMP_CODE Company code.
        COMP_CODE           TYPE /BI0/OICOMP_CODE,
*      InfoObject: 0DISTR_CHAN Distribution Channel.
        DISTR_CHAN           TYPE /BI0/OIDISTR_CHAN,
*      InfoObject: 0DOC_CATEG Sales Document Category.
        DOC_CATEG           TYPE /BI0/OIDOC_CATEG,
*      InfoObject: 0PLANT Plant.
        PLANT           TYPE /BI0/OIPLANT,
*      InfoObject: 0SALESORG Sales Organization.
        SALESORG           TYPE /BI0/OISALESORG,
*      InfoObject: 0SALES_GRP Sales group.
        SALES_GRP           TYPE /BI0/OISALES_GRP,
*      InfoObject: 0SALES_OFF Sales Office.
        SALES_OFF           TYPE /BI0/OISALES_OFF,
*      InfoObject: 0SHIP_TO Ship-To Party.
        SHIP_TO           TYPE /BI0/OISHIP_TO,
*      InfoObject: 0SOLD_TO Sold-to party.
        SOLD_TO           TYPE /BI0/OISOLD_TO,
*      InfoObject: 0VERSION Version.
        VERSION           TYPE /BI0/OIVERSION,
*      InfoObject: 0VTYPE Value Type for Reporting.
        VTYPE           TYPE /BI0/OIVTYPE,
*      InfoObject: 0DIVISION Division.
        DIVISION           TYPE /BI0/OIDIVISION,
*      InfoObject: 0MATERIAL Material.
        MATERIAL           TYPE /BI0/OIMATERIAL,
*      InfoObject: 0SHIP_POINT Shipping point.
        SHIP_POINT           TYPE /BI0/OISHIP_POINT,
*      InfoObject: 0PAYER Payer.
        PAYER           TYPE /BI0/OIPAYER,
*      InfoObject: 0DOC_CLASS Document category /Quotation/Order/Deliver
*y/Invoice.
        DOC_CLASS           TYPE /BI0/OIDOC_CLASS,
*      InfoObject: 0DEB_CRED Credit/debit posting (C/D).
        DEB_CRED           TYPE /BI0/OIDEB_CRED,
*      InfoObject: 0SALESEMPLY Sales Representative.
        SALESEMPLY           TYPE /BI0/OISALESEMPLY,
*      InfoObject: 0SUBTOT_1S Subtotal 1 from pricing proced. for condit
*ion in stat. curr..
        SUBTOT_1S           TYPE /BI0/OISUBTOT_1S,
*      InfoObject: 0SUBTOT_2S Subtotal 2 from pricing proced. for condit
*ion in stat. curr..
        SUBTOT_2S           TYPE /BI0/OISUBTOT_2S,
*      InfoObject: 0SUBTOT_3S Subtotal 3 from pricing proced.for conditi
*on in stat. curr..
        SUBTOT_3S           TYPE /BI0/OISUBTOT_3S,
*      InfoObject: 0SUBTOT_4S Subtotal 4 from pricing proced. for condit
*ion in stat. curr..
        SUBTOT_4S           TYPE /BI0/OISUBTOT_4S,
*      InfoObject: 0SUBTOT_5S Subtotal 5 from pricing proced. for condit
*ion in stat. curr..
        SUBTOT_5S           TYPE /BI0/OISUBTOT_5S,
*      InfoObject: 0SUBTOT_6S Subtotal 6 from pricing proced. for condit
*ion in stat. curr..
        SUBTOT_6S           TYPE /BI0/OISUBTOT_6S,
*      InfoObject: 0OPORDQTYBM Open orders quantity in base unit of meas
*ure.
        OPORDQTYBM           TYPE /BI0/OIOPORDQTYBM,
*      InfoObject: 0OPORDVALSC Net value of open orders in statistics cu
*rrency.
        OPORDVALSC           TYPE /BI0/OIOPORDVALSC,
*      InfoObject: 0QUANT_B Quantity in base units of measure.
        QUANT_B           TYPE /BI0/OIQUANT_B,
*      InfoObject: 0DOCUMENTS No. of docs.
        DOCUMENTS           TYPE /BI0/OIDOCUMENTS,
*      InfoObject: 0DOC_ITEMS Number of Document Items.
        DOC_ITEMS           TYPE /BI0/OIDOC_ITEMS,
*      InfoObject: 0NET_VAL_S Net value in statistics currency.
        NET_VAL_S           TYPE /BI0/OINET_VAL_S,
*      InfoObject: 0COST_VAL_S Cost in statistics currency.
        COST_VAL_S           TYPE /BI0/OICOST_VAL_S,
*      InfoObject: 0GR_WT_KG Gross weight in kilograms.
        GR_WT_KG           TYPE /BI0/OIGR_WT_KG,
*      InfoObject: 0NT_WT_KG Net weight in kilograms.
        NT_WT_KG           TYPE /BI0/OINT_WT_KG,
*      InfoObject: 0VOLUME_CDM Volume in cubic decimeters.
        VOLUME_CDM           TYPE /BI0/OIVOLUME_CDM,
*      InfoObject: 0HDCNT_LAST Number of Employees.
        HDCNT_LAST           TYPE /BI0/OIHDCNT_LAST,
*      InfoObject: 0CRM_PROD Product.
        CRM_PROD           TYPE /BI0/OICRM_PROD,
*      InfoObject: 0CP_CATEG Category.
        CP_CATEG           TYPE /BI0/OICP_CATEG,
*      InfoObject: 0FISCYEAR Fiscal year.
        FISCYEAR           TYPE /BI0/OIFISCYEAR,
*      InfoObject: 0BP_GRP BP: Business Partner Group (from Hierarchy).
        BP_GRP           TYPE /BI0/OIBP_GRP,
*      InfoObject: 0STAT_CURR Statistics Currency.
        STAT_CURR           TYPE /BI0/OISTAT_CURR,
*      InfoObject: 0BASE_UOM Base Unit of Measure.
        BASE_UOM           TYPE /BI0/OIBASE_UOM,
*      InfoObject: 0PROD_CATEG Product Category.
        PROD_CATEG           TYPE /BI0/OIPROD_CATEG,
*      InfoObject: 0VOLUME Volume.
        VOLUME           TYPE /BI0/OIVOLUME,
*      InfoObject: 0VOLUMEUNIT Volume unit.
        VOLUMEUNIT           TYPE /BI0/OIVOLUMEUNIT,
*      InfoObject: 0FISCPER3 Posting period.
        FISCPER3           TYPE /BI0/OIFISCPER3,
*      InfoObject: 0SALES_DIST Sales District.
        SALES_DIST           TYPE /BI0/OISALES_DIST,
*      InfoObject: 0BILL_TYPE Billing type.
        BILL_TYPE           TYPE /BI0/OIBILL_TYPE,
*      InfoObject: 0MOVE_PLANT Receiving Plant/Issuing Plant.
        MOVE_PLANT           TYPE /BI0/OIMOVE_PLANT,
*      InfoObject: 0SHIP_COND Shipping conditions.
        SHIP_COND           TYPE /BI0/OISHIP_COND,
*      InfoObject: 0AB_RFBSK Status for Transfer to Accounting.
        AB_RFBSK           TYPE /BI0/OIAB_RFBSK,
*      InfoObject: 0AB_FKSTO Indicator: Document Is Cancelled.
        AB_FKSTO           TYPE /BI0/OIAB_FKSTO,
*      InfoObject: 0CUST_GRP5 Customer Group 5.
        CUST_GRP5           TYPE /BI0/OICUST_GRP5,
*      InfoObject: ZCU_COND1 Customer Condition Group 1.
        /BIC/ZCU_COND1           TYPE /BIC/OIZCU_COND1,
*      InfoObject: ZCU_COND2 Customer Condition Group 2.
        /BIC/ZCU_COND2           TYPE /BIC/OIZCU_COND2,
*      InfoObject: ZBATCHCD Batch Code.
        /BIC/ZBATCHCD           TYPE /BIC/OIZBATCHCD,
*      InfoObject: 0BATCH Batch number.
        BATCH           TYPE /BI0/OIBATCH,
*      InfoObject: ZBATCH Batch number.
        /BIC/ZBATCH           TYPE /BIC/OIZBATCH,
*      Field: RECORD Data record number.
        RECORD           TYPE RSARECORD,
      END   OF _ty_s_TG_1_full.
* Additional declaration for update rule interface
  DATA:
    MONITOR       type standard table of rsmonitor  WITH HEADER LINE,
    MONITOR_RECNO type standard table of rsmonitors WITH HEADER LINE,
    RECORD_NO     LIKE SY-TABIX,
    RECORD_ALL    LIKE SY-TABIX,
    SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS.
* global definitions from update rules
* TABLES: ...
DATA: IN    TYPE F,
      OUT   TYPE F,
      DENOM TYPE F,
      NUMER TYPE F.
* Def. of 'credit-documents': following doc.categ. are 'credit docs'
*   reversal invoice (N)
*   credit memo  (O)
*   internal credit memo (6)
* Credit-documents are delivered with negative sign. Sign is switched
* to positive to provide positive key-figures in the cube.
* The combination of characteristics DEB_CRED and DOC_CLASS provides
* a convenient way to distinguish e.g. positive incoming orders from
* order returns.
DATA: DEB_CRED(3) TYPE C VALUE 'NO6'.
FORM routine_0002
  TABLES
   P_MONITOR         structure rsmonitor
  CHANGING
    RESULT         TYPE _ty_s_TG_1_full-DOCUMENTS
    RETURNCODE     LIKE sy-subrc
    ABORT          LIKE sy-subrc
  RAISING
    cx_sy_arithmetic_error
    cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
CLEAR RESULT.
RESULT = COMM_STRUCTURE-NO_INV.
IF COMM_STRUCTURE-DOC_CATEG CA DEB_CRED.
   RESULT = RESULT * ( -1 ).
ENDIF.
RETURNCODE = 0.
  p_monitor[] = MONITOR[].
  CLEAR:
    MONITOR[].
ENDFORM.                    "routine_0002
FORM routine_0003
  TABLES
   P_MONITOR         structure rsmonitor
  CHANGING
    RESULT         TYPE _ty_s_TG_1_full-DEB_CRED
    RETURNCODE     LIKE sy-subrc
    ABORT          LIKE sy-subrc
  RAISING
    cx_sy_arithmetic_error
    cx_sy_conversion_error.
* init variables
* fill the internal table "MONITOR", to make monitor entries
  IF COMM_STRUCTURE-DOC_CATEG CA DEB_CRED.
    RESULT = 'C'.
  ELSE.
    RESULT = 'D'.
  ENDIF.
  RETURNCODE = 0.
  p_monitor[] = MONITOR[].
  CLEAR:
    MONITOR[].
ENDFORM.                    "routine_0003
Error:
E: Field "COMM_STRUCTURE-NO_INV" is unknown. It is neither in one of the
specified tables nor defined by a "DATA" statement.
The communication structure was changed to source fields after migration, but the routines are not using them. Please suggest how I can proceed. Thanks in advance for a quick reply.
Thanks & Regards
Ramesh G
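For reference, the usual fix for this error after a 3.x-to-7.x migration is that the routines must read from the transformation's SOURCE_FIELDS structure instead of the old update-rule COMM_STRUCTURE. Below is a hedged sketch of routine_0002 rewritten that way; the source-structure type name _ty_s_SC_1 and the parameter placement are assumptions, so take the actual names from your generated transformation program.

```abap
FORM routine_0002
  TABLES
    P_MONITOR         STRUCTURE rsmonitor
  USING
    SOURCE_FIELDS     TYPE _ty_s_SC_1   " assumed source-structure type name
  CHANGING
    RESULT            TYPE _ty_s_TG_1_full-DOCUMENTS
    RETURNCODE        LIKE sy-subrc
    ABORT             LIKE sy-subrc
  RAISING
    cx_sy_arithmetic_error
    cx_sy_conversion_error.

* In 7.x transformations the incoming record is exposed as
* SOURCE_FIELDS, not as COMM_STRUCTURE as in 3.x update rules.
  CLEAR RESULT.
  RESULT = SOURCE_FIELDS-NO_INV.
  IF SOURCE_FIELDS-DOC_CATEG CA DEB_CRED.
    RESULT = RESULT * ( -1 ).
  ENDIF.
  RETURNCODE = 0.
ENDFORM.
```

routine_0003 would be adjusted the same way, replacing its COMM_STRUCTURE reference with SOURCE_FIELDS.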

Hi Gareth,
You have two options:
1. Transport from BW 3.1 to BI 7.0. You'll need to create a transport route between both systems. This may cause you trouble in the future when you want to modify the objects you transported.
2. As there are few objects, you can use the XML export utility from Transport Connection. There, you create an XML file with the objects you need to transport. One thing to take care of with this option is that the Business Content objects you are exporting need to be activated in the destination system. Another limitation is that queries are not exported.
Since it's only a cube, maybe you can create the objects manually. Note that BI 7.0 has several new functionalities; I don't know how transport or XML export would handle them.
Hope this helps.
Regards,
Diego

Similar Messages

  • Migrate data flow from 3.5 to 7.3?

    Dear Experts,
    After the technical team upgraded SAP BW from 3.5 to 7.3, I tested migrating a data flow. I found that if I gave the "migration project" a name different from the DataStore Object name, I could not find the related objects (e.g. transformation or DTP) under that DataStore Object. The DataStore Object was also left in an inactive version, even though the migration finished without errors.
    For example
    - Original DSO name = AAA was showed inactive
    - Migration Project name = AAA_Migrated
    - After selecting all the objects including process chains and clicking on 'Migration/Recovery' button, status showed with no error (Migration History displayed all green)
    - recheck objects in transaction = RSA1
    - DSO name = AAA was still showed inactive
    I just wonder where all objects under DSO name = AAA were gone?
    What happened to the migration project name = AAA_Migrated?
    How should I find the migration project name = AAA_Migrated?
    How to recover all objects under DSO name = AAA? (Just in case misspelling "migration project")?
    If you have similar case mentioned above, could you share any experience how to handle this?
    Thank you very much.
    -WJ-

    BW 7.30: Data Flow Migration tool: Migrating 3.x flows to 7.3 flows and also the recovery to 3.X flow
    Regards,
    Sushant

  • Failed to disable constraints: Data Move when migrating data

    I get one "Failed to disable constraints: Data Move" error for each table that I try to migrate data for. Using 1.1.3.2766.
    * no other errors
    * no error detail available
    * no explanation given by sql-developer as to cause of problem.
    I'm wondering whether this is occurring because the migration process fails to migrate the primary keys and constraints in the first place. Is it trying to disable something that it failed to create?

    I am having the same problem.
    Version 1.2.1
    Build MAIN-32.13
    Oracle version: 10.2.0.3.0 - 64bit (on Solaris 5.10)
    Windows platform: XP Version 2002 Service Pack 2
    Is there any update on when this will be fixed, or are there workarounds?
    I have 2 (out of 1171) tables that are affected, and in both cases the first two rows are loaded and then it just stops. No errors in the errors column and the only thing in the log is Failed to disable constraints.

  • Error 140 when starting Data Flow service

    I installed Oracle BAM on Windows XP a few days ago and it was working fine. Today I tried to start it but I get this error. It says: "Contact the service vendor and refer to service specific error code 140"
    The BAM install guide says to check the system event logs. I cleared all log files in Event Viewer and confirmed that the option "Overwrite event log files as needed" is selected.
    BAM Install guide says I need to start the TNS listener and Oracle Database service first. Looking in the Windows services, I don't have these 2 (and I have never had these). I do have Oracle Lite Multiuser, OracleMTR recovery service and Oracleoracle2client cache (not sure if this is for BAM though)...
    Additional info that may help:
    System Event has this:
    The Oracle BAM Data Flow Service service terminated with service-specific error 140 (0x8C).
    Application event viewer has this:
    Invalid DataMart Service License. Please call Sagent Technical Support. Invalid DataMart Server License: E04
    I am not clear what this DataMart service is or why I am getting this error.
    Thanks

    Not sure what error you are hitting. Make sure you can reach your database from a DOS prompt using the tnsping or sqlplus commands, and that your database version matches the supported databases; BAM does not support Oracle Lite. I'm also not sure what your earlier working configuration was (did you use a remote DB?).

  • Permissions when migrating data NTFS

    I have successfully migrated data from an old file server to a new file server. The old server will be decommissioned soon, pending a final incremental ROBOCOPY.
    All data was migrated and from my spot check, all NTFS permissions are consistent.
    Are there any tools/techniques to further give me the assurance that all NTFS permissions are consistent with all the files and folders migrated?
    I have also included a robocopy line used as fyi
    robocopy \\pfdfs01\d$\data\d-drive d:\data /ZB /E /COPYALL /TEE /NP /R:1 /W:1 /LOG:c:\temp\filecopy\d-drive.log
    Vijay

    Hi,
    In addition to Robocopy, you could also use the File Server Migration Toolkit to migrate NTFS permissions. For more detailed information, please refer to the article below:
    File Server Migration Toolkit
    http://technet.microsoft.com/en-us/magazine/2006.10.utilityspotlight.aspx
    Best Regards,
    Mandy 

  • Photos app crashes when migrating data from iphoto

    Every time I open the Photos app it crashes, and the migration of the data from iPhoto doesn't complete on my iPad 3. Is there a solution?

    Have you tried repairing the Photos library - hold down the Option and Command keys and launch Photos.
    LN

  • Issue with Data flow between Unicode and Non Unicode systems

    Hello,
    I have scenario as below,
    We have a Unicode ECC 6.0 system and a UTF-7 legacy system.
    A message flows from the legacy system to the ECC 6.0 system, and the data is about 700 KB in size.
    Will there be any issue here, given that one system is Unicode and the other is not?
    Kindly let me know.
    Thanks & Regards
    Vivek

    Hi,
    To add to Mike's post...
    You indicate that your legacy system is non-Unicode and the ERP system is Unicode. You also said that the data flow is only from the legacy system to the ERP system. In this case, you should have no data issues, since the Unicode system is the receiving system. There are data issues when the data flow is in the other direction: from a Unicode system to a non-Unicode system. Here, the non-Unicode system can only process characters that exist on its codepage, and care must be taken on the sending side to ensure that only characters on the receiving system's codepage are sent (as Mike says above).
    Best Regards,
    Matt
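    The codepage constraint Matt describes can be checked inside ABAP before sending: convert the payload to the target codepage with a replacement character and see whether any replacements appear. A hedged sketch; SAP codepage '1100' (ISO-8859-1) is only an example target, substitute the receiving system's actual codepage.

    ```abap
    DATA(lv_payload) = `sample outbound text`.

    * Convert to the target codepage, replacing unmappable characters
    * instead of raising a conversion error.
    DATA(lo_conv) = cl_abap_conv_out_ce=>create(
                      encoding    = '1100'   " example non-Unicode target codepage
                      replacement = '#'
                      ignore_cerr = abap_true ).

    lo_conv->convert( EXPORTING data = lv_payload ).
    DATA(lv_raw) = lo_conv->get_buffer( ).

    * If lv_raw contains the '#' replacement byte, the payload holds
    * characters that do not exist on the receiving system's codepage.
    ```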

  • Migrating data from Win 7 Pro to a 2012 MBP current OS

    Hi,
    I'm about to buy a MBP and wondered if I need to be aware of anything when migrating data to it from a Windows 7 Pro machine.
    All my data is held on an external HDD and i was hoping to be able to plug this in to the MBP and drag and drop on to the MBP hard drive.
    Any ideas if this will work, or if anything else needs to be done?
    Cheers,
    Dan.

    clintonfrombirmingham wrote:
    What? Where's all this malware on older Mac OS's?
    Apple handles Java for OS X and until just recently was slow to update Java when there was a update from Oracle.
    Apple has denied security updates for 10.5 since the release of 10.7, until just recently they issued a massive update.
    https://support.apple.com/kb/HT1222
    The reason for the "until just recently" is because a massive Flashback botnet of over 600,000 compromised Mac's exploiting Java vulnerabilites was discovered.
    https://en.wikipedia.org/wiki/Trojan_BackDoor.Flashback
    Apple has issued late patches and malware removal for 10.5 Intel, 10.6 and 10.7, but none for 10.4 or 10.5 PPC; 10.4 and 10.5 together still make up approximately 20% of the OS X market share, though it's not clear exactly how many of those are on PPC.
    Harden your Mac against malware attacks
    So you can see, a mere two years later, if one doesn't upgrade OS X, you're out of security updates, unlike Windows, where one gets them for 10 years or more.

  • Data Flow Question in newly Migrated Business Content Install

    Hi, I just converted my Purchasing Business Content from 3.5 to 7.x. This content loads the InfoCube directly from the DataSource. After my data flow migration I am still left with an InfoSource.
    Here is a picture of my Data Flow
    http://i55.tinypic.com/258b7zs.png
    I thought I would not have an InfoSource after all of this.  Is this correct?
    Thanks for any help with this...

    Hi, Kenneth,
    I believe it's an absolutely correct result after migration.
    I had the same thing with InfoSources when migrating 0SD_C03 and never had issues with that.
    InfoSources can be used in a 7.x data flow, as it is sometimes useful to have an additional transformation between source and target (it allows more flexibility in the data transformation).
    By the way, your InfoSource was also migrated, as it's a 7.x InfoSource now. I believe it has a different object type compared to the old 3.x InfoSource. You can check the object type in SE11, table TADIR, selecting by your InfoSource name.
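    The TADIR check described above can be done with a small query. A hedged sketch; the object types below are from memory (TRCS for a 7.x InfoSource, ISTD for a 3.x one) and the InfoSource name is a placeholder, so verify both in your system:

    ```abap
    DATA lt_tadir TYPE STANDARD TABLE OF tadir.

    SELECT * FROM tadir
      INTO TABLE lt_tadir
      WHERE pgmid    = 'R3TR'
        AND object   IN ('TRCS', 'ISTD')    " assumed: 7.x vs. 3.x InfoSource types
        AND obj_name = 'YOUR_INFOSOURCE'.   " hypothetical name - replace
    ```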

  • Data Flow Migration without Infosource

    Hello All...
    I have installed a BI Content cube with its data flow. The data flow is in 3.5, i.e. with transfer rules and update rules. I want to migrate the data flow to a transformation and DTP. Is there any way I can convert the entire data flow into a 7.0 data flow without the InfoSource, i.e. with only one transformation directly from the DataSource to my cube? Is that possible using any tool, or should I do it manually?
    Regards,
    BIJESH

    Hi,
    Have you migrated the update rule?
    Once you have migrated the update rule, you will find the option of copying the transformation when you right-click on the transformation.
    At that point you provide the source and target for the transformation.
    You can find the steps for migration in the link mentioned below.
    Business Content Activation Errors - Transformations
    I hope this helps.
    Thanks,
    Kartik

  • After migrating data from Time Machine, some of my photos are not showing up in iPhoto. I get a dashed rectangle; when I click on it I get a "!" in a triangle; when I click on that, I can actually see the photo. I want to see my photos

    After migrating data from Time Machine, some of my photos are not showing up in the iPhoto Library view. I get a dashed rectangle; when I click on it I get a "!" in a triangle; when I click on that, I can actually see the photo. I want to see all my photos in Library view, and I can't figure out how these photos have been seemingly arbitrarily hidden. Help, please.

    Try these for the first attempt:
    First be sure to have a backup copy of the library if you don't already have one.
    OT

  • How data flow when SSIS packages are run on a different server than the DB server

    The scenario is that I have a dedicated SQL Server 2014 SSIS machine that executes the packages.
    The database server is a separate machine with SQL Server 2008 R2.
    1) Running SSIS packages that transfer data within SQL Server 2008 R2 (same machine)
    2) Running SSIS packages that transfer data between 2 separate SQL Server servers.
    How does the data flow in these two cases, and which resources are used where (CPU, disk, RAM, network)?
    Elias

    When you have a dedicated SSIS server, all data read flows to that server, is processed using the resources of that ETL server and then sent back over the network to the destination server.
    It doesn't matter if source and destination are the same server. If you use a data flow, all data flows over the network twice.
    The only exception is when you don't use a data flow, but only SQL statements. In that case, data flows only between source and destination.

  • When we have LSMW for migrating data, why would we use Session/Call Transaction?

    Hi Guru's,
    Could you please tell me ...
    When we have LSMW for migrating data, why would we use Session/Call Transaction for migration? With LSMW we can complete the object in less time, so why would we use Session/Call Transaction instead?
    When should we use LSMW, and when should we use Session/Call Transaction?
    thanks in advance..
    vardhan

    LSMW is not suited to uploading large amounts of data into the database, whereas the BDC Session/Call Transaction methods can handle large volumes.
    The error-capture mechanism is superior in BDC.
    BDC programs can also be scheduled to run on a periodic basis as per customer requirements.
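    As an illustration of the Call Transaction approach mentioned above, here is a minimal hedged sketch; the transaction code, program, screen number and field name are placeholders, not taken from the thread:

    ```abap
    DATA: lt_bdcdata TYPE STANDARD TABLE OF bdcdata,
          ls_bdcdata TYPE bdcdata,
          lt_msgs    TYPE STANDARD TABLE OF bdcmsgcoll.

    * First screen of the target transaction (placeholder program/screen).
    ls_bdcdata-program  = 'SAPMZXXX'.
    ls_bdcdata-dynpro   = '0100'.
    ls_bdcdata-dynbegin = 'X'.
    APPEND ls_bdcdata TO lt_bdcdata.

    * Fill one field on that screen (placeholder field name).
    CLEAR ls_bdcdata.
    ls_bdcdata-fnam = 'ZSTRUCT-FIELD1'.
    ls_bdcdata-fval = 'VALUE'.
    APPEND ls_bdcdata TO lt_bdcdata.

    CALL TRANSACTION 'ZXXX' USING lt_bdcdata
         MODE 'N'                    " background processing, no screens shown
         UPDATE 'S'                  " synchronous update
         MESSAGES INTO lt_msgs.      " capture messages per record for error handling
    ```

    The MESSAGES INTO table is what gives BDC its superior error capture: each failed record leaves a message row that can be logged or reprocessed.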

  • Data filter when migrating Oracle to SQL

    How can I use a data filter (via T-SQL or otherwise) when using SSMA to migrate data from Oracle to SQL Server?

    Hi sqlworker,
    SQL Server Migration Assistant for Oracle is designed to migrate data from Oracle to SQL Server; it can convert and synchronize the schema of a database, a particular category of the schema, and the attributes of objects. However, a data filter can only help you retrieve rows from a database table by using a WHERE clause to specify search conditions in SQL. Based on your description, we need to verify whether you want to migrate only some specific data from Oracle to SQL Server, or just use a data filter in SQL Server after migration. Please post more information for analysis.
    For more information, you can review the following articles:
    Migrating data from Oracle to SQL Server via SSMA
    Filtering Data Retrieved from Database Table in SQL
    Regards,
    Sofiya Li
    TechNet Community Support

  • Automatically update Data Flow when table column is added

    Hi, I have identical SQL Server databases on Server A and Server B, and I'm trying to create an SSIS package that will update its own Data Flow task when I've added a new column.  
    When I open the SSIS package in Data Tools after adding the column, I get the following warning: 'The external columns for OLE DB Destination are out of synchronization with the data source columns. The column "TestColumn" needs to be added to the external columns.' So the package knows when it's out of sync with the database.
    Is there a way to automatically update the external columns?
    Thank you!

    There is a dynamic data flow task available as a commercial product from CozyRoc.
    Another possibility is to generate an updated version of the package programmatically, either with BIML http://www.mssqltips.com/sqlservertip/3124/generate-multiple-ssis-packages-using-biml-and-metadata/
    or via .NET code http://msdn.microsoft.com/en-ca/library/ms345167.aspx
    Arthur
