File Server Migration Source and Target Data Validation

Does anyone know of a PowerShell script/CLI command or some other way to verify source and target data after a file server migration? I want to make sure that the shares migrated from the source to the target are an exact match. Thank you.

Hi,
An example is provided in this article:
http://blogs.technet.com/b/heyscriptingguy/archive/2011/10/08/easily-compare-two-folders-by-using-powershell.aspx
$fso = Get-ChildItem -Recurse -path C:\fso
$fsoBU = Get-ChildItem -Recurse -path C:\fso_BackUp
Compare-Object -ReferenceObject $fso -DifferenceObject $fsoBU
Robocopy can also do this job, using the /L (list only) and /log:file parameters.
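Note that Compare-Object over two Get-ChildItem listings only compares the directory listings (by name), not file contents, so it will catch missing or extra files but not files that were corrupted or changed in transit. If you also want to verify contents, comparing file hashes is more thorough. A minimal sketch, assuming PowerShell 4.0 or later (for Get-FileHash) and using C:\fso and C:\fso_BackUp as example paths:

# Hash every file under each root, keyed by its path relative to that root
$srcRoot = 'C:\fso'
$dstRoot = 'C:\fso_BackUp'
$src = Get-ChildItem -Recurse -File -Path $srcRoot | ForEach-Object {
    [pscustomobject]@{
        RelPath = $_.FullName.Substring($srcRoot.Length)
        Hash    = (Get-FileHash -Path $_.FullName -Algorithm SHA256).Hash
    }
}
$dst = Get-ChildItem -Recurse -File -Path $dstRoot | ForEach-Object {
    [pscustomobject]@{
        RelPath = $_.FullName.Substring($dstRoot.Length)
        Hash    = (Get-FileHash -Path $_.FullName -Algorithm SHA256).Hash
    }
}
# Empty output means the two trees match in both names and contents
Compare-Object -ReferenceObject $src -DifferenceObject $dst -Property RelPath, Hash

If the data lives on two different servers, the same idea works against UNC paths, and you can also check that the share definitions themselves match. Something like the following should work on Windows Server 2012 or later (OLDSERVER and NEWSERVER are placeholders):

# Compare share names and paths between the old and the new server
Compare-Object (Get-SmbShare -CimSession OLDSERVER | Select-Object Name, Path) `
               (Get-SmbShare -CimSession NEWSERVER | Select-Object Name, Path) -Property Name, Path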

Similar Messages

  • How to create same file names for source and target

    Hi,
    Can anybody send the procedure for the below requirement?
    How do I create dynamic file names for a source and save the file with the same name in the target? The file name has to identify which sender sent the file, and the target file should then be sent back to the customer as a link.
    Please help me.
    Thanks

    Hi,
    See the link below:
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14 - sender file name as receiver file name
    Regards
    Chilla

  • OWB 10g -- Can't Create Database Links for Data Source and Target

    We installed the OWB 10g server components on a Unix box running an Oracle 10g (R2) database. The Design Repository is in one instance; the Runtime Repository and the target are in another instance. The OWB client component was installed on Windows XP. We created a data source module and a target module in OWB. The data source is on another Unix box running an Oracle 9i (R2) database. We tried to create database links for the data source module and the target module, respectively, but when we created and tested the DB links, they failed.
    For the database link of the data source, we got the following error message:
    Testing...
    Failed.
    SQL Exception
    Repository Error:SQL Exception..
    Class Name: CacheMediator.
    Method Name: getDDEntryFromDB.
    Repository Error Message: ORA-12170: TNS:Connect timeout occurred
    For the database link of the target, we got the following error message:
    Testing...
    Failed.
    API2215: Cannot create database link. Please contact Oracle Support with the stack trace and the details on how to reproduce it.
    Repository Error:SQL Exception..
    Class Name: oracle.wh.ui.integrator.common.RepositoryUtils.
    Method Name: createDBLink(String, String, String, String).
    Method Name: -1.
    Repository Error Message: java.sql.SQLException: ORA-00933: SQL command not properly ended.
    However, we could connect to the two databases (data source and target) using OWB's SQL*Plus utility.
    Please help us to solve this problem. Thank you.

    As I said before, the database link creation should work from within the OWB client (also in 10g).
    Regarding your issue when deploying: have you registered your target locations in the Deployment Manager, and did you first deploy your target location's connector, which points to your source?
    I myself had some problems with database link creation in the past. I can't remember exactly what they were, but it had something to do with:
    - the use of unusual characters in the database link name
    - a long domain name used as names.default_domain in my sqlnet.ora file
    What you can do is check the actual script created when deploying the database link to see if there's anything strange, and check whether executing the created script manually works or not.

  • Source and target directory file name should be same

    Hi,
    How can I generate the same file name in the target directory, without a date and timestamp?
    For example, source file name: yeswanth.txt
    and target file name also: yeswanth.txt
    Note: on the source side the file yeswanth.txt is constant, and whenever it moves to the target directory it should be overwritten.
    Reward points for useful answers.
    Thanks & Regards,
    yeswanth.

    Hi Mugdhal,
    I am getting this error when I send the same file again.
    Message processing failed. Cause: com.sap.aii.af.ra.ms.api.RecoverableException:
    Yeswanth.txt: Cannot create a file when that file already exists. : com.sap.aii.adapter.file.ftp.FTPEx: 550
    Yeswanth.txt: Cannot create a file when that file already exists.
    But here it should overwrite the existing file.
    Regards,
    Yeswanth.

  • Master Data - Source and Target Mapping

    I want to know the source and target mapping of some master data elements like plant, vendor, customer and work center.
    Where can I get it? Please provide the relevant documents or links.

    Hi,
    Check the RSOSFIELDMAP table.
    Thanks
    Reddy

  • LMDB - SLD "Source and target system have the same object server name"

    Hi,
    The system is SM 7.1 SP1.
    For some reason (the object server name changed), the job that syncs the SLD and LMDB cancels...
    I set the LMDB object server name back to its original name...
    Now I've deleted the sync configuration in solman_setup and am trying to start a new sync, but I get the following error when I try to configure the source system:
    'Source and target system have the same object server name: CISM318'
    But the names are different: 'SM3' (LMDB) and 'CISM318' (SLD).
    Any suggestions?
    best regards
    Christoph

    Problem solved...
    Apparently the sync needs some hours to finish an object server name change...
    best regards
    Christoph

  • Content server migration - loadercli, and I'm going nuts...

    We have an (old) content server database of 150 GB on 7.3.0.52 and we're trying to migrate that database to Linux x86_64. I tried the following before:
    - tried to install/run 7.3 on SuSE SLES 11 SP1 - failed (new kernel threading)
    - tried to restore the 7.3 backup on 7.5 - failed (host pages too old)
    - tried to use 'loadercli' of 7.5 (Note 962019) - failed (ASCII --> Unicode)
    - now trying to use 7.6.06.20, and I'm stuck
    I want to use pipes as transport and use loadercli on the target system (7.6).
    I created two files (according to note 962019)
    EXPORT USER
    catalog outstream pipe '/home/sqdcos/trans/COS.CAT'
    data outstream pipe '/home/sqdcos/trans/COS.DATA' PAGES
    package outstream file '/home/sqdcos/trans/COS.export'
    and
    IMPORT USER
    catalog instream pipe '/home/sqdcos/trans/COS.CAT'
    data instream pipe '/home/sqdcos/trans/COS.DATA' PAGES
    package outstream '/home/sqdcos/trans/COS.import'
    The pipes do not exist beforehand.
    I start the export, which seems to work; the COS.CAT pipe is created.
    As soon as I start the import, I get the following error message:
    IMPORT USER
    catalog instream pipe '/home/sqdcos/trans/COS.CAT'
    data instream pipe '/home/sqdcos/trans/COS.DATA' PAGES
    package outstream '/home/sqdcos/trans/COS.import'
    // M    Execute   PACKAGE  to transform  CATALOG
    // M    Import    PACKAGE x'01000000A296EB4EB45800009B16031EC842BF0100000000'
    // M    Number of TABLES   transformed : 3
    // M    Processed command is a PAGES format based copy from database
    // with ASCII catalog to database with UNICODE catalog
    // M    Execute   PACKAGE  to transform  DATA
    // M    Number of TABLES   to transform: 0
    // E -25329:    The given data file '/home/sqdcos/trans/COS.DATA' was not
    // generated using EXPORT in PAGE Format (missing table description).
    Plus, I get an additional pipe created, "COS.DATA0000".
    What am I missing here? I've been fiddling with this for hours and I can't figure out what I'm doing wrong.
    Markus

    what is the source platform (just to be able to test here)?
    Source platform is SLES 9 32-bit.
    Target is SLES 11 SP1 64-bit.
    > 50% less data volume in the target sounds strange.
    It is; data is missing.
    > What does the loader.log say - source and target - anything suspicious?
    Not really, it looks "good":
    loadercli -d COS -n xx.xx.xx.xx -u SAPR3,SAP -b COS_EXPORT.sql
    Loader protocol: '/home/sqdcos/sdb/connd266/loader/log/loader.log'
    Loader packages: '/home/sqdcos/sdb/connd266/loader/packages'
    User SAPR3 connected to database COS schema SAPR3 on 191.1.1.29.
    EXPORT USER
    catalog outstream pipe '/home/sqdcos/trans/COS.CAT'
    data outstream pipe '/home/sqdcos/trans/COS.DATA' RECORDS
    package outstream file '/home/sqdcos/trans/COS.export'
    Successfully executed:
    Total number of tables (definition) exported: 3
    Total number of tables (data)       exported: 3 (excluded: 0, failed: 0)
    loadercli -d COS -u SAPR3,SAP -b COS_IMPORT.sql
    Loader protocol: '/home/sqdcos/sdb/connd266/loader/log/loader_2011121600202813.log'
    Loader packages: '/home/sqdcos/sdb/connd266/loader/packages'
    User SAPR3 connected to database COS schema SAPR3 on local host.
    IMPORT USER
    catalog instream pipe '/home/sqdcos/trans/COS.CAT'
    data instream pipe '/home/sqdcos/trans/COS.DATA' RECORDS
    package outstream '/home/sqdcos/trans/COS.import'
    Successfully executed:
    Total number of tables (definition) imported: 3
    Total number of tables (data)       imported: 3 (excluded: 0, failed: 0)
    Could/should we use a higher version on the target system?
    Markus

  • File Server Migration from 2008 Standard to 2012 Standard across different subnets

    Hi
    I'm going to migrate a file server from Windows Server 2008 Standard to Windows Server 2012 Standard. The source and destination servers are on different subnets. According to
    http://technet.microsoft.com/en-us/library/jj863566.aspx I cannot use the Server Migration Tools built into 2012. I'm not sure if I can use the File Server Migration Toolkit 1.2.
    Also, my domain controllers are a mixture of Windows 2003, 2008 and 2008 R2, and I've upgraded the schema level to 2012 R2. Is there anything else I need to be aware of?
    Can anyone please recommend the best way to go about this migration? Is the File Server Migration Toolkit 1.2 compatible?
    The only reason I don't want to use Robocopy for this is that if I miss a small setting I will face unwanted downtime.
    I presume the Migration Toolkit will also create all the quotas etc. on the destination server.
    Thanks
    mumtaz

    Hi mumtaz, 
    You can use the File Server Migration Toolkit 1.2 to migrate a file server between the two subnets. To maintain security settings after files and folders are migrated to the target file server, the File Server Migration Wizard applies permissions that are the same as or more restrictive than they were on the source files and folders, depending on the option you select.
    However, quotas cannot be migrated by this tool, but you can export and import the quota templates using the dirquota command. Export the templates as XML on the source server and then import them on the new server:
    dirquota template export /file:C:\test.xml
    dirquota template import /file:C:\test.xml
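    (Run the export on the old server, copy the resulting XML file across, and run the import on the new server; C:\test.xml is just an example path. dirquota ships with the File Server Resource Manager role service, so FSRM needs to be installed on both servers.)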
    For more detailed information, please see:
    Template Export and Import Scenarios
    http://technet.microsoft.com/en-us/library/cc730873(WS.10).aspx
    Regards,
    Mandy

  • File server migration with Offline files involved

    Hi,
    We are planning a file server migration in the coming weeks.
    This morning, our customer came up with the good old "Oh, and I just thought about something else."
    Here's the scenario :
    - They are using 1 network drive
    - That network drive is made offline-available for all laptop users
    - Those users are spread out in several countries. No VPN connection
    - They have been working for months on their offline network drive, right in the middle of the woods, no internet connection; it was already hard enough for them to find a power supply for their laptops...
    ...nevermind
    - The day they come back to the office, the file server the network drives point to will be offline.
    Now the million-dollar question: what happens with their "dirty" files?
    Yep, exactly: the ones they changed 6 months ago, that they have no clue about if you ask them, but certainly will the day I clear the damn cache.
    My first analysis:
    - The new file server will have another name; no alias or re-use of the old name is possible (the customer doesn't want that)
    - I can't tell those laptops "hey, for that offline cache, please use this new network drive"
    So:
    >> Those users have to identify manually the files they changed while offline, copy them locally onto their machines, and work that way until they come back to the main office.
    >> When they finally show up, clear the cache, make the new network drive available offline, and put back the files copied locally.
    >> If no internet connection is available in the branch office, let them work locally; it's still better than this hybrid nonsense of a 6-month offline folder "solution". If an internet connection is mainly available remotely, propose some Citrix/View/RDS setup, which is, for me, a more professional-looking solution.
    Does anyone have another (better?) idea/solution?

    Hi, 
    I suggest you ask users to connect their laptops to the internet, then start Offline Files synchronization against the old file server. After that, use
    Robocopy to copy the data from the old server to the new server. Since the offline files cache cannot be re-targeted to the new file server, the data needs to be synchronized first.
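    For the copy step, a minimal Robocopy sketch (server and share names are placeholders; adjust the switches to your environment):
    robocopy \\OLDSERVER\Share \\NEWSERVER\Share /MIR /COPYALL /R:2 /W:5 /LOG:C:\Logs\migration.log
    /MIR mirrors the tree (including deletions), /COPYALL preserves security, owner and auditing information, and /R and /W keep the retries short. Running the same command again with /L added lists any remaining differences without copying anything, which doubles as a validation pass.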
    If the old server cannot be brought online, as you mentioned, you might need to ask users to copy their changed files to the new file server manually.
    Regards, 
    Mandy

  • Can source and target tables be the same

    Hi there,
    I need to restate the profit center column in the delivery fact table. Historical profit centers need to be populated, since today is the first time we have brought in the Profit Center field from the LIPS table. The source for the delivery fact table is the SAP LIPS table, so what I did is:
    joined LIPS and the delivery fact table on (LIPS.delivery number = deliveryfacttable.delivery number AND LIPS.delivery line item number = deliveryfacttable.delivery line item number) to get the profit center.
    So the source and the target delivery fact table are the same. Is this good practice? Data Services warns me that the source and target tables are the same.
    Please let me know a better alternative approach to this, or a better approach to restating fields for historical data. Thanks in advance.
    Regards,
    samique

    Arun,
    Actually, LIPS resides on another server, and in order to fetch the data I have to write an R/3 ABAP data flow, so I can't use it directly in a lookup.
    What I'm thinking, based on your reply to use a lookup table, is that I will extract 3 columns (delivery number, delivery line item number and profit center) to a table and then use the lookup.
    Thanks.
    Regards,
    samique

  • Error saving map. Stored procedure returned non-zero result. Check if source and target schemas are present.

    I am using VS 2012 and BizTalk 2013 and attempting to deploy an application to BizTalk when I get these errors:
    Error 47
    at Microsoft.BizTalk.Deployment.Assembly.BtsMap.Save()
       at Microsoft.BizTalk.Deployment.Assembly.BtsArtifactCollection.Save()
       at Microsoft.BizTalk.Deployment.Assembly.BtsAssembly.Save(String applicationName)
       at Microsoft.BizTalk.Deployment.BizTalkAssembly.PrivateDeploy(String server, String database, String assemblyPathname, String applicationName)
       at Microsoft.BizTalk.Deployment.BizTalkAssembly.Deploy(Boolean redeploy, String server, String database, String assemblyPathname, String group, String applicationName, ApplicationLog log)
    Error 49
    Failed to add resource(s). Change requests failed for some resources. BizTalkAssemblyResourceManager failed to complete end type change request. Failed to deploy map "XXX.BTS2013.XXX.Maps.map_XXXX_R01_InsLabProc".
    Error saving map. Stored procedure returned non-zero result. Check if source and target schemas are present. Error saving map. Stored procedure returned non-zero result. Check if source and target schemas are present.
    Error 46
    Failed to deploy map "XXX.BTS2013.XXX.Maps.map_XXXX_R01_InsLabProc".
    Error saving map. Stored procedure returned non-zero result. Check if source and target schemas are present.
    I also tried to import an MSI file from our test environment to see if that would work... got the same errors. After spending hours (not kidding) looking for an answer, all I could find was that a hotfix would work. So I got the hotfix from Microsoft Support, applied it, then rebooted. Still getting the same errors. I'm absolutely at a standstill. Interestingly, I got this application to deploy yesterday, and then the next time I deployed it I started getting these errors. I'm ready to pull my hair out!
    Is there an answer for this out there somewhere? Any help would be appreciated.
    Thanks,
    Dave

    Hi Dave,
    Which hotfix have you applied? I don't think a hotfix for this issue is available for BizTalk 2013 yet. You should create a support ticket with Microsoft to get a solution.

  • Best practice deployment - Solaris source and target

    Hi,
    What is the recommended deployment guide for an ODI instance under Solaris? I have a Sybase source and an Oracle target, both of which are on Solaris. I plan to put my ODI master and work repositories in another Oracle DB on the Solaris target machine. Now, where does my agent sit, since my source and target are Solaris? I plan to administer ODI from my Windows clients, but:
    Where and how do I configure my agent so that I can schedule scenarios? It would make most sense to be able to run the agent on my target Solaris machine; is this possible? If not, do I have to have a separate Windows server that is used to run the agent and schedule the jobs?
    Thanks for any assistance,
    Brandon

    Thanks for the reply. I can't find anything in the installation guide about Solaris specifically, but it mentions following the instructions for "Installing the Java Agent on iSeries and AS/400" when the download OS is not supported.
    So it seems I just need to make some directories on the Solaris host, manually copy files into these directories, and, as long as a Java SDK/runtime is there, use the shell scripts (e.g. agentscheduler.sh) to start and stop the agent.
    So my question, I guess, is: since the OS-supported downloads are only Windows and Linux, where do I copy the files from, the Linux ones? Is it right to say that since these are Java programs I should be able to copy the Linux ones and use them under Solaris?
    I don't have the Solaris environment at hand to test this just yet... hence the questions...
    Thanks again

  • Control center location source and targets

    Hi,
    I'm trying to document an OWB 11.2 repository, and I'm trying to find an OWBSYS view that holds the assigned source and target locations for a control center in the design repository (what you see in the control center properties on the Locations tab). Does anyone know where I can query this information?
    Cheers,
    John

    Perhaps the table was repurposed between 10g and 11.2?
    When I query that table I see a list of modules (information_system_name) and their associated locations (location_name), but I do not know whether a location is a source or a target. A lot of these tables have been around since 9i, so I guess in 9i terms I want to see the equivalent of the module type ('data source' or 'warehouse target').
    In 11.2 the module no longer has this source/target concept directly via the module properties; it is held against the control center properties. Not that all_iv_control_centers shows this information; that is just a list of control center names and their associated remote server/workspace/workspace owner properties, which is similar to the old 9i runtime repository connection properties when using the Deployment Manager, and pretty much the same as what you see when querying all_iv_runtime_repositories.
    So, still not quite there yet...
    Cheers,
    John

  • The source and target structure have a different number of fields

    Hi,
    I am new to workflow, and I am trying to create an attachment in Workflow (SAP ECC 6.0) and pass it through to a User Decision (the User Decision works fine); however, the workflow is failing on the attachment step with 'The source and target structure have a different number of fields'. The bindings check OK. Please see the details below.
    I have used the document 'Creating Attachments to Work Items or to User Decisions in Workflows' by Ramakanth Reddy for guidance. Thanks in advance.
    1) Workflow containers (SWDD)
    WORKITEMID (import)
    ZSWR_ATT_ID (export)
    SOFM (export)
    2) Task Container (PFTC)
    1 import parameter defined - WORKITEMID (swr_struct-workitemid)
    2 Export parameters defined
    - SOFM (Ref. obj. type SOFM)
    - ZSWR_ATT_ID  (swr_att_id-doc_id)
    Binding task -> Method
    Binding for 1 parameter (import) defined
    Task <- Method
    Binding for 2 parameters (export) defined
    3) Z  BOR object created with a Method, Method Parameters and Event (SWO1)
    1 import parameter defined
    2 export parameters defined
    Method calls FM SSF_FUNCTION_MODULE_NAME, CONVERT_OTF, SCMS_XSTRING_TO_BINARY and SAP_WAPI_ATTACHMENT_ADD
    Workflow is triggered by FM SAP_WAPI_CREATE_EVENT, Return_code = 0
    Event_id = 00000000000000000001
    Test results
    A) Triggered by ABAP/ FM SAP_WAPI_CREATE_EVENT - SWI2_DIAG results
    Work item  14791: object <z bor object name> method <method name> cannot be executed. The source and target structure have a different number of fields (this message is repeated 3 times). Error handling for work item 14791. No errors occurred -> details in long text (message is repeated 3 times).
    Message no. WL821, OL383, WL050 in long text
    B) Z BOR Test method <execute>
    Enter workitem id.
    Runtime error - Data objects in Unicode programs cannot be converted. The statement "MOVE src TO dst" requires that the operands "dst" and "src" are convertible. Since this statement is in a Unicode program, the special conversion rules for Unicode programs apply.                                        
    In this case, these rules were violated.   
    Program: SAPLSWCD
    Include: LSWCDF00
    Row: 475
    Module type: (FORM)
    Module name: MOVE_CONTAINER_TO_VALUE
    C) Z BOR Test method <debugging>
    Enter workitem id.
    SAP_WAPI_ATTACHMENT_ADD, return_code = 0, message_lines  = Attachment created            
    both swc_set_element container calls work OK
    Runtime error occurs after end_method executed. Data objects in Unicode programs cannot be converted.
    D) Workflow test
    Enter workitem id <execute>
    Task started> Workflow log> Status = Error
    Workflow errors in the Attachment step (however, the Office document can be viewed in the details for the step).

    The problem has now been resolved. It was related to the use of the swr_att_id structure and the swc_set_element statement in the BOR program; it was resolved by setting the workflow container only to swr_att_id-doc_id.

  • One problem with constraints missing in source and target database....

    DB Gurus,
    In my database I am going to export and import data using Data Pump, but without specifying the CONTENT clause.
    When I started working on my application, I came to know that some constraints are missing. Now, how do I check which constraints are missing from the source database, and how do I create those missing constraints on the target database?
    Any suggestions are appreciated.

    Create a database link between the source and target schemas.
    Use all_/dba_/user_constraints to generate the difference using the MINUS operator,
    something like this:
    select owner,constraint_name,constraint_type from dba_constraints@source
    MINUS
    select owner,constraint_name,constraint_type from dba_constraints
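    Note that system-generated constraint names (SYS_Cnnnnnn) usually differ between two databases even when the constraints themselves are identical, so for NOT NULL and check constraints it is often better to compare on owner, table_name and constraint_type (or the search condition) rather than on constraint_name.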
