Cross-Empire Data Migration - DST?

We are still evaluating NSM 3.0.4 Cross-Empire Data Migration...
It looks like the migration doesn't work with Dynamic Storage Technology (DST) shadow volumes...
I assumed that, since the NSM Engine and/or Agent performing the migration has to have the Novell Client installed, it would be able to migrate from both the primary and the shadow volumes...
However, our initial testing shows that it is ignoring the shadow-based data...
Is this an oversight, or the use of an incorrect/outdated API?
Does anyone else have any experience of this? We have rather a large amount of data in our shadow volumes!
Cheers
David

djbrightman,
It appears that in the past few days you have not received a response to your
posting. That concerns us, and has triggered this automated reply.
Has your problem been resolved? If not, you might try one of the following options:
- Visit http://support.novell.com and search the knowledgebase and/or check all
the other self support options and support programs available.
- You could also try posting your message again. Make sure it is posted in the
correct newsgroup. (http://forums.novell.com)
Be sure to read the forum FAQ about what to expect in the way of responses:
http://forums.novell.com/faq.php
If this is a reply to a duplicate posting, please ignore and accept our apologies
and rest assured we will issue a stern reprimand to our posting bot.
Good luck!
Your Novell Product Support Forums Team
http://forums.novell.com/

Similar Messages

  • Cross-Empire Data Migration - Mirror?

    Evaluating the eDir-to-AD Cross-Empire Data Migration process...
    Is there a copy option that allows mirroring of the source data to the target,
    i.e. the removal of files on the target that no longer exist on the source?
    We are looking at a staged migration, i.e. allowing early migration jobs to copy the bulk of the data in the days/weeks preceding the actual migration, with a final migration pass set to mirror/sync the source and target (a rough sketch of the cleanup pass we have in mind follows below).
    Hope this makes sense.
    Thanks
    David
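    To make the "mirror" semantics concrete, here is a minimal sketch of the kind of final pass being described: it walks the target tree and reports files that no longer exist on the source. This is not an NSM feature or API, just an illustration; the UNC paths and class name are invented, and it only prints rather than deletes.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.util.stream.Stream;

        // Reports files on the target that no longer exist on the source, i.e. the
        // "mirror"/cleanup part of a final sync pass. Paths are placeholders.
        public class MirrorCheck {
            public static void main(String[] args) throws IOException {
                Path source = Paths.get("\\\\oes-server\\vol1\\data");
                Path target = Paths.get("\\\\ad-server\\share\\data");
                try (Stream<Path> files = Files.walk(target)) {
                    files.filter(Files::isRegularFile)
                         .filter(t -> !Files.exists(source.resolve(target.relativize(t))))
                         .forEach(t -> System.out.println("Stale on target: " + t));
                }
            }
        }

    A real tool would pair this cleanup with copying newer source files, which is what the staged-migration question is asking NSM to do.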


  • Cross Empire Data Migration missing characters (utf8?)

    Hi. We are ready to migrate over 10,000 home directories from our Novell/NetWare platform to Windows 2008 R2.
    I am using Storage Manager 3.0.3.9 with Novell Client 2 SP2 for Windows Server 2008 R2.
    The source systems run NetWare 6.5 SP7.
    When I migrate home directories I lose all the åäö characters in my file and folder names.
    If I instead do a copy with my Novell Client in Windows Explorer on the same server, I get the complete names. So it must be something in the migration phase?
    Any clues?
    Regards
    Stefan

    Stefan,
    Send an email with a runtime config to storagemanager at novell.com. We will take a look.
    Thanks,
    NSM Development

  • Cross Empire Migration IRFs

    When performing a folder-to-folder migration (eDir to AD), all of my data migrates successfully. My top-level folder on the target receives the full rights that were assigned in eDir. However, the sub-folders do not have the IRFs (Inherited Rights Filters) that were assigned in eDir. Is this how the migration works when transferring rights, or am I doing something wrong?
    Thank you!

    It transfers all rights that are explicitly _set_ on files/folders on the NSS side;
    it does not handle IRFs (there is no direct equivalent in NTFS).
    Sadly, the documentation doesn't really mention this!
    At least this was the case with v3.0.4 when we did our migrations.
    However, it does give you the list of IRFs in use on the selected data when you
    look at the Preview Report, for example, so you know to check those manually
    after the migration (see the sketch below).
    sk
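    Since those IRF locations have to be re-checked by hand on the AD side, a small script that dumps the effective NTFS ACL of each folder flagged in the Preview Report can speed up that follow-up. This is only a sketch, under the assumption that the migrated folders are reachable as UNC paths from a Java-capable Windows host; the folder list, paths, and class name are invented, not anything NSM produces.

        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;
        import java.nio.file.attribute.AclEntry;
        import java.nio.file.attribute.AclFileAttributeView;
        import java.util.List;

        // Prints the ACL entries of each target folder so they can be compared with
        // the IRFs listed in the migration Preview Report.
        public class AclReport {
            public static void main(String[] args) throws IOException {
                List<Path> flaggedFolders = List.of(
                        Paths.get("\\\\fileserver\\share\\dept\\finance"),
                        Paths.get("\\\\fileserver\\share\\dept\\hr"));
                for (Path folder : flaggedFolders) {
                    AclFileAttributeView view =
                            Files.getFileAttributeView(folder, AclFileAttributeView.class);
                    if (view == null) {
                        System.out.println(folder + ": no ACL view available");
                        continue;
                    }
                    System.out.println("ACL entries for " + folder + ":");
                    for (AclEntry entry : view.getAcl()) {
                        System.out.println("  " + entry);
                    }
                }
            }
        }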

  • HT2518 I migrated from a PC to a MacBook Pro. The contacts transferred over to the Mac along with many contacts that I had deleted some time ago. Also, some contacts did not transfer at all. How can I perform a correct and up-to-date migration of my contacts?

    I migrated from a PC to a MacBook Pro. The contacts transferred over to the Mac along with many contacts that I had deleted some time ago. Also, some contacts did not transfer at all. How can I perform a correct and up-to-date migration of my contacts?

    Well, no reply from the community. Sniff - do I stink? Probably a little bit. If I have lost all that data then I'm really going to pong.
    Anyway, I have purchased a 1 TB USB HDD and have managed to duplicate the .dmg file onto it; however, I am still unable to mount the **** thing. So now I have ordered DiskWarrior, which will be in my possession tomorrow.
    Fingers, toes and eyes crossed.

  • Data migration via MigGUI - questions with Sync

    Due to some issues we encountered (logistics) with file compression/quotas, here's what I'd LIKE to do, but not sure if it can be done:
    1) Create a miggui project called like: VOL1
    2) But when I go in to select the items to be migrated, I don't want to select the entire volume, but rather specific "top level" directories.
    3) Start the migration during business hours
    4) Then when we're ready, kick everyone off and do a final "sync" to get the items that were missed (due to being open or whatever) the first time.
    The questions:
    a) Can I have multiple data migrations within the same project but each one defined as a specific directory? Or will I need to make multiple projects for that?
    b) The Sync only relies upon the date/timestamp, right? But on OES (with NSS) there are something like three timestamps, I think, so which one does it use/compare against?
    c) We've seen issues with people's GroupWise Archives where the miggui will see new dates/times in their Archive (index files, etc.) but a Beyond Compare CRC check will show the data is actually the same.

    Originally Posted by Ramesh
    kjhurni wrote:
    > Hi Oza,
    >
    > A few problems:
    >
    > 1) You cannot select multiple file systems, so if you only want to
    > migrate things one at a time, you have to create multiple projects in
    > order to do that.
    Multiple File systems? What is your source server? Traditional and NSS
    volumes can be dragged and dropped within the same project. On Linux,
    NCP and NSS volumes can also be done in the same project. More info on
    this please...
    >
    > 2) The file timestamps are incorrect and off by one hour when the
    > miggui runs. The server OS time and date DO show correctly, but the
    > times in the miggui in the logs and the files it's actually writing to
    > the NSS volumes are all off by one hour.
    If your source server is NetWare 5.1 then there was a bug in tsa500.nlm
    which caused files on the destination to be off by an hour. The fixed
    tsa500.NLM can be found here:
    NOVELL: Downloads
    If your source is not NetWare 5.1, then some questions:
    1. Do all the migrated files and directories have their timestamps off by 1
    hour? That is, if a file on the source has a modified time of 10-Oct-2009 12:32,
    does the file on the destination show 10-Oct-2009 11:32 after migration?
    If yes, then we'll have to investigate more...
    2. The miggui logs being off by 1 hour suggests that DST (Daylight Saving
    Time) settings are not applied on the JVM running on your server. miggui
    uses the Java API to get the local time, which in turn gets it from the JVM,
    which uses the server time. So in any case the timestamps should not
    differ, since we use only local time.
    You can perform one check to confirm whether DST is applied on the JVM.
    Download TimeTest-20070923.jar from the IBM Java site,
    IBM - Guidance on updating Java SDKs and JREs for Daylight Saving Time (DST) changes,
    execute "java -jar TimeTest-20070923.jar" at the console, and compare the result
    with the server time.
    Rgds
    Ramesh
    Multiple file systems, meaning that when you open the migration utility and click the Add button, an item appears called "File System". You can only have ONE of those per project. What I wanted was to be able to put multiple "file systems" in the project so that it would do them one at a time; that way, if we had problems with one, we could re-run just that item. Instead, I have to create multiple projects, each with a File System item added that covers a specific top-level folder. It works, it's just a lot more tedious keeping multiple projects and having to constantly re-authenticate.
    Source server is NW 6.5.7
    I've got an SR open.
    The log files in the miggui filesystem.log are off by one hour (but not on the other two servers we've done). When I looked in Windows Explorer at the file creation times (when it migrated directories and whatnot), those were also off by one hour.
    But the system time showed correctly, so that's where we're trying to figure out why and what went wrong.
    First Novell tech said it was because we had set hardware clock to Local Time, not UTC (but my other two servers are set to local time and didn't do this).
    Just thought it was odd that the system time would show one thing, but the file system would show something different. But it's been escalated to backline so hopefully we'll figure it out.
    Tech support was also under the impression that the SYNC operation in miggui did not sync files, only metadata. That would be bad if that's the case.
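    Ramesh's second point, that miggui takes its timestamps from the JVM's idea of local time, can be sanity-checked without hunting down the IBM TimeTest jar. The snippet below is a rough stand-in (the class name is invented; it is not part of miggui or the IBM tool): it prints the time zone, UTC offset, and DST state the JVM actually sees, so it can be compared against the server clock.

        import java.util.Date;
        import java.util.TimeZone;

        // Prints the time zone and DST information the JVM is using, for comparison
        // with the operating system clock on the migration server.
        public class JvmTimeCheck {
            public static void main(String[] args) {
                TimeZone tz = TimeZone.getDefault();
                Date now = new Date();
                System.out.println("JVM default time zone : " + tz.getID());
                System.out.println("Raw UTC offset (hours): " + tz.getRawOffset() / 3_600_000.0);
                System.out.println("DST applied right now : " + tz.inDaylightTime(now));
                System.out.println("Effective offset (h)  : " + tz.getOffset(now.getTime()) / 3_600_000.0);
                System.out.println("JVM local time        : " + now);
            }
        }

    If the effective offset reported here differs by an hour from what the OS shows, the one-hour skew in the miggui logs and migrated file times would be consistent with Ramesh's DST explanation.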

  • Fixed Asset Data Migration Wizard

    Hi All,
    I tried to use the Fixed Asset Data Migration Wizard to migrate my fixed asset data from 8.82 to 9.0 on our development server. The server is accessible only by me.
    When I start the migration run, the system prompts me: "To continue with the fixed asset data migration, you must be the only user logged on to this company data".
    In fact I am the only user in the company data, yet the system does not allow me to execute the migration.
    What could cause the system to think there is another user in the system?
    Please advise.
    Thanks in advance.
    Best Regards,
    Foong Yee

    Hi,
    If you're using SQL Server, try detaching the database (closing existing connections) and re-attaching it, then log in again and re-run the wizard.
    Regards,
    Donald
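    To see what the wizard may be counting as "another user", it can help to list every connection to the company database on SQL Server directly; background services or stale sessions can hold connections even when nobody else is logged in through the client. Below is a hedged sketch using the classic sysprocesses view over JDBC; the server, credentials, and database name (SBO_COMPANYDB) are placeholders, it assumes the Microsoft JDBC driver is on the classpath, and running the same SELECT in SQL Server Management Studio works just as well.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        // Lists sessions connected to the company database, so you can see what the
        // "only user logged on" check might be tripping over.
        public class SessionCheck {
            public static void main(String[] args) throws Exception {
                String url = "jdbc:sqlserver://dbserver;databaseName=master;"
                           + "user=sa;password=changeme;encrypt=false";   // placeholder credentials
                String sql = "SELECT spid, loginame, hostname, program_name "
                           + "FROM master..sysprocesses "
                           + "WHERE dbid = DB_ID('SBO_COMPANYDB')";       // hypothetical company DB name
                try (Connection con = DriverManager.getConnection(url);
                     Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery(sql)) {
                    while (rs.next()) {
                        System.out.printf("spid %d  login=%s  host=%s  program=%s%n",
                                rs.getInt("spid"), rs.getString("loginame"),
                                rs.getString("hostname"), rs.getString("program_name"));
                    }
                }
            }
        }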

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open purchase orders? What is the normal strategy for the data migration and cut-over stage?
    My client wants to know how many open purchase orders there are in the legacy system, and then decide between manual and automatic data migration. If manual, how should it be done? If automatic, how? All the materials, vendors, and plants have different numbers, so how do we track them and match the new numbers against the old ones?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy System Migration Workbench) to load MM master data (material masters), inventory (WIP, RM, FG, etc.), purchasing master data (vendors, purchase info records, source lists, quota arrangements), and purchasing transactional documents (POs, purchase requisitions, scheduling agreements, etc.). Depending on the complexity and volume of data, it may be necessary to write custom programs to load the data. You will find this out during your requirements gathering.
    It is uncommon but possible to load all of this data manually. I have never run across a client that wants to pay a consultant's hourly rate to sit at a terminal pecking away loading master data, so if the client intends to have his own users enter the data manually, the project manager should make provision for qualified, TRAINED client employees to be available for this data entry. I did once help manually with a portion of a conversion, of sales credits, but there were only about 30 SD documents to load. I did this the evening before go-live day, while I was waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For Purchase orders on the SAP side, you could use any of the standard SAP Purchasing reports, such as ME2W, ME2M, ME2C, ME2L, ME2N.  If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report.
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data. For an SAP consultant this is not very realistic (unless the legacy system is another SAP system). Most of us do not understand the workings of the myriad legacy systems. The client is usually expected to provide one or more legacy-system technical experts for you to liaise with. You normally negotiate with the technical expert about every facet of the data migration. In addition, you will liaise with business users, who will help you and the implementation team logically validate that the final solution (a turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you mentioned how to track the mapping of master data between legacy and SAP. There are many ways to do this. I normally try to get the legacy person to do the conversion on his end, e.g. when he gives you the load file, the master data has ideally already been translated and the SAP-relevant values inserted into the file. If this is not possible, I usually use MS Access databases to maintain a master map, and I perform the mapping on a PC. If your data package is small, you can probably get by with MS Excel or similar.
    Good Luck,
    DB49

  • Data migration of open production order

    Hello SAP gurus,
    Please advise me about the steps that should be followed for data migration of open production orders from a legacy system to SAP.
    Regards,
    Anand


  • Data migration

    Dear all my friends,
    Please help by sharing with me about data migration before the FI system goes live.
    I plan to migrate GL account balances, AP/AR open items, and the last 3 months of historical data.
    Please help show me: how should I do this?
    Thanks!

    1. Normal GL accounts: in this case line items are not required and only totals need to be updated in SAP. This can be done using GL fast entry if the number of accounts is not too high.
    2. Open-item-managed GL accounts: all the open line items have to be uploaded into SAP from the legacy system so that they can later be cleared. Hence an LSMW / BDC program has to be developed to upload the line items.
    Useful BAPIs:
    BAPI_ACC_DOCUMENT_POST - for GL/AP/AR
    BAPI_ACC_INVOICE_RECEIPT_POST - for AP
    BAPI_ACC_GL_POSTING_POST - for GL
    GL balances
    F-02 or RFBIBL00 or BAPI
    Dr GL a/c, Cr Data Migration Account (or Clearing Account)
    3. Customer & vendor open items: all open line items on customer and vendor accounts have to be brought into SAP. For this too an LSMW / BDC program needs to be developed to upload the line items.
    Vendor balances
    F-02 or RFBIBL00 or BAPI
    Cr Vendor a/c (individually), Dr Data Migration Account (or Clearing Account)
    Customer balances
    F-02 or RFBIBL00 or BAPI
    Dr Customer a/c (individually), Cr Data Migration Account (or Clearing Account)
    4. Asset Accounting: if Asset Accounting is implemented, the individual balances of each asset (gross value and accumulated depreciation) have to be uploaded using transaction AS91. For this you also need to develop an LSMW / BDC program. Further, you need to update the GL accounts related to the asset accounts with the totals for each asset class via transaction OASV.
    For uploading the assets:
    AS91 -> for uploading the assets in AA
    OASV -> for posting the corresponding GL entry: Dr Asset (individually), Cr Data Migration Account (or Clearing Account)
    You can use LSMW for uploading all of these.
    After uploading everything, your data migration account will come to zero (a small worked illustration follows below).
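    The "clearing account nets to zero" claim is easy to see with a toy balanced takeover. The sketch below is purely illustrative (account names and amounts invented, signed amounts standing in for debit/credit); it is not an SAP API call - the real postings would go through F-02/RFBIBL00 or the BAPIs listed above.

        import java.util.List;

        public class TakeoverBalanceCheck {
            // One posting line in a takeover document: positive = debit, negative = credit.
            record PostingLine(String account, double amount) {}

            public static void main(String[] args) {
                String clearing = "DATA-MIGRATION-CLEARING";
                List<PostingLine> takeover = List.of(
                        // Normal GL balances: Dr/Cr GL account against the clearing account
                        new PostingLine("GL-BANK", 250_000.0),      new PostingLine(clearing, -250_000.0),
                        new PostingLine("GL-EQUITY", -300_000.0),   new PostingLine(clearing, 300_000.0),
                        // Customer open items: Dr customer (individually), Cr clearing account
                        new PostingLine("CUSTOMER-0815", 60_000.0), new PostingLine(clearing, -60_000.0),
                        // Vendor open items: Cr vendor (individually), Dr clearing account
                        new PostingLine("VENDOR-4711", -40_000.0),  new PostingLine(clearing, 40_000.0),
                        // Asset values (AS91), mirrored to GL via OASV: Dr asset recon, Cr clearing
                        new PostingLine("ASSET-RECON", 30_000.0),   new PostingLine(clearing, -30_000.0));

                double clearingBalance = takeover.stream()
                        .filter(line -> line.account().equals(clearing))
                        .mapToDouble(PostingLine::amount)
                        .sum();

                // A complete, consistent takeover leaves the clearing account at zero.
                System.out.printf("Data migration clearing account balance: %.2f%n", clearingBalance);
            }
        }

    If the clearing account does not come back to zero after all loads, some balance or open item has been missed or loaded twice.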

  • Confuse on PR & PO data migration approach new to MM module

    Hi All,
    I'm pretty confused about the PO data migration approach when it comes to POs that have GR, partial GR, or GR & IR. I'm hoping someone can enlighten me. I understand that we typically don't migrate a PO once it is GR & IR; the FI team usually brings it over to the new system as a vendor open item in AP. What about PRs or POs that have gone through the release strategy fully or only partially? What is the follow-up process? I have created a criteria table below. Could someone point me in the right direction? Thanks in advance.
    PR
    Criteria                  Data migration required    Notes
    Open and Released         Y
    Open and not Released     Y
    Flag for Deletion         N
    Flag for Block            Y

    PO
    Criteria                  Data migration required    Notes
    Open and Released         Y
    Open and not Released     Y
    GR but no IR
    GR & IR                   N                          AP will bring over as open item
    Flag for Deletion         N
    Flag for Block            Y
    Partial                   Y                          For partial GR, recreate PO only with missing GR quantity
    Regards,
    John

    Hi John,
    The approach I have followed recently is to treat the PO as the base document and convert the other documents based on the PO's status. This means you first need to decide whether the PO is to be converted or not; then you can proceed to convert the related documents such as the PR, agreement, info record, source list, etc.
    Also, the open quantity for a PO should be considered for both material and service line items.
    Once a GR/SES is created, it is recorded in the PO history table EKBE with its related transaction/event type, i.e. EKBE-VGABE = 1 for GR and 9 for SES. Quantity and value are also updated for materials and services. You can compare these consumed quantities with the PO quantity (see the sketch after this reply).
    Please see below from SCN and let me know if you need more info on PR or PO conversion.
    Purchase Requisition Conversion Strategy
    Thanks,
    Akash
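    Akash's "net the history against the ordered quantity" check can be prototyped outside SAP before the extract/LSMW logic is built. The sketch below is self-contained and illustrative only: the record types are simplified stand-ins for the PO line (EKPO) and PO history (EKBE), not real SAP structures, and the data mirrors a four-line PO with one partially received line.

        import java.util.List;
        import java.util.Map;
        import java.util.stream.Collectors;

        public class OpenPoCheck {
            // Simplified stand-ins for EKPO (PO line) and EKBE (PO history).
            record PoLine(String poNumber, int item, double orderedQty) {}
            record HistoryRecord(String poNumber, int item, String vgabe, double qty) {}

            public static void main(String[] args) {
                List<PoLine> lines = List.of(
                        new PoLine("10", 1, 100), new PoLine("10", 2, 100),
                        new PoLine("10", 3, 100), new PoLine("10", 4, 100));
                // vgabe "1" = goods receipt, "9" = service entry sheet
                List<HistoryRecord> history = List.of(
                        new HistoryRecord("10", 1, "1", 100), new HistoryRecord("10", 2, "1", 100),
                        new HistoryRecord("10", 3, "1", 100), new HistoryRecord("10", 4, "1", 50));

                // Sum the received quantity per line from GR/SES events.
                Map<String, Double> received = history.stream()
                        .filter(h -> h.vgabe().equals("1") || h.vgabe().equals("9"))
                        .collect(Collectors.groupingBy(h -> h.poNumber() + "/" + h.item(),
                                Collectors.summingDouble(HistoryRecord::qty)));

                for (PoLine line : lines) {
                    double recQty = received.getOrDefault(line.poNumber() + "/" + line.item(), 0.0);
                    double openQty = line.orderedQty() - recQty;
                    System.out.printf("PO %s item %d: ordered %.0f, received %.0f, open %.0f -> %s%n",
                            line.poNumber(), line.item(), line.orderedQty(), recQty, openQty,
                            openQty > 0 ? "recreate with open quantity" : "skip / handle via AP open item");
                }
            }
        }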

  • Data Migration of FI-CA Cleared Items

    Hi,
    Currently I am working on a Data Migration project to migrate data from our FI-CA  module on ERP 4.7 to ECC 6.0.
    There has been a request from the business to migrate historical data (e.g. cleared items).
    Is there an SAP-recommended approach or tool to load this data into the target environment?
    Currently, all the documentation around SAP data migration talks about strategies for open-item data migration; however, I have seen nothing about migrating historical financial data.
    Is this because it is not recommended or technically impossible?
    Regards
    Adam Gunn

    That BAPI is typically used for straight, vanilla GL, AR, and AP postings; however, you still have to create the other side of the entry and then clear it.
    I need to be able to migrate full history, which means from an FI-CA  viewpoint:
    1. Migrate FI-CA posting of liability against BP/contract account.
    2. Migrate associated payments.
    3. And then clearing documents.
    Basically the requirement is to represent the historical data in the new system as if it was posted and matched off.
    Is there a technical way to do this?
    OR,
    Do you migrate the FI-CA liabilities, then the associated payments, and then run clearing on the target system?
    I suspect this is almost an impossible data migration requirement as development of the extraction and load process would be extremely complex and testing would take months to cover all posting scenarios in FI-CA. However, I would be interested if anyone has attempted to do this before.
    Adam

  • Data Migration from SAP r/3 to SAP R/3

    Hi all,
    What is the best method to migrate data from one version of SAP to another, say from SAP R/3 3.1 to SAP ECC 6.0?
    This is for all SAP modules, including SD, MM, PP, FI, and CO master and transaction data. I know there are a number of technologies to load the data, such as LSMW, IDoc, ALE, DI, etc., but what is the best way to extract all the data feeds to be loaded back into SAP?
    Thanks in advance.

    Take a look at the following link; it may be useful for you:
    SAP NetWeaver - Data Migration (CA-DMI) [original link is broken]
    If helpful, reward points are appreciated.

  • Data migration from legacy system to SAP

    Hi all,
    Could you please post some documents on migration from a legacy system to SAP, as well as the problems faced during migration and best practices in data migration?
    Regards
    Sridharan

    Hi,
    I require a few details from you.
    1. Which ETL tool are you using? If Informatica, it already has PowerConnect to connect to SAP, so you can create the source and target structures and also use RFCs to send data to R/3. For other ETL tools, can you prepare RFCs or some other way to send data to R/3? Let me know the tool.
    2. Does R/3 contain the master data tables? If yes, then try to use LSMW for mass upload of data to the tables.
    If your client doesn't want to use either of these options, please elaborate on what the situation is.
    Regards
    Aashish Sinha

  • Data Migration from 11i to R12 Global - Open POs, lines, receipts & on-hand upload. Is it possible to do the on-hand qty upload while overriding all receipts uploaded against open PO lines?

    Hi Friends,
    We are in a phase of data migration from 11i to R12 
    I discussed this with the client, and they want extraction of all open POs generated from 01 Jan 2014 up to the present in 11i.
    The condition for an open PO is PO qty - received qty >= 0.
    A critical example of an open PO: PO no. 10 has 4 lines; for 3 lines the full qty has been received, and for 1 line a partial qty (say 50 out of 100) has been received.
    In this case he wants PO no. 10 uploaded into R12 as an open PO with all 4 lines: receipts should be completed for the 3 fully received lines, and for the 4th line the partial qty, i.e. 50, should be received.
    The question is: if we upload the on-hand qty first, and then the open POs and receipts, it will increase the on-hand qty in the new system (a mismatch of on-hand quantities between 11i and R12); a small worked sketch of this follows below.
    Is it possible to do the on-hand qty upload while overriding all receipts uploaded against the open PO lines?
    Or please advise the best solution.
    Thanks & Regards
    Giri
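    One way to see the double counting Giri describes, and a possible adjustment, is to net the receipts that will be re-posted in R12 out of the extracted on-hand figure. The sketch below is purely illustrative (item codes and quantities invented, no Oracle APIs involved); the assumption is that you either load the reduced on-hand quantity or skip re-posting those receipts, but not both.

        import java.util.List;
        import java.util.Map;
        import java.util.stream.Collectors;

        public class OnHandAdjustment {
            // A receipt that will be re-entered in R12 against a migrated open PO line.
            record PlannedReceipt(String item, double qty) {}

            public static void main(String[] args) {
                // On-hand extracted from 11i already includes previously received stock.
                Map<String, Double> onHandFrom11i = Map.of("ITEM-A", 500.0, "ITEM-B", 120.0);

                List<PlannedReceipt> receiptsToRepost = List.of(
                        new PlannedReceipt("ITEM-A", 100.0),  // a fully received PO line re-received in R12
                        new PlannedReceipt("ITEM-A", 50.0));  // the partially received line (50 of 100)

                // Quantity that would be double counted if both the on-hand load and the receipts are posted.
                Map<String, Double> repostedQty = receiptsToRepost.stream()
                        .collect(Collectors.groupingBy(PlannedReceipt::item,
                                Collectors.summingDouble(PlannedReceipt::qty)));

                onHandFrom11i.forEach((item, qty) -> {
                    double adjustment = repostedQty.getOrDefault(item, 0.0);
                    System.out.printf("%s: extracted on-hand %.0f, receipts to re-post %.0f, adjusted on-hand load %.0f%n",
                            item, qty, adjustment, qty - adjustment);
                });
            }
        }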

    adetoye50 wrote:
    Dear Contacts Journal Support Team,
    FYI, this is a user to user support forum.  You are NOT addressing Apple here.
    Honestly, I doubt anyone is really going to take the time to read the novel you have written.

Maybe you are looking for

  • How can I get my mail to open and read documents in Firefox after upgrading to Sony Vaio Gatew 2.4.0

    I upgraded my Sony Vaio Gate software to 2.4.0.06210 on Wednesday night. Afterwards, my mail functionality is not working with Firefox. It will work with Internet Explorer, but not with Firefox when I try to do the following such as clicking on check

  • BEx Web Application or Query as iView in the Enterprise Portal

    Hi, I am integrating BW System in to Enterprise Portal. the following link shows http://help.sap.com/saphelp_nw04/helpdata/en/ab/9659400d9d1059e10000000a155106/frameset.htm Sequence of Steps We recommend that you execute the necessary steps in the fo

  • Where can I find an xdocbook?

    As suggested by the respected members of this forum, I am dispensing with my Unstructured templates and will try to create Structured templates in xdocbook. Stupid question: Is this an application, a plug-in, or a blank template that I can modify? Wh

  • ALV displays a maximum of 130 characters for field

    Hi all! I have a problem with the ALV display and Debug textbox. I have a var which has 220 of length but alv only displays the first 130. Also, debugging, this var is showed in the textbox with the first 130 characters, but if I write l_message+100

  • St text

    Hi all I want to know the steps to include standard text (created in so10)in script. Thanks