WS2012 Hyper-V CSV (Volume3 appeared as Folder)

OS version: WS2012 Std
Cluster type: CSV
Total number of nodes: 2
Total LUNs: 6 (MBR)
Connector type: FC
Number of VMs: 11
Enable MPIO: Yes
Server model: HP ProLiant DL980 G7
Antivirus: N/A
======================================================================
Hi everyone, I recently set up Windows Server 2012 Hyper-V with CSV and configured the hosts for live migration. The LUNs are configured as below:
LUN1: 500GB / 2 VMs / WS2008 R2
LUN2: 500GB / 2 VMs / WS2008 R2
LUN3: 500GB / 2 VMs / WS2008 R2
LUN4: 500GB / 2 VMs / WS2008 R2
LUN5: 500GB / 3 VMs / WS2012 Std
LUN6: 10GB / Quorum
My problem is: after the storage was added to CSV, I noticed that on Node 2, under C:\ClusterStorage, one of the volumes appears as a plain folder while the rest look fine (directory icons). When I try to access Volume3 on Node 2, a message appears saying I don't have permission to access it.
Node 1
-Volume1 /(directories icon)
-Volume2 /(directories icon)
-Volume3 / (directories icon)
-Volume4 /(directories icon)
-Volume5 /(directories icon)
=====================================================================
Node 2
-Volume1 /(directories icon)
-Volume2 /(directories icon)
-Volume3 / (Folder icon) -Weird!!
-Volume4 /(directories icon)
-Volume5 /(directories icon)
When I performed live migrations between the nodes, only Volume3 had problems moving from one node to the other (the VMs on Volume3 were unable to start).
I'd appreciate any advice from the experts here.
Thanks.

Hi,
Each node needs the proper permissions on the C:\ClusterStorage folder before it can read or access data on the CSV.
Check the Node 2 settings: make sure you did not disable SMB on the CSV network, and make sure the following network protocols are enabled:
Client for Microsoft Networks
File and Printer Sharing for Microsoft Networks
You may also remove the storage from Node 2, then re-add it and check the result.
Check the event log for CSV-related entries and post them.
For more information, please refer to the following Microsoft articles:
Cluster Shared Volume Functionality
http://technet.microsoft.com/en-us/library/ee830309(v=ws.10).aspx
Unable to start Cluster Service - Event ID 5123 ClusterStorage Access is denied
http://social.technet.microsoft.com/Forums/en-US/winserverClustering/thread/3baa96d6-8047-4479-b4cf-4f75bfa7c26b
CSV: Not accessible from the second node
http://social.technet.microsoft.com/Forums/en-US/windowsserver2008r2highavailability/thread/5b0f981e-3714-4280-97b1-73ffe31913f6
Lawrence
TechNet Community Support

Similar Messages

  • My iPod no longer appears as  folder/file in Itunes when connected.  Library, genius, playlists, etc.  are present, but there is no longer cross reference with the iPod itself.  Help!?


    I just tried it again and this time a different message showed up "The iPod 'name' cannot be synced. The required folder cannot be found." I have no clue what folder it could be talking about.

  • Hyper draw doesn't appear unless you set Hyperdraw/Articulation ID

Over the last 2 or 3 versions, Logic has had an annoying behaviour in the Score window: Hyper Draw doesn't appear unless you set Hyperdraw/Articulation ID/Any or None...
    This procedure has 3 minus points:
    1. The editing is slowed down
    2. Every time you close and reopen the score window YOU HAVE TO REDO IT!!!!
    3. What is it for?.....
    It was better before...

    For Radeon 7000, you need to:
    pacman -S xf86-video-ati
    You need to load those modules in /etc/rc.conf:
    MODULES=(agpgart via-agp)
    And in xorg.conf:
    Driver      "radeon"

  • Bought a new HD for my Mac mini, and now i can not do anything, i press com r and nothing appears, only gray screen and then appears a folder with a ? in it, can anyone help me how to solve this. Thank you


    Thank you so much for your attention, it is allready working well.

  • Hyper V CSV not working on 1 server

Hello,
we are running a 3-server Hyper-V setup with Windows Server 2008 R2 Datacenter.
We are using Cluster Shared Volumes for these 3 servers.
The servers are connected to the storage via 2 methods: iSCSI and a SAS cable.
We created 4 cluster disks and this setup worked for about a year. All three servers could see the CSVs, live migrations worked among the three servers, and all was good.
About three weeks ago, 1 server just stopped seeing the CSV drives/folders.
When you click on the CSV folder in the local drive it freezes Explorer, even though on that bad server all the validations check out: no errors, just a few warnings.
In the Event Viewer for the storage we see timeouts to the cluster and then bad-network-path errors,
but network-wise everything pings fine and has no issues.
After multiple restarts and updating drivers and storage management software, there are still only 2 nodes that work, and this last one doesn't see the storage for no apparent reason.
Can anyone help?

    Hi jackie ,
If you are running Microsoft Forefront Client Security on the cluster node, please check whether the Hyper-V exclusions are in place.
Please refer to the following link:
http://support.microsoft.com/kb/2011727
Also please check the cluster event log in the Failover Cluster Manager snap-in; for details please refer to the following link:
http://technet.microsoft.com/en-us//library/cc772342.aspx
If there are logs such as Event ID 5120 or Event ID 5142, please refer to the link below:
http://support.microsoft.com/kb/2008795
    Hope it helps
    Best Regards
    Elton Ji

  • DPM 2012 Backing up a VM on Server 2012 Hyper-V CSV Host - Not Working with Hardware VSS

    Hi All,
    I'm trying to backup a VM on a 2012 Cluster. I can do it by using the system VSS provider, but when I try to use the hardware provider, (Dell equalogic) it doesn't work. DPM will sit for a while trying and then report a retryable VSS error. 
    The only error I'm seeing on the Host is the following:
Event ID 12297: Volume Shadow Copy Service error: The I/O writes cannot be flushed during the shadow copy creation period on volume \\?\Volume{3312155e-569a-42f3-ab3a-baff892a2681}\. The volume index in the shadow copy set is 0. Error details: Open[0x00000000, The operation completed successfully.
    ], Flush[0x80042313, The shadow copy provider timed out while flushing data to the volume being shadow copied. This is probably due to excessive activity on the volume. Try again later when the volume is not being used so heavily.
    ], Release[0x00000000, The operation completed successfully.
    ], OnRun[0x00000000, The operation completed successfully.
    Operation:
    Executing Asynchronous Operation
    Context:
    Current State: DoSnapshotSet
I don't know where to go from here. There is no activity on the CSV (this is the only VM on it, and both the CSV and VM were created specifically for testing this issue).
    Does anyone have any ideas? I'm desperate. 
    Update:
    Ok, so I can Take DPM out of the picture. Trying to do a snapshot from the Dell Auto-Snapshot manager, I get the same errors. But I also get a bit more information:
    Started at 3:02:47 PM
    Gathering Information...
    Phase 1: Checking pre-requisites... (3:02:47 PM)
    Phase 2: Initializing Smart Copy Operation (3:02:47 PM)
    Adding components from cluster node SB-BLADE01
    Adding components from cluster node SB-BLADE04
    Adding components from cluster node SB-BLADE02
    Retrieving writer information
    Phase 3: Adding Components and Volumes (3:02:52 PM)
    Adding components to the Smart Copy Set
    Adding volumes to the Smart Copy Set
    Phase 4: Creating Smart Copy (3:02:52 PM)
    Creating Smart Copy Set
    An error occurred:
    An error occurred during phase: Creating Smart Copy
    Exception from HRESULT: 0x80042313.
    Creating Smart Copy Set
    An error occurred:
    An error occurred during phase: Creating Smart Copy
    Exception from HRESULT: 0x80042313.
    An error occurred:
    Writer 'Microsoft Hyper-V VSS Writer' reported an error: 'VSS_WS_FAILED_AT_FREEZE'. Check the application component to verify it is in a valid state for the operation.
    An error occurred:
    One or more errors occurred during the operation. Check the detailed progress updates for details.
    An error occurred:
    Smart Copy creation failed.
    Source: Creating Smart Copy Set
    An error occurred:
    An error occurred during phase: Creating Smart Copy
    Exception from HRESULT: 0x80042313.
    An error occurred:
    Writer 'Microsoft Hyper-V VSS Writer' reported an error: 'VSS_WS_FAILED_AT_FREEZE'. Check the application component to verify it is in a valid state for the operation.
    An error occurred:
    One or more errors occurred during the operation. Check the detailed progress updates for details.
    Error: VSS can no longer flush I/O writes.
    Thanks,
    John

I had a similar issue with an environment that had previously been working with the Dell HIT configured correctly. As we added a third node to the cluster I began seeing this problem.
In my case I had the HIT maximum sessions per volume set to 6 and maximum sessions per volume slice set to 2, and the CSV was using a LUN/volume on the SAN that was split across 2 members.
When the backup takes place and Dell HIT is configured to use SAN snapshots, the vss-control iSCSI target is used, which in my case exceeded my limits for maximum connections per volume, as I'm using 2 paths per Hyper-V node with MPIO (this is my current theory).
Once I'd modified these settings I could back up the VHDs on that CSV again.
    Hope this helps.

  • How to display data from local csv files (in a folder on my desktop) in my flex air application using a datagrid?

Hello, I am very new to Flex and don't have a programming background. I am trying to create an AIR app with Flex that looks at a folder on the user's desktop where CSV files will be dropped by the user. In the AIR app the user will be able to browse and look for a specific CSV file in a List container; once selected, the information from that file should be displayed in a DataGrid below. Finally I will be using AlivePDF to create a PDF from the information in this DataGrid laid out in an invoice format. Below is the source code for my app as a visual reference; it only has the containers, with no working code. I have also attached a sample CSV file so you can see what I am working with. Can this be done? How do I do this? Please help.
    <?xml version="1.0" encoding="utf-8"?>
    <mx:WindowedApplication xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute" width="794" height="666">
        <mx:Label x="280" y="19" text="1. Select Purchase Order"/>
        <mx:List y="45" width="232" horizontalCenter="0"></mx:List>
        <mx:Label x="158" y="242" text="2. Verify Information"/>
        <mx:DataGrid y="268" height="297" horizontalCenter="0" width="476">
            <mx:columns>
                <mx:DataGridColumn headerText="Column 1" dataField="col1"/>
                <mx:DataGridColumn headerText="Column 2" dataField="col2"/>
                <mx:DataGridColumn headerText="Column 3" dataField="col3"/>
            </mx:columns>
        </mx:DataGrid>
        <mx:Label x="355" y="606" text="3. Generated PDF"/>
        <mx:Button label="Click Here" horizontalCenter="0" verticalCenter="311"/>
    </mx:WindowedApplication>

    Open the file, parse it, populate an ArrayCollection or XMLListCollection, and make the collection the DataGrid dataProvider:
    http://livedocs.adobe.com/flex/3/html/help.html?content=Filesystem_08.html
http://livedocs.adobe.com/flex/3/html/help.html?content=12_Using_Regular_Expressions_01.html
    http://livedocs.adobe.com/flex/3/html/help.html?content=dpcontrols_6.html
    http://livedocs.adobe.com/flex/3/langref/mx/collections/ArrayCollection.html
    http://livedocs.adobe.com/flex/3/langref/mx/collections/XMLListCollection.html
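For readers who want to see the shape of the parse-and-populate step before translating it into ActionScript, here is a minimal language-agnostic sketch in Python (the col1/col2/col3 keys mirror the DataGridColumn dataField names in the MXML above; this is an illustration of the approach, not the Flex API):

```python
import csv
import io

def rows_for_grid(csv_text):
    """Parse raw CSV text into a list of records keyed col1, col2, ...
    mirroring the DataGridColumn dataField names used by the grid."""
    reader = csv.reader(io.StringIO(csv_text))
    return [
        {f"col{i + 1}": cell for i, cell in enumerate(row)}
        for row in reader
    ]

# A tiny sample standing in for a dropped purchase-order CSV
sample = "PO-1001,Widget,25\nPO-1002,Gadget,40"
records = rows_for_grid(sample)
```

In Flex the equivalent is a loop over the file's text that pushes one object per row into an ArrayCollection, which is then bound as the DataGrid's dataProvider.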
    If this post answered your question or helped, please mark it as such.

  • SSIS 2008 – Read roughly 50 CSV files from a folder, create SQL table from them dynamically, and dump data.

    Hello everyone,
I've been assigned a requirement wherein I need to read around 50 CSV files from a specified folder.
In step 1 I would like to create the schema for these files, meaning take the CSV files one by one and create a SQL table for each, if it does not exist at the destination.
In step 2 I would like to append the data of these 50 CSV files into the respective tables.
In step 3 I would like to purge data older than a given date.
Please note, the data in these CSV files will be very bulky; I would like to know the best way to insert bulky data into a SQL table.
Also, in some of the CSV files there will be 4 rows at the top of the file which contain the header details/header rows.
To my knowledge I will be asked to implement this on SSIS 2008, but I'm not 100% sure of it.
So please feel free to provide multiple approaches if we can achieve these requirements more elegantly in newer versions like SSIS 2012.
    Any help would be much appreciated.
    Thanks,
    Ankit
Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com

    Hello Harry and Aamir,
    Thank you for the responses.
    @Aamir, thank you for sharing the link, yes I'm going to use Script task to read header columns of CSV files, preparing one SSIS variable which will be having SQL script to create the required table with if exists condition inside script task itself.
    I will be having "Execute SQL task" following the script task. And this will create the actual table for a CSV.
    Both these components will be inside a for each loop container and execute all 50 CSV files one by one.
    Some points to be clarified,
1. In the bunch of these 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50, we need to first clean the tables and then perform the data insert, while the remaining 48 files should be appended on a daily basis.
Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
2. For some of the CSV files we will have more than one file with the same name. For example, out of the 50 the 2nd file is divided into 10 different CSV files, so in total we have 60 files wherein 10 out of the 60 have repeated file names. How can we manage this criterion within the same loop? Do we need one more for-each loop inside the parent one? What is the best way to achieve this requirement?
3. There will be another package which will be used to purge data from the SQL tables. Meaning, unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an older-than criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    Please know, I'm very new in SSIS world and would like to develop these packages for client using best package development practices.
    Any help would be greatly appreciated.
Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com
    1. In the bunch of these 50 CSV files there will be some exception for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50, we need to first clean the tables and then perform
    data insert, while for the rest 48 files, they should be appended on daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
How can you identify these files? Is it based on the file name, or is there some info in the file which indicates that it requires a purge? If so, you can pick this information up during the file-name or file-data parsing step and set a boolean variable. Then in the control flow have a conditional precedence constraint which checks the boolean variable and, if set, executes an Execute SQL Task to do the purge (you can use TRUNCATE TABLE or DELETE FROM TableName statements).
    2. For some of the CSV files we would be having more than one file with the same name. Like out of 50 the 2nd file is divided into 10 different CSV files. so in total we're having 60 files wherein the 10 out of 60 have
    repeated file names. How can we manage this criteria within the same loop, do we need to do one more for each looping inside the parent one, what is the best way to achieve this requirement?
The best way to achieve this is to append a sequential value to the filename (maybe a timestamp) and then process them in sequence. This can be done prior to the main loop so that you can use the same loop to process these duplicate filenames as well. The best thing would be to use the file-creation-date attribute value so that each file gets processed in the right sequence. You can use a script task to get this for each file, as below:
    http://microsoft-ssis.blogspot.com/2011/03/get-file-properties-with-ssis.html
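To make the renaming idea concrete, here is a hedged sketch (in Python, over a hypothetical staging folder; an SSIS Script Task would implement the same logic) that suffixes each CSV file name with its creation timestamp, so names that repeat across arrivals become unique and sort in creation order:

```python
import os
from datetime import datetime

def unique_names(folder):
    """Return a mapping of original CSV file names to names suffixed
    with the file's creation timestamp, making repeated names unique
    and sortable in the order the files were created."""
    renames = {}
    for name in sorted(os.listdir(folder)):
        if not name.lower().endswith(".csv"):
            continue
        path = os.path.join(folder, name)
        created = datetime.fromtimestamp(os.path.getctime(path))
        base, ext = os.path.splitext(name)
        renames[name] = f"{base}_{created:%Y%m%d%H%M%S}{ext}"
    return renames
```

Applying os.rename over this mapping before the main for-each loop lets the same loop process every file exactly once, in arrival order.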
    3. There will be another package, which will be used to purge data for the SQL tables. Meaning unlike the above package, this package will not run on daily basis. At some point we would like these 50 tables to be purged
    with older than criteria, say remove data older than 1st Jan 2015. what is the best way to achieve this requirement?
You can use a SQL script for this. Just call a SQL procedure with a single parameter, @CutOffDate, and then write logic like below:
    CREATE PROC PurgeTableData
    @CutOffDate datetime
    AS
    DELETE FROM Table1 WHERE DateField < @CutOffDate;
    DELETE FROM Table2 WHERE DateField < @CutOffDate;
    DELETE FROM Table3 WHERE DateField < @CutOffDate;
    GO
@CutOffDate denotes the date from which older data has to be purged.
You can then schedule this stored procedure in a SQL Agent job to be executed at your required frequency.
    Please Mark This As Answer if it solved your issue
    Please Vote This As Helpful if it helps to solve your issue
    Visakh
    My Wiki User Page
    My MSDN Page
    My Personal Blog
    My Facebook Page

  • Long number in CSV file appearing in scientific notation by default

    Hi,
How can I stop a long number in a CSV file that is opened in Excel from appearing in scientific notation by default?
    eg.
    "hello","778002405501 ", "yes"
    becomes:
    hello | 7.78002E+11 | yes
    I have tried wrapping the data in quotes in the csv but to no avail.
    Thanks in advance,
    Alistair

You can change the extension from ".csv" to ".xls", use a table to form the data, and apply
style="mso-number-format:\@;"
Please read the sample code below in Classic ASP.
You can also read my blog post: http://sarbashish.wordpress.com/2012/11/30/export-to-excel-how-to-prevent-long-numbers-from-scientific-notation/
<%
Response.Clear
Response.CacheControl = "no-cache"
Response.AddHeader "Pragma", "no-cache"
Response.Expires = -1
Response.ContentType = "application/vnd.ms-excel"
Dim FileName
FileName = "TestDB Lookup-" & month(now) & "-" & day(now) & "-" & year(now) & ".xls"
Response.AddHeader "Content-Disposition", "inline;filename=" & FileName
%>
<html xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:x="urn:schemas-microsoft-com:office:excel" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv=Content-Type content="text/html; charset=UTF-8">
<!--[if gte mso 9]>
<xml>
<x:ExcelWorkbook>
<x:ExcelWorksheets>
<x:ExcelWorksheet>
<x:WorksheetOptions>
<x:DisplayGridlines/>
</x:WorksheetOptions>
</x:ExcelWorksheet>
</x:ExcelWorksheets>
</x:ExcelWorkbook>
</xml>
<![endif]-->
</head>
<body>
<table border="0">
<tr>
<td>ID</td>
<td>Name</td>
</tr>
<tr>
<td style="mso-number-format:\@;">01234567890123456567678788989909000030</td>
<td>Sarbashish B</td>
</tr>
</table>
</body>
</html>
    Sarbashish Bhattacharjee http://sarbashish.wordpress.com
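If the file needs to remain a plain .csv rather than become an .xls, another commonly used workaround is to emit the long number as a quoted formula, ="778002405501", which Excel evaluates to a text cell on open. A minimal Python sketch of the idea (the necessary quote escaping is handled by the csv module):

```python
import csv
import io

def excel_text_cell(value):
    """Wrap a value as ="value" so Excel treats it as text instead of
    converting long digit strings to scientific notation."""
    return f'="{value}"'

buf = io.StringIO()
csv.writer(buf, lineterminator="\n").writerow(
    ["hello", excel_text_cell("778002405501"), "yes"]
)
# The middle field is quoted and its quotes doubled by the csv module:
# hello,"=""778002405501""",yes
```

Note this changes what other consumers of the CSV see (the cell content is now a formula string), so it is only appropriate when Excel is the primary consumer.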

  • Hard drive icon appears as folder icon in dock

    When I drag the hard drive icon for any of my hard drives (Mac Pro) to the dock, a folder icon rather than hard drive icon appears in the dock. It functions properly, accessing the full drive, but is harder to distinguish from the other folders in the dock.
    Why does this happen? Is there any way to get the hard drive icon to appear in the dock?

    Welcome to Apple Discussions,
    Why does this happen?
    Directories (like the Hard drive) show up as folders when added to the dock.
    Is there any way to get the hard drive icon to appear in the dock?
    You bet. Make an alias to the Hard Drive (you can put it inside the hard drive itself if you like). Then drag the alias to the dock. It will show up as the drive.
Hope that helps.

  • Script to list Hyper-v CSV disks and their LUNs

    Hi all
I am getting to grips with PowerShell now, but one thing I am struggling with is creating a script that will query CSV volumes and show their SAN volume LUNs.
I have searched frantically and pieced together something which uses two WMI queries to gather the information and then uses the disk signature to match them up. However, some CSVs do not come back with a signature.
Surely matching a CSV with the corresponding physical disk, and thus its LUN, should be easy. It looks like lots of people have tried to crack this with WMI queries and even with diskpart, but there has to be an easier way. We are running 2012 Hyper-V and 2012 R2 clusters, so a solution for either platform would be good.
    Thanks

    Hi Chris Ryan,
To get the CSVs and their LUNs, you can start with the Failover Clusters cmdlets in PowerShell; please check the script below:
Getting More Information About Your Cluster LUNs
    Best Regards,
    Anna

  • Sent Items with Attachments in iCloud Do Not Appear in Folder

    When I send an email with attachment in iCloud, the e-mail does not show up in the Sent folder but the recipient receives the e-mail. PLease help.

    Hello there,
    In regards to the Artists problem, make sure that the songs or album(s) are not marked as part of a compilation in iTunes. This happens most often with compilations that include several artists such as sound tracks or greatest hits albums.
    1. To fix this problem, locate the songs or album in iTunes and right->click (control->click on a Mac) and choose Get Info from the menu.
    2. When the window pops up, head over to the Info tab.
    3. Make sure there is no check mark next to “part of a compilation”, which can be located in the lower right hand corner of the window.
    4. If there isn’t a check mark, put one there and immediately remove it, just to be sure its not marked.
    5. From there, click OK, and re-sync the updated songs or album to your iPod and see if that helps.
    For the podcasts, try this:
    Locate and highlight one of them from your iTunes library, right->click on it and choose Get Info from the menu. When the window pops up, head over to the Options tab and make sure the media type is set to Podcast. Then try resyncing your iPod to update the changes made and see if that helps.
    B-rock

  • Sent items don't appear in folder 'sent items' in Applemail (IMAP)

    Dears,
My Apple Mail is configured to use the IMAP protocol (I use Belgacom Skynet here in Belgium). Everything works fine, except for a very inconvenient situation where my sent items "disappear": they don't show up in the Sent folder. To be clear: mail gets sent, that is verified, but it isn't stored.
I searched the Apple forums already and noticed this is, unfortunately, again a problem that happens frequently. The best I could find is creating a rule to BCC the mails to your inbox...
Also, marking the sent-items folder on the server (via the menu "use this mailbox for sent items") does not solve the problem.
    Many thanks,
    Peter

Check the Mail behaviors section under Mail > Preferences > Accounts.
Also, if your mail provider supports webmail, check its settings; you may have rules that move sent mail to another folder, delete it, etc.

  • Bridge Randomly Changed Appearance of Folder Names...

    I searched for an answer to this problem for 10 minutes, but I couldn't find anything. So, here I am.
Ok, so my copy of Bridge CS5 4.0.0.529 used to show the names of my folders very similarly to how Finder does, meaning that the name of the folder used to be right under the folder itself.
Well now, all of a sudden and for no apparent reason, Bridge puts the name of my folder in a "bubble"-like label OVER the folder. You may be thinking, "Wow, this guy is way too picky." Well, that's not the major problem. The major problem is that Bridge decided to do it for SOME folders and not others. So, to find a folder, I now have to select one and scroll left or right until I find the folder I want, as it won't give me the name of the folder unless it is selected.
I'm losing my mind!
It will, however, show the names of all of the folders if I make the icons at the bottom HUGE, but that's stupid. I don't want to make them huge. I prefer to have them small, and my images large. I also refuse to use Bridge in any mode besides Filmstrip. For me, it's the best way to use Bridge.
    Can anybody help me get it back to the way it was?
    Please?
    Thank you so much for your time and help.

    GOT IT! Just went up to "output" and hit "reset standard workspaces".
    If I may add a few observations:
The behavior you are showing in the screenshots is typical of having set the content panel to 'Show Thumbnail Only'. It is under the View menu and might also have been triggered by accidentally hitting the Cmd/Ctrl + T key combination.
Resetting the workspace also sets the thumbnail info per your settings in the Bridge preferences for Thumbnail/Details. Just try it once to check.
Another major point is that you have not run the updater and still have the first version instead of the latest release, 4.0.5.11. You will benefit from that update; be sure to also have the ACR 6.6 plug-in installed, as the 6.5 plug-in did cause some bugs.
I also believe (not sure, it was a long time ago) that the updater solved the problem you are seeing with some folders with and some without names.

  • Migration of VMs from WS2012 Hyper-V Hosts Cluster to WS2012 R2 Hyper-V Hosts Cluster

    Hello All,
We're currently running our production VMs on a failover cluster of Windows Server 2012 Hyper-V hosts. We're planning to migrate these VMs to a cluster of Windows Server 2012 R2 Hyper-V hosts.
I have created a failover cluster of Windows Server 2012 R2 Hyper-V hosts, and successfully tested the HA of my new test VMs on this new cluster.
Could anyone please tell me the procedure, steps, and best practices to migrate these VMs from the Windows Server 2012 Hyper-V hosts to the Windows Server 2012 R2 Hyper-V hosts?
    Thank you.
    Regards,
    Hasan Bin Hasib

    Hi Hasan Bin Hasib,
You can refer to the following article about cross-version live migration:
Hyper-V: Migration Options
    https://technet.microsoft.com/en-us/library/dn486792.aspx?f=255&MSPPError=-2147217396
    I’m glad to be of help to you!
    Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Support, contact [email protected]
