Zone transfer between 2003 and 2008

Hi, I am new to Windows servers and still learning. While practicing with DNS, I am unable to transfer DNS zones from Server 2008 to Server 2003. The 2008 server has AD and DNS installed; the 2003 server also has DNS installed, but it is only
connected to the 2008 server and is not part of the domain.
Is it necessary to make the 2003 server a secondary or additional domain controller of the 2008 server before transferring DNS zones?

Hi,
According to your description, it seems that this is an AD-integrated zone.
An AD-integrated zone is stored in the AD database, and the zone replicates automatically to other domain controllers within the same replication scope as part of the
AD replication process. By default, AD-integrated zones are configured to not allow zone transfers. Allowing zone transfers is an option provided to support non-DC DNS servers, such as BIND or any other DNS server on which you want to host
a secondary copy of the zone.
In your case, if you plan to make the Windows Server 2003 machine an additional DC, then a zone transfer is not needed. If not, you can configure zone transfers and add the
IP address of the Windows Server 2003 machine on the Zone Transfers tab in the properties of the zone in the DNS console.
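For reference, the same configuration can be done from the command line with dnscmd. A minimal sketch, assuming a zone named contoso.com, the 2008 server at 192.168.1.10 and the 2003 server at 192.168.1.20 (all hypothetical values):

On the 2008 server, permit transfers of the zone to the 2003 machine:
    dnscmd /ZoneResetSecondaries contoso.com /SecureList 192.168.1.20
On the 2003 server, create a secondary zone that pulls from the 2008 master:
    dnscmd /ZoneAdd contoso.com /Secondary 192.168.1.10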
Best regards,
Susie

Similar Messages

  • Secure the file/data transfer between XI and any third-party system

    Hi All,,
I would like to secure (via SSH at the OS level) the file/data transfer between XI and any third-party system, using "Run OS Command Before Processing" and "Run OS Command After Processing". Right now my XI server is installed on iSeries OS.
On iSeries we can't call Unix commands directly, so I expect we need to go for AS/400 (CL) programming. If we create the AS/400 program, how can I call it from XI?
If anyone has an idea, please let me know whether it will work or not.
Thanks in advance.
    Venkat

    Hi,
    Thanks for your reply.
I have read some blogs like /people/krishna.moorthyp/blog/2007/07/31/sftp-vs-ftps-in-sap-pi about calling a Unix shell script from XI.
But as far as I know, we cannot write shell scripts on iSeries OS, so we need to go for an AS/400 program. If we go with AS/400, how do we call that program, and will it work? I am not sure, so I need some help please.
    Thanks,
    Venkat

  • Remote Desktop connection limit in Windows 2003 administration, as well as in a mixed environment of Windows 2003 and 2008 servers

    The RDP protocol, i.e. Remote Desktop Connection, is configured for software administration of Oracle application and database servers that run on Windows Server 2003. Two sessions are allowed on each of these servers for database
    administrators. The questions are:
    a) Are the network administrators who perform Windows server administration (50+) included in
    the 2-session limit, or do they manage all these servers through the console session, which is separate from the 2-session Remote Desktop limit?
    b) How is the 2-session limit prescribed by Microsoft (more of a licensing limit) handled in a mixed environment of Windows 2003 and 2008 servers where all of these servers are managed on VMware?
    avnish sharma

    Hi Avnish,
    Thank you for posting in Windows Server Forum.
    By default, any Windows server provides 2 remote sessions for administration purposes only, no matter which administrator is accessing the server. If you also connect to the console session, the server can be accessed by 3 sessions in total (console + remote +
    remote). When a server reaches this limit, a working administrator will receive a message asking them to log out when another user tries to open a session, or, depending on the configured policy, the new user is simply refused the login.
    If you want more than 2 remote desktop sessions, you need to purchase TS/RDS CALs, install the TS/RD Licensing role, activate it first, and then configure the CALs on it. There are 2 types of CAL available (USER and DEVICE); you can purchase CALs according to your
    company's requirements.
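    As a side note, the console session mentioned above can be reached with the standard Remote Desktop client (the server names below are hypothetical):

        mstsc /v:oradb01 /console    (RDC 6.0 and earlier, connecting to Windows Server 2003)
        mstsc /v:oradb02 /admin      (RDC 6.1 and later, where /admin replaced /console)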
    Hope it helps to understand!
    Thanks.
    Dharmesh Solanki
    TechNet Community Support

  • Password Policy - Mixed servers 2003 and 2008

    I need help!
    So this is my situation: I'm trying to enforce a company-wide password policy via GPO, but I'm running into problems. We have no current password policy in place (this will be the only one). I'm attempting to use the default domain policy in Server 2008, and I'm
    testing the GPO on a specific security group, but it does not seem to work. It prompts users to change their password, but the other requirements aren't being enforced.
    This is what I'm trying to enforce:
    This is what I'm trying to enforce.
    Expire after: 90 days
    Complexity: Enabled
    Can't reuse last: 12 passwords
    Lockout time: 15 minutes
    Lock out after: 5 attempts
    Minimum of :8 characters
    Infrastructure: We have a mix of 2003 and 2008 servers. I'm using our 2008 server to enforce the GPO.
    Once I apply the GPO to a specific security group, it prompts the users in that group to change their password, but it does not enforce all the other policies. This is a major project, and we can't deploy the policy all at once (the helpdesk wouldn't
    be able to handle the call volume), so we decided to deploy it by department/security group.
    We also tried using a fine-grained password policy, but just like the GPO it only enforced the password-change aspect and not the other requirements, like the minimum of 8 characters. Can anyone help?

    > What if I apply the GPO on the domain root level, and then in the
    > delegation tab, exclude certain groups until we are ready for it to
    > apply to that department? Will that work?
    No. Read again - in 2003, there is ONE password policy for the DOMAIN,
    not for individual accounts.
    Technically this works the following way: Password policies are picked
    up by every member computer. But on these, password policies only apply
    to LOCAL accounts, not to domain accounts.
    On the other hand, there are Domain Controllers. The PDC emulator is the
    only one of these that will pick up Password policies - and only if they
    are linked to the domain. And so, these apply to all "local" accounts on
    the PDC, which in fact are the domain accounts.
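    Note also that fine-grained password policies only take effect once the domain functional level is Windows Server 2008; a domain that still has 2003 DCs cannot be at that level, which would explain why the fine-grained attempt did not fully work either. To verify which password policy is actually in effect, run the following from any domain member:

        net accounts /domain

    This prints the effective maximum password age, password history length, lockout threshold and lockout duration for the domain.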
    Martin

  • Transfer between iPod and iTunes - mission impossible!

    I recently got an iPod Touch (the latest model with the latest iOS), just to use it in my car. I simply wanted to add and remove a few songs. In a normal world there would just be two columns (iPod/computer) and simple file and folder management on both sides - let us be sincere, can anything be more natural and straightforward? Surely not. But Apple wants to make it super complicated and highly frustrating. OK, what to do.
    Since I already have a different computer (a newer MacBook Pro Retina with the latest OS X), I needed to get the iPod content into iTunes on the new computer somehow. This is impossible as such, since iTunes keeps telling me that if I want to associate this iPod with this iTunes library, I have to erase the iPod (which I really don't want). I tried the "full backup" offered by iTunes... I did it a few times; the process starts but always finishes quite quickly (considering the 1700 songs and 24 GB on my iPod). The result: nothing is backed up to this computer... very nice...
    In further searching, I discovered some third-party apps that could supposedly help with this terrible process. I bought AnyTrans and started transferring the content to iTunes; in 9 hours (overnight) only 4 GB had transferred... obviously not the way... On other advice I bought iExplorer, which was supposedly better. I tried to transfer the iPod content to iTunes ("start automatic transfer") five times. Every time, the app froze, and force-quitting was the only way out. Nice. Then I got the advice not to transfer into iTunes but just into a folder on the computer. That finally worked. Good.
    Then I added all these songs to the iTunes library (finally some first result after many hours of effort). Then I agreed that this iPod could be associated with this iTunes library (which already contained those 1700 songs) and let it sync with the iPod. After some time the iPod was indeed erased. No music on it any more. However much I tried again and again to sync it with the iTunes library, the process ran through three short steps and finished. The result: no music on the iPod, whatever I do... and the full music library in iTunes on the MBP.
    What shall I say and still keep my language decent? (Difficult.) This iTunes system of working with files on an iPod or iPhone is, for me, the most stupid invention ever. How could someone create such terrible nonsense that makes simple things impossible and highly frustrating, instead of simple file transfer between devices? I have no words for it... It is also possible that I am the stupid one; in that case, please let me know how I can transfer songs from the iPod to iTunes, and how I can keep some songs on the iPod (that are already in the iTunes library). Not wasting computer space by having the iPod library copied there is another matter, but let us not go into details now.

    Yes, all of them play in iTunes on the computer, with the right boxes checked (see the screenshots)... In the end I resorted to restarting the iPod and the MBP. After that, synchronisation started working, but only up to a point. In the iTunes library I have about 1700 tracks, but only 660 were transferred to the iPod. Then I did "Add to Library" again, adding all the songs I already have in the library, and iTunes started working hard to add the rest to the library (even though they were already there...).
    So in the end I spent a whole lovely day just adding/removing songs on my iPod...
    Another thing: when I want to add or change the genre on some songs in the iTunes library, the process takes ages - as if each song were freshly copied just to add the "classical" tag to it... I thought I had the "fastest Mac ever"... Waiting so long for such a trivial operation is unbelievable (with the rainbow circle providing a nice decoration now and then).

  • Most efficient data transfer between RT and FPGA

    This post is related to THIS post about DMA overhead.
    I am currently investigating the most efficient way to transfer a set of variables to an FPGA target for our application.  We have been using DMA FIFOs for communication in both directions (to and from the FPGA), but I have recently been questioning whether this is the most efficient approach.
    Our application must communicate several parameters (around 120 different variables in total) to the FPGA.  Approximately 16 of these are critical, meaning that they must be sent on every iteration of our RT control loop.  The others are also important but can be sent at a slightly slower rate without jeopardising the integrity of our system.  Until now we have sent these 16 critical parameters plus ONE non-critical parameter over a DMA channel to the FPGA card.  Each 32-bit value sent incorporates an ID which allows the FPGA to demultiplex it into the appropriate global variables on the FPGA.  Thus, over time (we run a 20 kHz control loop on the RT system), a complete set of parameters is sent at approx. 200 Hz.  The DMA transfers are currently a relatively large factor limiting the execution speed of our RT loop: of the 50 µs available per time-slot at 20 kHz, approximately 12-20 µs goes on the DMA transfers to and from the FPGA target.  Our FPGA loop runs at 8 MHz.
    According to NI, the most efficient way to transfer data to an FPGA target is via DMA.  While this may be true in general, I have found that for SMALL amounts of data, DMA is not terribly efficient in terms of speed.  Below is a screenshot of a benchmark program I have been using to test the efficiency of different types of transfer to the FPGA.  In the test I create a 32 MB data set (except for the FXP values, which are only present for comparison and have no pertinence to this issue at the moment) which is sent to the FPGA over DMA in blocks of differing sizes (with the number of DMA writes times the array size held constant).  We thus move from a single really large DMA transfer to a multitude of extremely small transfers, and monitor the time taken for each mode and data type.  The FPGA sends a response to the DMA transfers so that we can be sure, when reading the response DMA, that ALL of the data has actually arrived on the FPGA target and is not simply buffered by the system.
    We see that the minimum round-trip time for the DMA write and the subsequent DMA read for confirmation is approximately 30 µs.  When sending less than 800 bytes, this time is essentially constant per packet; only when we start sending more than 800 bytes at a time do we see an increase in the time taken per packet.  A packet of 1 byte and a packet of 800 bytes take approximately the SAME time to transfer.  Our application sends 64 bytes of critical information to the FPGA target each time, meaning that we are clearly in the "less efficient" region of DMA transfers.
    If we compare the times taken when communicating over front-panel (FP) controls, we see that irrespective of how many controls we write at a time, the overall throughput is constant, with a timing of 2.7 µs for 80 bytes.  For a small dedicated set of parameters, the use of front-panel controls seems to be significantly faster than sending per DMA.  Once we need to send more than 800 bytes, the DMA rapidly becomes more efficient.
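    To put rough numbers on that crossover (a back-of-the-envelope check using the timings above): our 64 bytes of critical data cost the fixed ~30 µs DMA round trip, whereas over FP controls they would cost roughly 64/80 × 2.7 µs ≈ 2.2 µs - more than ten times faster.  Only beyond roughly 800 bytes per transfer does the DMA's fixed overhead amortise enough for it to win.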

    So to continue:
    For small data sets the use of FP controls may be faster than DMAs.  OK.  But we're always told that each and every FP control takes up resources, so how much more expensive is the version with FP controls than the DMA?
    According to the resource usage guide for the card I'm using (HERE), the following is true:
    DMA (1023 elements, I32, no arbitration): 604 flip-flops, 733 LUTs, 1 block RAM
    1x I32 FP control: 52 flip-flops, 32 LUTs, 0 block RAM
    So the comparison would seem to yield the following result (for 16 elements):
    DMA: 604 flip-flops, 733 LUTs, 1 block RAM
    FP: 832 flip-flops, 512 LUTs, 0 block RAM
    We require more flip-flops, fewer LUTs and no block RAM.  It's a swings-and-roundabouts scenario: depending on which resources are actually limited on the target, one version or the other may be preferred.
    However, upon thinking further I realised something else.  When we use the DMA, it is purely a communications channel.  Upon arrival, we unpack the values and store them in global variables in order to make them available within the FPGA program.  We also multiplex other values over the DMA, so we can't simply arrange the code to be fed directly from the DMA, which would remove the need for the globals altogether.  The FP controls, however, ARE already persistent data storage, and assuming we pass the values along a wire into subVIs, we don't need additional globals in this scenario.  So the burning question is: "How expensive are globals?"  The PDF linked to above does not explicitly mention the difference in cost between FP controls and globals, so I'll have to assume they're similar.  This of course massively changes the conclusion arrived at earlier.
    The comparison now becomes:
    DMA + globals: 1436 flip-flops, 1245 LUTs, 1 block RAM
    FP: 832 flip-flops, 512 LUTs, 0 block RAM
    This seems very surprising to me, and I'm suspicious of my own conclusion here.  Can someone with more knowledge of the resource-requirement differences between globals and FP controls weigh in?  If this is really the case, we need to rethink our approach to RT-FPGA communications, most likely towards a hybrid approach.
    Shane.

  • Question about transfer between Oracle and SQL Server

    Can I write a program to transfer lots of data between Oracle and SQL Server quickly?
    I have tried making a connection to each of the two databases, but transferring the data that way took a lot of time.
    How could I do it?
    Thanks

    Hi,
    If you need to move data fast, then use the Oracle Migration Workbench to generate SQL Server BCP data extraction scripts and Oracle SQL*Loader files.
    This is the fastest way to migrate the data.
    In the Oracle Model UI tab of the Oracle Migration Workbench, right-click on the Tables folder; there is a menu option to 'Generate SQL Loader ...' scripts. This will help you migrate your data efficiently.
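    Broadly, the generated scripts boil down to a BCP export on the SQL Server side and a SQL*Loader import on the Oracle side. A minimal sketch of each (the table, server and credential names are hypothetical; the Workbench generates the real control files for you):

        bcp dbo.EMPLOYEES out employees.dat -c -t"," -S sqlserver01 -T
        sqlldr scott/tiger control=employees.ctl log=employees.log

    The -c flag exports in character format, -T uses Windows authentication, and the .ctl control file tells SQL*Loader how to map the flat file's columns onto the Oracle table.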
    Regards
    John

  • File transfer between Mac and Windows

    I am running Windows on my Mac, and I transfer files between Mac and Windows a lot using my external hard drive. I was wondering if there is an easier way to do this than using the external hard drive - a networking solution or something. Thanks...

    I also run Windows on my Mac (for work). The issue I had was that when booted into OS X, I could see the Windows partition and copy files FROM it to the Mac side, but I could not copy files TO the Windows partition.
    As I understand it, this is because the Apple NTFS driver (which reads the Windows partition) only allows read access, not read/write access. It was easily solved with MacFUSE (an open-source way to add different file systems to your Mac, kind of like teaching it a new language) and NTFS-3G for Mac OS X.
    http://macntfs-3g.blogspot.com/
    Scroll down a bit for the download link. The installer puts in both MacFUSE and NTFS-3G, so it's the only one you need.
    After a reboot, I can read/write to the Windows partition while booted into Mac OS X (and it's free).
    I hope that helps.
    P.S. The EFI boot manager download is NOT required.
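    One quick way to confirm the new driver took over after the reboot: open Terminal and run

        mount | grep -i ntfs

    The Windows volume should now be listed with the ntfs-3g type rather than the read-only ntfs one.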

  • Novell DNS setup: zone transfer to a Windows 2008 server

    DNS is on our OES server, and we have a Windows server off site that needs to have DNS on it so that clients can log in when the T1 is down. I need directions on how to set this up. We currently have the following problem: DNS is not syncing with the Windows server, so we do a reload from master and all the folders appear, but by the next morning it is no longer working. It may be set up correctly, and we may need to look at other options.
    sharon

    Originally Posted by canavanslc
    DNS is on our OES server, and we have a Windows server off site that needs to have DNS on it so that clients can log in when the T1 is down. I need directions on how to set this up. We currently have the following problem: DNS is not syncing with the Windows server, so we do a reload from master and all the folders appear, but by the next morning it is no longer working. It may be set up correctly, and we may need to look at other options.
    sharon
    "Not working" is pretty vague - can you give us more details? You have to set up the query filters on the zone and allow the other server to query. Other than that, I would think you just point your Windows DNS slave (secondary) zone at the OES server. It should work (I do it all the time between NetWare, OES and openSUSE).

  • File transfer between MacBook and other Macs

    I need to do a file transfer between my MacBook and a MacBook Pro, and between my MacBook and an iBook. What ways can you recommend? FTP is slow for me because it's a campus network, and Bluetooth takes a while too (in the range of 1 to 2 hours). I believe it can be done through FireWire and/or USB, right?
    If I were to use USB, is there any special software needed? Or will each Mac detect the other as a hard drive?
    What about FireWire? I understand it needs a hub or a server - is that true? Can file transfers be done via FireWire alone?
    Thanks!

    If you want to access your friend's files, he should enable File Sharing on his machine, and you should choose Finder > Go > Connect to Server... His machine should show up in the Connect to Server dialog box. This should work over Ethernet; AFAIK USB will not work.
    From Mac Help:
    "You can use an Ethernet cable to connect two Macintosh computers and share files or play network games.
    1. Connect an Ethernet cable from the Ethernet port of one computer to the Ethernet port on the other.
    2. Open Sharing preferences on both computers and turn on Personal File Sharing. Note the Computer Name for the computers.
    3. On one of the computers, choose Go > Connect To Server and then click Browse.
    4. Double-click the other computer in the window and enter your password, if necessary.
    If you manually configured the TCP/IP settings for the built-in Ethernet configuration on the computer you are connecting to, you may need to enter that computer's TCP/IP address in the Connect To Server dialog.
    To see or set the TCP/IP address, open Network preferences and choose your Ethernet port configuration from the Show pop-up menu (named Built-in Ethernet unless you gave it another name)."
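    On the FireWire question: no hub or server is needed. Booting one Mac while holding the T key puts it into FireWire Target Disk Mode, and it then shows up on the other Mac as an external drive. Alternatively, if Remote Login (SSH) is enabled in Sharing preferences on the other Mac, a command-line copy over the campus network works too (the hostname and paths here are hypothetical):

        scp -r ~/Documents/project user@macbookpro.local:~/Documents/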

  • Web JetAdmin 10.2 (SR5) Fails to start on server 2003 and 2008

    Web JetAdmin 10.2 fails to start: the console is not running - "Waiting for service start".
    Please help to troubleshoot the problem.

    The OLD SQL DB has "SQL_Latin1_General_CP1_CI_AS"
    The NEW SQL DB has "Latin1_General_CI_AS"
    The following error is shown in HPWJAService-XXXXXXXXXX.itl under DEVICE:\Documents and Settings\NetworkService\Local Settings\Application Data\Hewlett-Packard\HPWebJetadmin\WjaService\tracing
        * Database initialization failed: Error 468, Level 16, State 9, Procedure -, Line 15, Message: Cannot resolve the collation conflict between "Latin1_General_CI_AS" and "SQL_Latin1_General_CP1_CI_AS" in the equal to operation.
    and
        * Unable to determine managed schema version: Schema version information not found.
    I think this is the problem...
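    That collation mismatch does look like the culprit: the old database and the new server use different collations, so any query that compares strings between the two (for example via tempdb) fails exactly as logged. A quick check from the command line (the .\HPWJA instance name is an assumption; substitute whatever instance Web JetAdmin uses):

        sqlcmd -S .\HPWJA -Q "SELECT SERVERPROPERTY('Collation')"
        sqlcmd -S .\HPWJA -Q "SELECT name, collation_name FROM sys.databases"

    If the two disagree, restoring the database onto an instance built with the matching collation is usually the cleanest fix.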

  • Time zone difference between SRM and R/3

    Dear Experts ,
    We are using the SRM extended classic scenario with PR transfer, but I got an error saying that the time zone Turkey is not valid.
    When I compare SRM and R/3, SRM does not have Turkey.
    I checked OSS notes but did not find anything.
    Can anyone help me with this problem?
    Regards,
    Tuluhan,

    Hi,
    Try the following notes:
    1569627
    198411
    They may be helpful!
    regards,
    kiran

  • Change of material master during transfer between SRM and R/3

    Hi,
    I have a question for modifying material master data between R/3 and SRM.
    I would like to know if it is possible to change the material number during the transfer. The reason is:
    I have 2 R/3 backends and 1 SRM system. The material codes in the 2 R/3 systems are different, but some materials having different numbers in R/3 refer to the same material. I want to see such materials under the same material number. Is it possible to do this with middleware plus some intervention in between? The "translation" should be bi-directional.
    Thanks

    Hi,
    No problem if the field is not replicated. No idea how to bring that through replication, though.
    Coming to the product search: although the number of such materials is small, you will have to modify the product search, which also takes care of displaying the other products.
    The standard product search (bbph_product) uses the search help exit "BBP_F4IF_SHLP_EXIT_PRODUCT_GEN", so you need to replace it with your custom FM.
    In your custom FM, you will have to check the HARMONIZED code field for all such materials; while selecting the material, select the HARMONIZED code as the material number, and in the product search display that HARMONIZED code as the "PRODUCT ID". You will also have to develop an RFC-enabled FM which actually fetches all the materials from the backend at runtime. Also, are you displaying the product list plant-wise?
    Please give me your mail ID and I can mail you the sample code.
    BR,
    Disha.

  • New 2012 server in a mixed 2003 and 2008 domain (in process of upgrading)

    We are replacing a Windows Server 2003 machine, which crashed and is gone, with a Windows Server 2012 Standard machine.  The old 2003 server was a domain controller running alongside one other 2003 server (which is getting replaced next) and
    3 Windows Server 2008 R2 Standard x64 domain controllers which are up and running.  When trying to add the Windows Server 2012 Standard server as a domain controller to the existing domain, we get the following error:
    Verification of replica failed.  The forest functional level is Windows 2000.  To install a Windows Server 2012 domain or domain controller, the forest functional level must be Windows Server 2003 or higher.
    However, the domain was already brought up to at least the 2003 level when we added the current live 2008 domain controller (Windows Server 2008) several years ago.  When I now run adprep on the Windows Server 2008 machine (adprep from the 2008
    install CD), I get the following responses:
    Command:  adprep /forestprep
    Response:  Forest-wide information has already been updated.  [Status/Consequence] Adprep did not attempt to rerun this operation.
    Command:  adprep /domainprep /gpprep
    Response:  Domain-wide information has already been updated.  [Status/Consequence] Adprep did not attempt to rerun this operation.
    I have gone to Active Directory Domains and Trusts on each of the other servers; every one is at the Windows Server 2003 functional level and states that I cannot raise the level because I have AD DCs that are not running the appropriate version of
    Windows. I understand that, given the remaining 2003 server, but none of them are at the Server 2000 level.
    So it seems we have a conflict where the 2012 server thinks the domain is at the Windows 2000 level. Is there any way around this, or a way to find out where the conflict is coming from?
    Thank you
    Kevin C

    Please proceed as follows:
    Run netdom query fsmo to identify the current FSMO role holders. It seems that the old DC was holding FSMO roles; if so, seize them to another DC: https://support.microsoft.com/en-us/kb/255504
    Do a metadata cleanup to remove the old DC's references: use dsa.msc to remove the old DC computer account, then use dssite.msc to remove the NTDS Settings object of the old DC and any remaining references to it.
    After doing this, check again and try to raise the DFL and FFL. Do not forget to check that your DCs and AD replication are in a healthy state using the
    dcdiag and repadmin commands.
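    A minimal sketch of those checks, run from an elevated prompt on a healthy DC (the domain name below is hypothetical):

        netdom query fsmo
        dcdiag /q
        repadmin /replsummary

    Once replication is clean and the stale DC is gone, the functional levels can be raised from Active Directory Domains and Trusts, or from PowerShell on the 2012 box, e.g. Set-ADForestMode -Identity contoso.com -ForestMode Windows2003Forest.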
    Ahmed MALEK

  • SAP Server upgrade from Windows 2003 and 2008 R2 with Oracle Database

    Hi,
    We need to upgrade the Windows servers that run our SAP systems from Windows Server 2003 to Windows Server 2008 R2.  We are using Microsoft Clustering for HA, so an in-place upgrade seems not to be possible.
    Someone has suggested that we will need to export our database and re-import it to achieve this upgrade, but I cannot understand why this would be necessary (we are not changing the underlying filesystem!).
    Could someone please confirm whether a database export and import is required for this OS upgrade scenario?  I have done a bit of research, but nothing has jumped out, and now I need an answer quickly.
    We are running
    - ERP 6.0 NW 7.01 (soon to be 7.02 with ABAP stack only)
    - XI(PI) Java & ABAP
    - SRM (Java & ABAP)
    - Portal (Java only)
    - PLM (Java & ABAP)
    - BW (BI 7.0) (Java and ABAP)
    Thank You
    Felicity

    Hello,
    You need to go for a 'homogeneous system copy' to achieve this, and since almost all the systems in your landscape include a Java stack, the system copy has to be carried out with the export/import method.
    Even though you are not changing the file system, this is a Windows environment, and you can't bring SAP up on the target Windows (Windows 2008 R2) just by copying the contents and file system from source to target; you need SAPinst to create the registry entries and related configuration. On top of that, you have Java stacks involved, and for a Java stack a plain backup/restore is not enough to bring SAP up on the target: during the system copy an OS-level dump is collected from the source Java stack, and it has to be imported on the target OS. That is why you need export/import.
    Is that clear?
    Read the system copy guide once, and search OSS for notes on how to move from Windows 2003 to Windows 2008 R2.
    Thanks
