Do integrity checks by file comparison really work?

Hello,
I use a program (Beyond Compare) which has a Folder Compare module with a function for doing a binary comparison between the files inside two folders. I use it to check the integrity of my backups by comparing the files on my main hard disk with the ones on
the backup hard disks. If two files are recognized as different, then one of them should be corrupt (assuming neither was modified normally).
My question is: will this method detect real file corruption? Will it detect any kind of corruption, both copy errors and "bit rot"? And is there any problem with the read cache? That is: if the first file to compare is read
from disk and kept in a read cache (by Windows, by the hard disk, or by something else), will the second file be read from that same cache (if Windows or the hard disk decides the two files are identical)? Do Windows and hard disks have some kind of procedure
to detect that a file to be read from disk is already available in cache, even if it lives in a different folder than the "original" one? Perhaps some kind of file-checksum system that decides the files are the same (and which would therefore not notice that the file
to compare is corrupt)? If so, integrity checks by file comparison would not work, because in practice the same data would be read twice (first from disk, then from cache) instead of both files being read from
disk.
I have already done tests by manually "corrupting" files (slightly changing the contents while keeping size and timestamp the same), and it works: the files are recognized as different. But I'm not sure it will also work with "real" corrupt files.
I'm mostly interested in Windows 8 Pro 64-bit and NTFS (but would also like to know in general).
Thanks.

I also have Beyond Compare and have used it to check backup data.
Yes, Windows does RAM caching. If your comparison program reads enough file data that you can be sure the RAM cache has been flushed, then reading the file data back and comparing it with the original is a valid way to ensure your backups are uncorrupted.
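The kind of folder-level binary comparison described above boils down to reading both files from disk and comparing them chunk by chunk. A minimal sketch in Python (the chunk size and function name are my own, not Beyond Compare's internals):

```python
import os

def files_identical(path_a, path_b, chunk_size=1 << 20):
    """Compare two files byte-for-byte, reading 1 MiB chunks so even
    very large files never need to fit in memory at once."""
    if os.path.getsize(path_a) != os.path.getsize(path_b):
        return False  # different sizes can never match
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a, b = fa.read(chunk_size), fb.read(chunk_size)
            if a != b:
                return False  # first mismatching chunk decides it
            if not a:
                return True   # both files reached EOF together
```

Note that this reads through the operating system's cache; to be certain both copies really come off the disk you would need unbuffered I/O (FILE_FLAG_NO_BUFFERING on Windows) or, as described above, enough intervening reads to evict the cache.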
Note that NTFS now does online recovery, so just the act of reading back the data can trigger processes inside the file system implementation that recover the data if it starts to experience "bit rot" (weakening or failure of the
storage medium). But this is not strictly necessary, as other operations, such as defragmentation, will occasionally access the disk data as well.
Some time back I wrote a small utility that calculates a CRC-32 over all the data from all the files in a set of folders. This is another way to trigger re-reading all the data, and it produces a summary number that can easily be
compared to determine that all is well - though you don't need my software to do it... There are hash programs available that can accomplish the same thing. Search for SHA-1 programs, but beware that malware can be associated with free programs
and download sites.
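The hash-all-the-files idea can be sketched in a few lines. This version uses SHA-256 from Python's standard library rather than CRC-32 or SHA-1, but the walk-the-tree, single-summary-number approach is the same (the function name is mine):

```python
import hashlib
from pathlib import Path

def folder_digest(root):
    """One hash over every file in a folder tree: each file's relative
    path and its full contents are fed into the digest in sorted order,
    so two trees match only if names and data both match."""
    h = hashlib.sha256()
    root = Path(root)
    for path in sorted(root.rglob("*")):
        if path.is_file():
            h.update(str(path.relative_to(root)).encode())
            h.update(path.read_bytes())
    return h.hexdigest()
```

Run it against the source folder and the backup folder and compare the two hex strings; a single flipped bit anywhere in any file changes the digest.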
It's good that people think about data integrity.  There's all too little of that nowadays.
-Noel
Detailed how-to in my eBooks:  
Configure The Windows 7 "To Work" Options
Configure The Windows 8 "To Work" Options

Similar Messages

  • Periodic Message: You have some checked-out files in the working directory

    Hi,
    we are receiving a periodic message (appearing every 5 minutes) saying:
    "You have some checked-out files in the working directory. We recommend that you check in the files after editing. Click here to check in the files."
    After clicking on the message, easyDMS opens and the Private & Public folders are shown.
    There are no checked out files on the client.
    What happened?
    Thanks a lot!
    Edited by: B Lobascio on Feb 8, 2011 5:13 PM

    After changing the working directory in the easyDMS preferences, the message did not appear anymore.
    That means there must have been some damaged documents in the original working directory.
    Edited by: B Lobascio on Feb 16, 2011 10:35 AM

  • SharePoint and FrameMaker 10 CMS/Integration - Checked Out Files Stuck

    Hello,
    I originally set up the CMS connection to my SharePoint through FrameMaker using the domain name as part of the user name (as suggested in a previous post), but have still run into the same issue discussed in that post (unalterably locked files) and have been completely unable to resolve it.
    When I check out a file, it gets "stuck" in that checked-out/locked state, and I'm unable to edit, check in, or discard my check out. The only thing I'm still able to do is view the files as "Open (Read only)," and delete the files.
    I've even tried to access the files through the SharePoint site itself (as well as through SharePoint Workspace) in an attempt to check the files back in, but this command only returns an error about the files being checked out to a local draft, or one claiming the file may not even be checked out in the first place, etc. The only thing I seem to be able to do is delete the files from the server.
    Thus far, I've been completely unable to use this tool.
    Does anyone have any other suggestions? I'm at a loss.
    Thank you,
    Carolyn

    Hello Vimal -
    Thanks so much for your reply.
    The login name that I used to connect to the SharePoint repository is in the format DomainName\EmployeeNumber. However, in the "Modified By" field of the Properties dialog, my full name is listed (last name, first name). Attempting to log in with the credentials DomainName\Full Name (as listed in the Properties dialog) results in an error message reading: "The request failed with HTTP status 401: Unauthorized".
    Working outside of FrameMaker's CMS, I am indeed able to upload, check out, edit, and check in files successfully. The problem with working with the files outside of FrameMaker itself, however, is that SharePoint doesn't understand file dependencies upon upload. So, when I open a checked-out *.book file from the server, it cannot locate the individual (also checked-out) chapter *.fm files necessary to edit the document - which means I can only work with these files through FrameMaker. And it's only within FrameMaker's CMS that I'm running into an inability to check documents back in or edit them. Bummer...
    Thanks again for responding. Any ideas welcome!
    Carolyn

  • Hi, my Lightroom CC is showing "waiting for connection" and the mobile sync is not working. I have tried deleting lrsync data, disabling the firewall, checking the hosts file, etc. Still no luck.

    Hi, my Lightroom CC is showing "waiting for connection" and the mobile sync is not working. I have tried deleting lrsync data, disabling the firewall, checking the hosts file, etc. Still no luck.

    I had the same problem. It fixed itself when I opened Internet Explorer (I don't normally use it). I was opening Support Portal to get some support help, but I'm not sure if it was the page or simply opening IE that did the trick.

  • How do you check your dedicated GPU is working?

    How do you check your dedicated GPU is working?
    Some strange things have been happening since my MBP (2009, SSD, 8GB, 17") was updated to Lion:
    1. Icon images load slower than usual, e.g. when opening the Applications folder the images take a fraction longer than normal
    2. If renaming a folder, you see the new title, then the old one, then it switches back to the new one.
    3. When playing HD video, or rapid desktop switching the picture looks split in two.
    I checked that my GPU was on dedicated, but it had changed back to Intel integrated? Then a re-start and I checked it was on dedicated GPU, which it was BUT the above problems still remain.
    Plugged in a 24" LED Display via display port and it does not come on - only the USB ports work. Does a display need a dedicated GPU to work?
    Zapped parameter RAM, rebuilt permissions, and reset the SMC, all to no avail. TechTool Pro 6.0 finds both cards but can't check them. I assume it's not Lion compatible yet?
    The MBP is still under AppleCare, but I want to know if I can identify the problem as the NVIDIA GPU. But how?
    SBB

    Disconnect other hardware
    Run a hardware test and try safe mode to see if the problem goes away
    Safe Mode
    http://support.apple.com/kb/HT1455
    Reset SMC
    http://support.apple.com/kb/ht3964
    PRAM/NVRAM
    http://support.apple.com/kb/HT1379
    Hardware Test
    http://support.apple.com/kb/HT1509
    Back up your files to an external drive and disconnect it, C-boot off the 10.6 installer disk and simply over-install OS X onto the installed version, then run the Combo Update
    http://support.apple.com/kb/DL1399
    That should take care of the faulty driver if there is one.

  • Error while installing Adobe Reader "Did not pass integrity check"

    When the download manager gets to about 80%, this error pops up every time. I have tried clearing the temporary files, restarting my computer, and downloading an older version, but nothing works.
    The entire message says:
    "Adobe Reader":
    The Download did not pass the integrity check (16291.304.428)
    I have Windows 7, 32-bit. Help please!

    Don't use the download manager; download the installer directly from http://ardownload.adobe.com/pub/adobe/reader/win/10.x/10.0.0/en_US/AdbeRdr1000_en_US.exe (English version 10.0 for Windows).
    Use this link if you need another version http://get.adobe.com/reader/enterprise/

  • TIME MACHINE - How does it really work?

    Dear Mac users and Apple;
    I am new to Mac. For most of my life I used Windows OS. One of the cool features on my new iMac 24'' is TM. In the past I used to use Norton Ghost for creating back ups and safeguarding my data.
    Though there is a lot of info on www.apple.com and in the forums, I cannot find any satisfactory information about how Time Machine (TM) truly works.
    Below I describe all my concerns, but in order to put them into perspective here is what I was used to on PC while using Norton Ghost. I would run a one time back up of my whole system drive (C:) and other drive (D:) which had my documents on it and then schedule regular back ups (weekly in my case). These were incremental back ups which only " added" the files which changed in the mean time. Of course, this method also meant that anything done between the back ups was at risk of being lost until next back up took place.
    When I started using TM, I assumed a lot about how it works based on my experience described above. However, I ran across this forum thread http://discussions.apple.com/thread.jspa?messageID=6096304&#6096304 and got pretty confused as to how it really works. I searched for explanations, but all I find is us users guessing. I believe Apple should provide its customers with clear explanation and documentation of how TM works.
    If you read the long forum thread you will understand what I am talking about. If not, here is a rough summary.
    A guy backed up his data using TM, then after checking it was there, he deleted the originals from the source drive, only to find out later that the backups had been deleted from TM as well.
    The thread goes on and these are the questions I have as a result - I kindly beg you to answer them for me:
    1. TM does 1st back up of the whole Mac, at first - correct? This data stays until TM runs out of space on the back up drive, correct?
    2.)TM does hourly back ups which are kept for 24 hrs (one day), then only a daily back up stays - which one out of the 24 is considered the daily back up? Or is it a separate back up?
    3.)TM does daily back ups which stay for a month then are deleted?
    4.)TM does weekly back up and this back up stays until TM runs out of space on the back up destination drive, correct? If yes, when does it take place? (7 days from the initial back up and then again each 7 days?)
    5.) If I create a file and then delete it before hourly back up stores it, it is not backed up at all, correct?
    6.) If I create a file and then delete it once it was backed up by the hourly back up is it stored for 24 hours (one day) and then deleted?
    7.) If I create a file and delete it once it was backed up by the daily backup, it stays for a month in the back ups of TM but then is deleted?
    8.) If I create a file and it is backed up by the weekly back up, then is deleted - it resides in the weekly back up until TM runs out of space on the destination back up drive and deletes the original weekly back up, correct?
    From this, if my assumptions above are right, it means that only files backed up by the weekly back up are kept until back up destination drive has space left. Then it would mean that it is important for me to know when the weekly back up takes place, so that I know that "now is the time and the files I have on my source drive now will stay in the target back up drive as long as it has space".
    Further, it would mean that the hourly and daily back ups are kind of temporary back ups which give me the chance to restore the file within 24 hrs if backed up in the hourly back up, or restore it within one month if backed up by the daily back up. If backed up by the weekly back up, then it can be restored any time as long as the back up drive is not full. Is that correct?
    If I do a forced back up (Ctrl-click on TM in the Dock and then order it to "Back up now"), is this a back up which stays until the back up external disk is full, or is it a temporary back up (like the hourly and/or daily)?
    Another issue is as follows: if I delete an application from my hard drive BUT it was backed up in the past by TM - can that application be launched automatically, without first asking for my (the user's) permission, from the TM back up on the external hard drive? E.g. I install Adobe Reader - it is backed up by TM. Then I uninstall it. Then I download a PDF document and click it to open - does Adobe Reader open from the TM back up? If NOT - great, that is how I expect it to work. If YES, does it warn me about being opened from the TM back up? If it does warn me - great work, Apple. If it does not - why? In the latter case, if you exchange Adobe Reader for spyware, malware, or a virus, you have a vulnerability by which viruses etc. can be reintroduced to the system from TM back ups even after being cleaned from the original hard disk. Please let me know how it really works.

    1. TM does 1st back up of the whole Mac, at first - correct? This data stays until TM runs out of space on the back up drive, correct?
    Respectively right and wrong. Out of all the hourly backups it makes, TM keeps only the first of each day for longer than 24 hours and one (presumably the first) of each week longer than one month. Any data which is not part of one of the retained backups is lost. Clearly this may happen long before the drive runs out of space and TM begins deleting the retained backups (oldest first).
    2.)TM does hourly back ups which are kept for 24 hrs (one day), then only a daily back up stays - which one out of the 24 is considered the daily back up? Or is it a separate back up?
    TM keeps only the first backup of each day for longer than 24 hours. When you bring up a daily backup in TM's browser, you will see on the big shiny black bar along the bottom not only the date but the precise time of the backup you are looking at.
    3.)TM does daily back ups which stay for a month then are deleted?
    One per week is retained (presumably the first but Apple isn't saying and I haven't seen that confirmed by anyone yet). The others are deleted.
    4.)TM does weekly back up and this back up stays until TM runs out of space on the back up destination drive, correct? If yes, when does it take place? (7 days from the initial back up and then again each 7 days?)
    Don't know about timing. By analogy with the daily backups I would guess first of the week. But remember this is not a separate backup, it is one of the original hourly backups which survives the progressive culling process.
    5.) If I create a file and then delete it before hourly back up stores it, it is not backed up at all, correct?
    Correct.
    6.) If I create a file and then delete it once it was backed up by the hourly back up is it stored for 24 hours (one day) and then deleted?
    It is stored for at least 24 hours. If it was never on the first backup of any day it will not survive longer. Otherwise it will.
    7.) If I create a file and delete it once it was backed up by the daily backup, it stays for a month in the back ups of TM but then is deleted?
    If it was in the first backup of any day then it will be in the backup with that date so long as that backup is retained (at least 30 days and until the drive is full if that day is retained as the weekly backup).
    8.) If I create a file and it is backed up by the weekly back up, then is deleted - it resides in the weekly back up until TM runs out of space on the destination back up drive and deletes the original weekly back up, correct?
    If it is in the backup chosen as the weekly backup it will be available until that backup is deleted by TM for lack of space.
    From this, if my assumptions above are right, it means that only files backed up by the weekly back up are kept until back up destination drive has space left. Then it would mean that it is important for me to know when the weekly back up takes place, so that I know that "now is the time and the files I have on my source drive now will stay in the target back up drive as long as it has space".
    I wouldn't advise thinking like this. If you wish to retain long term a particular file TM is not the answer. You should separately archive it outside TM. As has been pointed out elsewhere, you could just copy it to the drive TM is running on, but outside TM's backup folder (however note that this will reduce the space available for TM's backups).
    Further, it would mean that the hourly and daily back ups are kind of temporary back ups which give me the chance to restore the file within 24 hrs if backed up in the hourly back up, or restore it within one month if backed up by the daily back up. If backed up by the weekly back up, then it can be restored any time as long as the back up drive is not full. Is that correct?
    Correct.
    If I do a forced back up (Ctrl-click on TM in the Dock and then order it to "Back up now"), is this a back up which stays until the back up external disk is full, or is it a temporary back up (like the hourly and/or daily)?
    Good question. I have no reason to think that forced backups are privileged in any way, but who knows? Anyone got any comment?
    I install Adobe reader - it is backed up by TM. Then I uninstall it. Then, I download a PDF document, click it to open - does Adobe reader open from the TM back up?
    Nothing in the backups is available to you until you have restored it. An interesting question is: if an application is restored, are any other files its original installer would have placed in the system (and that are needed to run it) restored as well? One would like to think so, but maybe that's too hard for TM? I would recommend that any major application that has been uninstalled be reinstalled from the original disks rather than relying on TM. Of course, if you've just accidentally trashed an application file, you can restore it from TM no problem.
    Only after restoring it from TM can you use it.
    Hope these comments help and anyone with more info adds their piece. TM is proving more confusing than it looked at first glance!
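    The retention rules pieced together in this thread can be summarized as a small model. This is an illustrative sketch only: the cut-offs (first backup of the day, 30 days, first backup of the week) are the assumptions made above, not Apple documentation, and deletion when the drive fills up is not modeled.

```python
from datetime import datetime, timedelta

def retained(backups, now):
    """Which backup timestamps survive a Time-Machine-style cull:
    every backup from the last 24 hours, the first backup of each day
    for the last 30 days, and the first backup of each (ISO) week
    thereafter."""
    backups = sorted(backups)
    first_of_day, first_of_week = {}, {}
    for b in backups:  # sorted, so setdefault records the earliest
        first_of_day.setdefault(b.date(), b)
        first_of_week.setdefault(b.isocalendar()[:2], b)
    keep = set()
    for b in backups:
        if now - b <= timedelta(hours=24):
            keep.add(b)  # hourly backups survive one day
        elif now - b <= timedelta(days=30) and first_of_day[b.date()] == b:
            keep.add(b)  # "daily" backups survive one month
        elif first_of_week[b.isocalendar()[:2]] == b:
            keep.add(b)  # "weekly" backups survive until space runs out
    return keep
```

    Feeding in a list of backup times shows the progressive culling: a second backup made on an old day drops out after 24 hours, while the first backup of an old week is still retained.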

  • VISA read in exe file is not working

    Hi all,
    I am having problems with VISA read in an exe file created.
    I am trying to write to and read from a programmable power supply via RS232. The VI writes a command to the instrument to set the voltage level. It then writes another command requesting the resulting current value. This value is then read by VISA Read.
    The VI works fine on the development PC, which has LabVIEW installed. The exe file also works fine on this PC. However, when I try to run the exe file on another PC (I've tried several), everything seems to work except for the VISA Read functions. The voltage level command is sent, as well as the ON and OFF commands, but the current is not read back.
    I guess there must be something I have missed in the installation. I am working in LabVIEW 8.5. I have created an installer and included
    Runtime Engine 8.5.1
    VISA runtime 4.5
    Is there something else I should do? I am really running out of ideas here...
    I hope someone has a clue about this!
    Clara 

    Clara-
    1. Have you verified that the COM port settings in Windows (check under device manager) are matching how you initialize them (Baud, bits, parity, and flow control) and that these match the power supply's settings?
    2. Also, are you trapping an error message after the attempted Read command? (This will make it a lot easier to diagnose.)
    3. Do you programmatically close the VISA session at the end of the program?
    4. You can always post the code to see if the forum members will catch the problem.
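    The write/query/read pattern the VI uses, with the checks from this list, can be sketched outside LabVIEW. This Python sketch uses a hypothetical SerialSession stand-in and made-up command strings (not the actual instrument protocol); the points it illustrates are trapping an empty read as a timeout (item 2) and always closing the session (item 3).

```python
class SerialSession:
    """Stand-in for a VISA/serial session; a real driver, opened with
    the right baud/bits/parity (item 1), would go here."""
    def __init__(self):
        self.log, self.closed = [], False
    def write(self, data):
        self.log.append(data)
    def readline(self):
        # Replies only to queries; otherwise simulates a read timeout.
        return b"0.512\r\n" if self.log and self.log[-1].endswith(b"?\n") else b""
    def close(self):
        self.closed = True

def set_and_read_current(session, volts):
    try:
        session.write(f"VOLT {volts:.2f}\n".encode())  # set voltage level
        session.write(b"MEAS:CURR?\n")                 # request the current
        reply = session.readline()
        if not reply:                  # item 2: trap the error explicitly
            raise TimeoutError("no reply - check COM settings and wiring")
        return float(reply.strip())
    finally:
        session.close()                # item 3: always close the session
```

    If the exe-only PCs are timing out, an explicit check like this turns a silent hang into a diagnosable error.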
    ~js
    2006 Ultimate LabVIEW G-eek.

  • WRT54GC - Does the DHCP Server really work?

    Hi everyone,
    I'm new to this forum, so I don't know if this problem has been discussed before and whether there is any solution to it.
    I have owned a WRT54GC since January 2007 and had no problems until recently, when I found some issues with the DHCP server on this unit:
    This is what I have on the network, for you guys to understand my network setup:
    a Cogeco cable modem for the internet (I'm located in Canada), a Linksys WRT54GC router, one Trendnet file/print server for my USB all-in-one Lexmark printer and external hard drive, and an Acer laptop.
    The network works perfectly (wireless internet, wired internet, printing, file transfer) except for the following DHCP issue:
    The DHCP server does not update the IP address of my laptop, even if in the router settings I assign 5 minutes (or any number of minutes) to the IP address lease time and keep my laptop on and connected to the router, either wireless or wired.
    The router has DHCP enabled and can assign an IP address automatically to my laptop on power-up. I have also allocated 6 users for DHCP, with the starting IP at 192.168.1.101, which means the range is 192.168.1.101 to 192.168.1.106.
    When I check who's connected to the network, I see that my laptop is connected and the lease-time expiry is shown. I can also see that if the lease time is set to 5 minutes, it counts down to zero, defaults to some weird number after the time is over, and doesn't change my laptop's IP address. Looks like a bug in the DHCP server software. The above was tried using the wireless connection.
    When I connected the laptop to the router with an ethernet cable (wired connection), the behaviour was somewhat similar, except that if I set a lease time of 5 minutes (or any other value), the server does not recognize it and defaults to a lease time of one day (similar to setting 0).
    I have configured my laptop for DHCP, so I don't think there is anything wrong with my laptop, but I do feel that the DHCP server on this WRT54GC is working abnormally.
    Any suggestions? ...Try it out first and see if you have the same problem.
    If you don't, then I will be glad if you could tell me how you were able to make it work!
    Cheers.
    (Edited post for guideline compliance. Thanks!)
    Message Edited by JOHNDOE_06 on 10-08-2007 11:26 AM

    Thanks for your reply. I'm not looking forward to changing my IP address every 5 minutes. My question is more about the functional behaviour of the DHCP server. With all the proper settings (and I believe I have done the settings correctly), the DHCP server should update the IP address of any computer on the network after the lease period ends, provided the computer was on for the whole lease period and not shut off. Even if I use the default lease setting of a day, it still does not change the IP address if I keep my laptop on for more than a day or so. I feel that the DHCP server is not functioning as it should.
    I don't really need to change my IP address, and I'm happy if it doesn't change, but I came across this abnormal behaviour while going through the setup. I thought I should try to see how the DHCP server on this unit works, and seeing that it was not working as it should, I thought I should ask someone on the community forum whether this is really normal behaviour or Linksys has distributed a faulty product. My network works fine and I'm the only user, so I'm not concerned about security. I'm using WEP.
    To answer your other questions: there is no specific reason why I chose to limit DHCP to 6. I think 6 is good enough for me, and as no one else is using my network, it is an adequate number for the DHCP server to assign an unused IP address to my laptop after the lease period expires. I believe I don't need 50. The reason I used 5 minutes as the lease time is that I wanted to test the DHCP server: 5 minutes is short, so I can wait and monitor whether my IP address gets updated. Waiting for a day would be overkill for a simple test.
    Did you try it on your router? I'd be interested to know how it worked for you, if it really worked. Thanks for your help!

  • Designer gets closed when integrity check is run - OLAP Universe (MSSAS)

    Hi All,
    I have an OLAP universe (Designer XI 3.1).  The source cube was built and deployed using MS SQL Server Analysis Studio 2005.  When I try to run an integrity check to parse objects, Designer closes down automatically.  This issue occurs only with OLAP universes; other universes work just fine.  In fact, I remember the same universe was working perfectly well a few months back.  Recently I had to re-install BOXI 3.1 due to some issues with the registry files, and this is the first time since the re-installation that I am working on an OLAP universe.
    I checked the MDA.log file and found the following error logged there multiple times (each time I ran the integrity check):
    Failed to locate the LogFormat settings in the registry. In this case, the LogFormat will use the default format.
    I searched the net for a possible solution but couldn't find anything useful.  Has any of you faced a similar problem?  Is this a registry issue?
    Please help!!
    Thanks in advance!!

    Hi Sonia,
    How did you solve the problem? We are also facing problems with hierarchies in the BW Universe. Please share the knowledge so we (and anyone else) can also benefit.
    Regards,
    Bhavesh

  • [SOLVED]package integrity check fails

    I recently ran pacman -Syu. It downloaded all the files I wanted, but when it went to install the packages, the integrity check failed at opera. The error I got was: http://pastebin.com/m447d9848
    My questions are: what is the problem? How can I fix it? Is the package corrupted, and if so, how can I delete it? Will deleting the package fix it? Thank you for your help.
    Last edited by MONODA (2008-06-24 06:10:34)

    Well, it isn't the package integrity that is failing.  You just have file conflicts on your system.
    Now to fixing them.  Most of these are to do with opera 9.50.  Did you previously install opera 9.50 on your system without using pacman?  You should be safe doing a "pacman -Sf opera" to get rid of this.  I would do a "pacman -Qo <file>" on a few of those files first to check that nothing owns them, just to be really safe.  You can do the same thing with shared-mime-info.
    The /usr/local/share/man issue is a bit more complicated.  Check what files you have in that directory.  I would move them all to a temporary directory, install the filesystem package, then move them back.  The filesystem package now symlinks /usr/local/man and /usr/local/share/man

  • Satellite L305D-S5934 stuck on "checking system files" screen

    Hey guys, let me start off by saying that I'm really not tech savvy. I've had this laptop since around 2009, and over the years it's had some problems, but the most recent is that whenever I try to start up the computer, it gives me a quick options menu and then enters the "checking system files" screen, which is fine and everything until it hits 56%. It does this every time I try to restart the laptop, and it just completely stops there even if I leave it alone for an hour or two. Hope you guys can help with my dilemma, and thanks in advance!

    Hi! See if anything here works for you:
    http://answers.microsoft.com/en-us/windows/forum/windows_7-system/laptop-stuck-at-checking-file-syst...
    http://answers.microsoft.com/en-us/windows/forum/windows_7-performance/stuck-at-checking-file-system...
    PS: If you have some bad sectors, which I'm suspecting you might have, this might take several hours to complete. When I run mine it takes nearly 3 hours, and I have no bad sectors.
    Dokie!
    I Love my Satellite L775D-S7222 Laptop. Some days you're the windshield, Some days you're the bug. The Computer world is crazy. If you have answers to computer problems, pass them forward.

  • [solved] pacman 4 hangs after "checking for file conflicts"

    Like others, I removed yaourt and package-query because they conflicted with pacman 4... not worried about that; I'll reinstall them later.
    I put the new pacman.conf in place (my old one wasn't really customized).  I left SigLevel = Never.
    Now, I can run pacman -Sy fine, but if I try to install anything, it just hangs:
    sudo pacman -S audacity
    resolving dependencies...
    looking for inter-conflicts...
    Targets (1): audacity-1.3.14-2
    Total Download Size: 3.21 MiB
    Total Installed Size: 15.29 MiB
    Net Upgrade Size: -0.00 MiB
    Proceed with installation? [Y/n]
    :: Retrieving packages from extra...
    audacity-1.3.14-2-x86_64 3.2 MiB 1397K/s 00:02 [###########################] 100%
    (1/1) checking package integrity [###########################] 100%
    (1/1) loading package files [###########################] 100%
    (1/1) checking for file conflicts [###########################] 100%
    I've waited up to 20 or 30 minutes and nothing happens.  It's not just audacity, any package I try to install does this.
    Suggestions?
    Last edited by TheAmigo (2012-01-17 18:55:38)

    With --debug switch it prints:
    checking for file conflicts...
    debug: looking for file conflicts
    debug: searching for file conflicts: coreutils
    debug: searching for filesystem conflicts: coreutils
    debug: searching for file conflicts: ethtool
    debug: searching for filesystem conflicts: ethtool
    debug: searching for file conflicts: fail2ban
    debug: searching for filesystem conflicts: fail2ban
    debug: searching for file conflicts: gpgme
    debug: searching for filesystem conflicts: gpgme
    debug: searching for file conflicts: vim-runtime
    debug: searching for filesystem conflicts: vim-runtime
    debug: searching for file conflicts: gvim
    debug: searching for filesystem conflicts: gvim
    debug: searching for file conflicts: hdparm
    debug: searching for filesystem conflicts: hdparm
    debug: searching for file conflicts: inetutils
    debug: searching for filesystem conflicts: inetutils
    debug: searching for file conflicts: lib32-glibc
    debug: searching for filesystem conflicts: lib32-glibc
    debug: searching for file conflicts: lib32-gcc-libs
    debug: searching for filesystem conflicts: lib32-gcc-libs
    debug: searching for file conflicts: lib32-glib2
    debug: searching for filesystem conflicts: lib32-glib2
    debug: searching for file conflicts: lib32-gdk-pixbuf2
    debug: searching for filesystem conflicts: lib32-gdk-pixbuf2
    debug: searching for file conflicts: lib32-pango
    debug: searching for filesystem conflicts: lib32-pango
    debug: searching for file conflicts: lib32-gtk2
    debug: searching for filesystem conflicts: lib32-gtk2
    debug: searching for file conflicts: linux
    debug: searching for filesystem conflicts: linux
    debug: searching for file conflicts: nspluginwrapper
    debug: searching for filesystem conflicts: nspluginwrapper
    debug: searching for file conflicts: nvidia
    debug: searching for filesystem conflicts: nvidia
    debug: searching for file conflicts: qtwebkit
    debug: searching for filesystem conflicts: qtwebkit
    debug: searching for file conflicts: rpcbind
    debug: searching for filesystem conflicts: rpcbind
    debug: searching for file conflicts: unrar
    debug: searching for filesystem conflicts: unrar
    debug: searching for file conflicts: xscreensaver
    debug: searching for filesystem conflicts: xscreensaver
    checking available disk space...
    debug: checking available disk space
    Without the --debug switch:
    Proceed with installation? [Y/n]
    (21/21) checking package integrity [############################] 100%
    (21/21) loading package files [############################] 100%
    (21/21) checking for file conflicts [############################] 100%
    Note that 'checking available disk space...' is not printed without the --debug option, even though it doesn't look like debug output.
    I don't see much disk activity after that; the pacman process uses no CPU time, and its status goes to D in `ps` (man ps: "D: uninterruptible sleep (usually IO)").
    Last edited by drrossum (2012-01-18 21:58:28)
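
    A process stuck in D state can be inspected like this (a generic Linux sketch, not something from the thread; `pid=$$` below is a placeholder, substitute pacman's actual PID):

    ```shell
    # Confirm the process is in uninterruptible I/O wait and see the
    # kernel symbol it is blocked in (wchan).
    pid=$$                                   # placeholder: use pacman's PID
    ps -o pid=,state=,wchan= -p "$pid"
    cat /proc/"$pid"/stack 2>/dev/null || true   # kernel stack; often needs root
    ```

    A wchan pointing into filesystem or block-device code would support the theory that the hang is an I/O stall rather than pacman spinning.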

  • [solved] "pacman -Syu" hangs after "checking for file conflicts"

    This is my third attempt.  I let it run yesterday for 20 hours on the theory that maybe it was actually doing something.  It always completes "checking for file conflicts" but never goes any further.
    Does anyone have any suggestions how to get it to continue?
    [ken@xxxxx ~]$ sudo pacman -Syu
    :: Synchronizing package databases...
    core is up to date
    extra 1421.6 KiB 2.03M/s 00:01 [######################] 100%
    community 1775.0 KiB 3.00M/s 00:01 [######################] 100%
    :: Starting full system upgrade...
    resolving dependencies...
    looking for inter-conflicts...
    Targets (27): binutils-2.23-1 coreutils-8.20-1 cryptsetup-1.5.1-1
    device-mapper-2.02.98-1 emacs-24.2-2 filesystem-2012.10-2
    firefox-16.0.2-1 gcc-4.7.2-2 gcc-libs-4.7.2-2 glibc-2.16.0-5
    hwids-20121022-1 imagemagick-6.8.0.3-1 libidn-1.25-1
    libwbclient-3.6.9-1 linux-api-headers-3.6.3-1 lvm2-2.02.98-1
    mkinitcpio-0.11.0-1 nspr-4.9.3-1 nss-3.14-1
    nss-myhostname-0.3-3 smbclient-3.6.9-1 systemd-195-2
    thunderbird-16.0.2-1 tzdata-2012h-1 util-linux-2.22.1-2
    wget-1.14-2 xulrunner-16.0.2-1
    Total Download Size: 47.80 MiB
    Total Installed Size: 550.57 MiB
    Net Upgrade Size: 6.43 MiB
    Proceed with installation? [Y/n] y
    :: Retrieving packages from extra...
    libwbclient-3.6.9-1... 19.5 KiB 407K/s 00:00 [######################] 100%
    smbclient-3.6.9-1-x... 7.9 MiB 2.71M/s 00:03 [######################] 100%
    thunderbird-16.0.2-... 17.1 MiB 2.92M/s 00:06 [######################] 100%
    xulrunner-16.0.2-1-... 22.9 MiB 2.92M/s 00:08 [######################] 100%
    (27/27) checking package integrity [######################] 100%
    (27/27) loading package files [######################] 100%
    (27/27) checking for file conflicts [######################] 100%
    Last edited by KenJackson (2012-10-30 14:25:05)

    Allan wrote:Can you run with --debug?
    OK.  That garnered an additional piece of info.  Here's the end of the long output:
    debug: searching for filesystem conflicts: wget
    debug: searching for file conflicts: xulrunner
    debug: searching for filesystem conflicts: xulrunner
    checking available disk space...
    debug: checking available disk space
    Disk space?  I think I have enough disk space.
    [ken@xxxxx ~]$ df
    Filesystem  Size  Used  Avail  Use%  Mounted on
    rootfs       47G   11G    34G   25%  /
    dev         2.0G     0   2.0G    0%  /dev
    run         2.0G  292K   2.0G    1%  /run
    /dev/sda3    47G   11G    34G   25%  /
    shm         2.0G  140K   2.0G    1%  /dev/shm
    /dev/sda1    99M   21M    74M   22%  /boot
    /dev/sda4    72G  1.9G    66G    3%  /home
    Is disk space really the problem?  Or is that just where it hung?
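
    For what it's worth, this hang at "checking available disk space" was a known pacman issue on systems with unusual mount tables (note that `rootfs` and `/dev/sda3` both appear mounted on `/` above). A commonly suggested workaround is to comment out the `CheckSpace` option in `/etc/pacman.conf`; the sketch below demonstrates the edit on a throwaway temp file rather than the real config:

    ```shell
    # Comment out CheckSpace so pacman skips the disk-space scan.
    # Demonstrated on a temp file; apply the same sed to /etc/pacman.conf as root.
    conf=$(mktemp)
    printf '[options]\nCheckSpace\n' > "$conf"
    sed -i 's/^CheckSpace/#CheckSpace/' "$conf"
    grep '^#CheckSpace' "$conf"
    ```

    Disabling the check means pacman will no longer warn you before filling a partition, so treat it as a diagnostic step, not a permanent fix.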

  • How to check the file size before loading it to the context

    Hello,
    I have an application to upload a file and write it to the server using the FileUpload UI and IWDResource Interface.
    I would like to limit the size of the file the user uploads to, say, 2 MB.
    The problem is that the current API doesn't allow me to check the file size before I store it in IWDResource. The API available for IWDResource:
    IWDResource resource = ...
    InputStream stream = resource.read(false);
    int size = stream.available();
    This API only works on the file AFTER it has been stored in the context and in the server's memory. This way, if a user decides to upload a 1 GB file, for example, he can easily crash the server.
    I am already familiar with <a href="https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/00062266-3aa9-2910-d485-f1088c3a4d71">this</a> article, but it doesn't answer the question either; all it does is calculate the file size after the file has already been stored in the context.
    Any ideas...?
    Roy
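
    Two caveats and a possible mitigation. First, `InputStream.available()` is only an estimate of the bytes readable without blocking, not a reliable file size. Second, even if the framework only exposes the stream after the upload, you can still cap server-side memory use by copying through a size-limited reader that aborts as soon as the limit is exceeded. The sketch below is generic Java, not part of the IWDResource API; the class and method names are made up for illustration:

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Hypothetical helper: buffer a stream, but fail fast once it exceeds
    // maxBytes, so an oversized upload never occupies more than roughly
    // maxBytes plus one read buffer of server memory.
    public class BoundedRead {
        static byte[] readWithLimit(InputStream in, int maxBytes) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                if (out.size() + n > maxBytes) {
                    throw new IOException("upload exceeds " + maxBytes + " bytes");
                }
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            // A 1 KiB stream fits under a 2 KiB limit.
            byte[] small = new byte[1024];
            System.out.println(readWithLimit(new ByteArrayInputStream(small), 2048).length);
            // A 4 KiB stream is rejected before being fully buffered.
            try {
                readWithLimit(new ByteArrayInputStream(new byte[4096]), 2048);
            } catch (IOException e) {
                System.out.println("rejected");
            }
        }
    }
    ```

    This doesn't stop the client from sending a huge request in the first place (that would need a limit at the HTTP/dispatcher layer), but it does keep one user's upload from exhausting the JVM heap.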

    Hi Ram,
    Have you activated your objects?
    You need to activate your objects in order to see them in RWB.
    Thanks & Regards,
    Varun Joshi

Maybe you are looking for

  • The error showing as Fiscal year change has not carried out for the coCd

    Hi all, I am facing a very critical situation. My client has the last closed fiscal year as 2006(from 2006 april to 2007 march).For the fiscal yr. 2007(2007april to 2008 march) they have already finalised and submitted the reports made from some para

  • Create a project from list item not working if EPT has Project Plan Template

    HI, i am trying to leverage the new feature in 2013 to create a project from an idea list. This works fine if the EPT does not have Project Plan template assigned to it. Even through workflow action "Create Project from current list item". However if

  • BDLS conversion require manual corrections

    Hi, I am currently running BDLS in a system that is copied from BI Quality system. After the copy, I have run BDLS conversion for few source system connections. For the source system conversion from URD030 -> DV2CLNT030, I found that few tables requi

  • iMac won't start up because of a kernel panic, version 10.8.0

    my 2009 Intel iMac came on today with a panic error. I have not been able to start up with Data Rescue. Instructions just say to restart, but I get the same thing over and over. I get the chimes and then the white Apple screen, and then what looks like t

  • Logging the SOAP request/response from rules engine

    Hi All, We would like to log the soap requests and response from the rules engine. The determinations-server.war is deployed on a weblogic server and we are not using seibel connector. Is it possible to log the soap messages for debugging purpose. Th