Organizing a large amount of files

We are moving many of our documents to Office 365 SharePoint Online. A couple of the folders we are moving are deep and contain many subfolders and files. For example, there is a customer history folder, with files (mostly Excel) and subfolders for pretty much every job ever done, so it has around 10,000 or so files/folders. I understand there is a 5,000-file limit for libraries in SharePoint Online. What is the best practice? Reorganize the files and break them into libraries? To allow for growth I would make them well under 5,000 each. Currently everything is alphabetical, so is the best way to break it up A-E, F-J, etc.? Just looking for some thoughts/advice. We were hoping not to have to reorganize, but it does seem to be necessary... right?
Thanks very much.

Hi CTNorthShore. The 5,000 limit you refer to is known as the list view threshold. It is not a physical limit on how much data a document library can hold, but a limit on how many items a list view can operate on. Once a library exceeds that limit, it can affect some client tools, the main one being the OneDrive for Business sync tool.
A library in SharePoint Online can actually hold many hundreds of thousands of files, if not millions, and you can browse and edit those files fine in a web browser.
If you are planning on syncing all your files locally using the OneDrive for Business sync tool, then you will need to break your files into separate libraries to stay below the list view threshold of 5,000 (or 20,000 if it's your OneDrive for Business).
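As a rough sketch of the reorganization idea (the bucket-by-first-letter scheme matches the A-E, F-J suggestion above, but the 4,000-item headroom target and the function name are my own assumptions, not SharePoint guidance), a small script could plan which files land in which library:

```python
from collections import defaultdict

def plan_libraries(filenames, max_per_library=4000):
    """Group filenames by first letter, then split any group that
    exceeds max_per_library into numbered sub-libraries so each
    stays safely under the 5,000 list view threshold."""
    buckets = defaultdict(list)
    for name in sorted(filenames, key=str.lower):
        first = name[0].upper() if name and name[0].isalpha() else "#"
        buckets[first].append(name)

    plan = {}
    for letter, files in sorted(buckets.items()):
        if len(files) <= max_per_library:
            plan[letter] = files
        else:
            # Split oversized letters into A1, A2, ... chunks.
            for i in range(0, len(files), max_per_library):
                plan[f"{letter}{i // max_per_library + 1}"] = files[i:i + max_per_library]
    return plan
```

Running this over a listing of the customer history folder would show immediately whether an A-E style grouping keeps every library under the threshold, or whether some letters need further splitting.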
Myles Jeffery | Thinkscape Limited
Try Our File Share Migration Tool for OneDrive for Business and SharePoint Online Office 365

Similar Messages

  • Large amount of files

    I have a quite reliable AEB + airdisk set-up and can transfer large files. Wireless and wired speed is about 1Gb per 2-3 minutes.
    Everything is fine until I copy a large amount of files to the airdisk.
    I've tried 8000 files which went OK.
    I've tried a package with 22500 files and it fails. The connection drops and I have to restart the AEB.
    Are there any similar experiences that show that the number of files is a critical factor?

    It was no TM backup; it's just a Lightroom catalog file containing 22,500 items that I drag-and-drop to the AirDisk.
    From my experience, that's likely to fail, because the AirDisk will most likely crash when dealing with a large number of items in one single operation.
    It does look like a memory allocation problem in the firmware.
    I got around the problem by just plugging the external USB hard drive directly into my Mac. I hope the TC doesn't have a similar problem; otherwise, it would be much harder to get the HD out of the TC.
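    If the copy is scripted instead of dragged, the same transfer can be broken into fixed-size batches so the device never has to track tens of thousands of items in one operation. A minimal Python sketch (the batch size of 5,000 is an arbitrary assumption, not a documented AirDisk limit):

```python
import shutil
from pathlib import Path

def copy_in_batches(src_dir, dst_dir, batch_size=5000):
    """Copy a directory tree in fixed-size batches of files,
    preserving the relative folder structure."""
    src, dst = Path(src_dir), Path(dst_dir)
    files = sorted(p for p in src.rglob("*") if p.is_file())
    for start in range(0, len(files), batch_size):
        for f in files[start:start + batch_size]:
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy contents and metadata
```

    A failure partway through then only affects the current batch, and re-running simply overwrites what was already copied.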

  • Finder issues when copying large amount of files to external drive

    When copying a large amount of data over FireWire 800, Finder gives me an error that a file is in use and locks the drive up. I have to force eject. When I reopen the drive, there are a bunch of 0 KB files sitting in the directory that did not get copied over. This happens on multiple drives. I've attached a screenshot of what things look like when I reopen the drive after forcing an eject. Sometimes I have to relaunch Finder to get back up and running correctly. I've repaired permissions, for what it's worth.
    10.6.8, by the way, 2.93 12-core, 48 GB of RAM, fully up to date. This has been happening for a long time; just now trying to find a solution.

    Scott Oliphant wrote:
    iomega, lacie, 500GB, 1TB, etc.; it seems to be drive independent. I've formatted and started over with several of the drives, with the same result. If I copy the files over in smaller chunks (say, 70GB) as opposed to 600GB, the problem does not happen. It's like Finder is holding on to some of the info when it puts its "ghost" on the destination drive before it's copied over and keeps the file locked when it tries to write over it.
    This may be a stretch since I have no experience with iomega and no recent experience with LaCie drives, but the different results if transfers are large or small may be a tip-off.
    I ran into something similar with Seagate GoFlex drives and the problem was heat. Virtually none of these drives are ventilated properly (i.e., no fans and not much, if any, air flow) and with extended use, they get really hot and start to generate errors. Seagate's solution is to shut the drive down when not actually in use, which doesn't always play nice with Macs. Your drives may use a different technique for temperature control, or maybe none at all. Relatively small data transfers will allow the drives to recover; very large transfers won't, and to make things worse, as the drive heats up, the transfer rate will often slow down because of the errors. That can be seen if you leave Activity Monitor open and watch the transfer rate over time (a method which Seagate tech support said was worthless because Activity Monitor was unreliable and GoFlex drives had no heat problem).
    If that's what's wrong, there really isn't any solution except using the smaller chunks of data which you've found works.

  • ADT Not Working With Large Amount Of Files

    Hi,
    I posted a similar issue in the Flash Builder forum; it seems these tools are not meant for anything too big. Here is my problem: I have an application that runs on the iPad just fine; however, I cannot seem to be able to package any of the resource files it uses, such as SWF, XML, JPG, and FLV. There are a lot of these files, maybe around 1,000. This app won't be on the App Store and will just be used internally at my company. I tried to package it using Flash Builder, but it couldn't handle it, so now I have tried the adt command line tool. The problem now is that it is running out of memory:
    Exception in thread "main" java.lang.OutOfMemoryError
            at java.util.zip.Deflater.init(Native Method)
            at java.util.zip.Deflater.<init>(Unknown Source)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:428)
            at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:338)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:273)
            at com.adobe.ucf.UCFOutputStream.addFile(UCFOutputStream.java:247)
            at com.adobe.air.ADTOutputStream.addFile(ADTOutputStream.java:367)
            at com.adobe.air.ipa.IPAOutputStream.addFile(IPAOutputStream.java:161)
            at com.adobe.air.ApplicationPackager.createPackage(ApplicationPackager.java:67)
            at com.adobe.air.ipa.IPAPackager.createPackage(IPAPackager.java:163)
            at com.adobe.air.ADT.parseArgsAndGo(ADT.java:504)
            at com.adobe.air.ADT.run(ADT.java:361)
            at com.adobe.air.ADT.main(ADT.java:411)
    I tried increasing the Java memory heap size to 1400m, but that did not help. What is disturbing, though, is why, in this day and age, the adt tool is written solely for 32-bit Java runtimes, when you can no longer buy a 32-bit computer. Almost everything is 64-bit now. Being able to utilize a 64-bit Java runtime would eliminate this.
    Has anyone else had to deal with packaging a large number of files within an iPad app?
    thanks

    Hi,
    Yes, the bug number is 2896433, logged against Adobe AIR. I also logged one against Flash Builder for basically the same issue, FB-31239. In the tests I have done here, both issues are the result of including a large number of files in the package. By reducing the number of files included, these bugs do not appear.
    I would like to try out the new SDK, but it won't be available for a few more weeks, correct?
    The main reason I want to include all these files is that the iPads will not always have a wireless connection.
    thx

  • Fastest way to "view" a large amount of files

    I am looking for a solution to speed up my workflow. I am previewing a lot of folders that contain up to 200 unique PSD files each. Each file can be anywhere from 10 MB to 300 MB. I am only previewing these files for design approval once, and will never see them again, but I want to see them at a high enough resolution and size to approve them. Each file is a completely new and unique file, so I don't need to keep my cache, because I will never need to preview it again. I don't want to slow down the designers by having them create high-quality thumbnails, so it is up to me to create them. All files are on a server, and my cache is on my hard drive. I have a lot of space, so I am not concerned with that, and I clear it every so often.
    Here are the steps I am taking.
    1. I have created two workspaces: "Thumbnail" and "Preview". I start out in my thumbnail workspace, which has the slider way down so all files fit on screen. The thumbnail extractions start creating small thumbnails automatically, and this is usually very quick.
    2. Once the thumbnails are done, I select them all, right-click, and choose "Generate High Quality Thumbnail". This is where things slow way down. This process takes a lot of time. I do this because it gives me control over when the higher-res previews/thumbnails start, so I can view all the files at once, which is important.
    3. Once I have the high quality thumbnail, I select my preview workspace which has the slider way up. These get viewed, then moved over to "production", and I never see them again.
    Two Questions:
    1. If I go to a folder in my "Preview" workspace without generating the HQT, then it tries to create previews and thumbnails, not high quality thumbnails. What is the difference between a preview and a high quality thumbnail?
    2. And my big question is there a more efficient way of doing this workflow?
    Thanks in advance.

    As nearly everyone has their own workflow, I would choose another one (although I have no experience with files on a server; I have a stand-alone workflow).
    You could choose to put all folders in one main folder and, in Bridge Preferences, under Cache, put a checkmark in front of "Automatically export cache to folder when possible".
    Point Bridge to this main folder and, in the Tools menu, choose Cache > Build and Export Cache. Also check the export cache to folder option.
    Hit OK and go for a coffee break, because it needs some time (depending on the speed and power of the machine and the amount and size of the files), but Bridge should cache all subfolders and their content in one go.
    When you get back, you only need to select a folder and it will already be cached with high-quality previews. Sort and rate, delete, or do whatever you want, and even use your own workspaces. It might take a little time to read the exported cache file when you point to that folder, but then you can browse through all the files with HQ thumbs.

  • Mountain Lion Finder "unresponsive" (Beach balls) when trying to copy large amount of files

    Before Mountain Lion, using Lion, transferring 999 files from my compact flash card to my SSD hard drive via a FW800 reader was SO easy. Select all, drag and drop, and the transfer would begin immediately.
    Now, after upgrading to Mountain Lion, I experience at least 60 seconds of beach balls and an "unresponsive" Finder. Sometimes it starts to transfer; sometimes I have to relaunch Finder.
    I've reset the PRAM (although I didn't hear 4 beeps) and the SMC, and I verified and repaired disk permissions.
    Any thoughts on this?
    Thank you,
    (Late 2011 15 in MBP)


  • WebUtil_File.File_Open_Dialog VERY SLOW with large amounts of files

    Dear Friend,
    We are using WebUtil_File.File_Open_Dialog on the client to allow for the selection of an image from a folder on the server, which contains around 2,000-3,000 files.
    It takes around 10-15 minutes to display the files within the folder. This is obviously unacceptable, and I was wondering if anyone had come up with a viable solution to this problem, as I noticed similar issues had been posted in the past and no solution was suggested.
    The function call for the above operation is: :control.fichier := WebUtil_File.File_Open_Dialog(:global.path||'images\bdsurvey\', '', '*.jpg |*.jpg| (All files)*.* |*.*|', 'Nom du fichier photo');
    Many thanks,
    John

    This can be done by the following:
    Tuning the file system: reducing/filtering unwanted files, defragmentation, FAT (NTFS might consume more time).
    I am not sure whether you can do anything for this in the AS.
    Alternatively, there are some Java classes which may be used instead of WebUtil.

  • IWeb claiming to be missing a large amount of files.

    After experiencing errors when trying to publish iWeb to a local server, I first backed up, then cleared the iWeb cache, preferences, and Domain file after reading it might help. Ironically, now I have a much bigger issue: I returned a backed-up copy of the Domain file to Library > Application Support > iWeb, but upon opening iWeb back up again, I get a "The following errors occurred while trying to open this document" box with a long list of missing media files (.png, .jpg) along with templates, etc.
    I've checked random media files within the domain file and they all give the user read & write access. I even tried returning the preferences and cache .plist files to their rightful places. I've spent a day and a half now trying to figure this out, but to no avail. I realize weaning off of iWeb eventually is imminent, but for now I simply need the pages to be editable in iWeb as before. Transferring them to the 3rd-party app I tried, Sandvox, strips many page elements.
    I tried loading the domain file onto iWeb on another machine and the same error occurs. This is some of what the log in Console is telling me:
    10/15/13 3:56:54.076 PM iWeb[92256]: <MM_MMTransaction: 0x1185ca60>(transactionID=CAAE0A18-9E85-4764-984F-D575B5CB44D5 uri=/internetservices/iapps/publishConfiguration08.plist transactionState=kDMTransactionHadError httpStatusCode=-1 contentLength=-1 bytesTransferred=-1 isSuccessful=NO isFinished=YES)
    10/15/13 3:56:54.512 PM iWeb[92256]: Error getting configuration data
    10/15/13 3:56:54.523 PM iWeb[92256]: Error getting configuration data
    10/15/13 3:57:47.387 PM iWeb[92256]: <MM_MMTransaction: 0x4a11960>(transactionID=BB085FEF-963B-4354-87C4-E8828ECD3D33 uri=/internetservices/iapps/publishConfiguration08.plist transactionState=kDMTransactionHadError httpStatusCode=-1 contentLength=-1 bytesTransferred=-1 isSuccessful=NO isFinished=YES)
    10/15/13 3:57:47.388 PM iWeb[92256]: Error getting configuration data
    10/16/13 11:00:51.445 AM iWeb[6181]: ImageIO: CGImageDestinationAddImage image parameter is nil
    10/16/13 11:00:51.445 AM iWeb[6181]: ImageIO: CGImageDestinationFinalize image destination does not have enough images
    10/16/13 11:00:51.464 AM iWeb[6181]: ImageIO: CGImageSourceCreateWithData data parameter is nil
    10/16/13 11:00:51.470 AM iWeb[6181]: ImageIO: CGImageSourceCreateWithData data parameter is nil
    Thank you in advance for any help whatsoever, I appreciate it.
    iWeb version 3.0.4; OS X 10.8.4

    Try the following:
    move the domain file from your Home/Library/Application Support/iWeb folder to the Desktop.
    launch iWeb, create a new test site, save the new domain file and close iWeb.
    go to your Home/Library/Application Support/iWeb folder and delete the new domain file.
    move your original domain file from the Desktop to the iWeb folder.
    launch iWeb and try again.
    In Lion and Mountain Lion the Home/Library folder is now invisible. To make it permanently visible enter the following in the Terminal application window: chflags nohidden ~/Library and press the Return key - 10.7: Un-hide the User Library folder.
    To open your domain file in Lion or Mountain Lion or to switch between multiple domain files Cyclosaurus has provided us with the following script that you can make into an Applescript application with Script Editor. Open Script Editor, copy and paste the script below into Script Editor's window and save as an application.
    do shell script "/usr/bin/defaults write com.apple.iWeb iWebDefaultsDocumentPath -boolean no"
    delay 1
    tell application "iWeb" to activate
    You can download an already compiled version with this link: iWeb Switch Domain.
    Just launch the application, find and select the domain file in your Home/Library/Application Support/iWeb folder that you want to open and it will open with iWeb. It modifies the iWeb preference file each time it's launched so one can switch between domain files.
    WARNING: iWeb Switch Domain will overwrite an existing Domain.sites2 file if you select to create a new domain in the same folder.  So rename your domain files once they've been created to something other than the default name.
    OT

  • I have a large amount of files to be compressed...

    AVI format files that I'm changing to DV NTSC so that I can edit them in FCP. They all have the same output settings and the same destination. Is there a way to apply the same settings to several files at once, instead of having to drag them to each individual file? It gets really monotonous, REALLY quickly. I'm just trying to find a way to streamline it.
    Message was edited by: John Rose2

    Click the batch window to make it active, then select a clip (it should be highlighted in blue).
    Press Command-A to select all, then drag a preset onto the selection. If dragging doesn't work, with all of your clips selected, go to the "Target" menu in the menu bar, then choose "New Target With Setting...". This will open a dialogue box similar to the settings window; find your preset and click OK.

  • How to Number Large Amount of Files?

    Hi. I have a folder with about 600 funny photos that I randomly saved. And you know how pictures you save from sites have random long numbers as names? Well, here is what I want: I want to know if there is a way to automatically number all the photos starting from 1. Can you tell me if there is a way to do it that is already in Mac OS X Lion 10.7.3, or recommend some software, please? Thank you.

    Nevermind, I just noticed you had your email address on your profile. I just sent you the file. It was created a while ago using an older version of Mac OS X - but still works. You're looking for the function 'make sequential'. Just use my file as a reference and I'm sure you'll figure it out yourself in no time. I created my template to automatically rename TV Shows and it actually uses a variable that I put in manually (automator doesn't support that kind of variable, you have to do it by hand in the code - but you can just ignore that for now).
    Hope this helps!
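    If Automator isn't an option, the same sequential numbering can be done with a few lines of Python. A sketch (it assumes the folder contains only the photos to rename, and no files already named 1.*, 2.*, and so on, since those would be overwritten):

```python
from pathlib import Path

def number_photos(folder, start=1):
    """Rename every file in `folder` to 1.xxx, 2.xxx, ... in
    alphabetical order, keeping each file's original extension."""
    files = sorted(p for p in Path(folder).iterdir() if p.is_file())
    for n, p in enumerate(files, start=start):
        p.rename(p.with_name(f"{n}{p.suffix.lower()}"))
```

    Run it once on a copy of the folder first to make sure the result is what you expect, since the renames are not reversible.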

  • Why does Safari 5.1.2 crash when a large amount of files is put in the "automatically import" folder?

    I'm trying to consolidate a few small libraries and condense and remove all the lower-res duplicates.
    Sometimes it works, but Safari crashes when I put more than a few files for import into the automatically import folder that is inside the iTunes media folder. Any ideas? Sometimes it leaves empty artist-name folders inside this auto-import folder; do those crash it, or are they the result of the crash?
    please help!
    Jason

    Also, deleting prefs doesn't help! --> FCP Rescue and so on
    ....

  • Transferring large volume of files from mac to PC?

    Hi, I have a Mac with OS X 10.4.8 and a PC, and I need to transfer a large amount of files from the Mac to the PC (around 200 GB).
    Now, I have lots of external HDs, which are all used by either the PC or the Macs. The ones formatted for the Macs cannot be read at all by the PC without some expensive software; the PC-formatted one I have appears to be readable on the Mac, but when I try to copy the files onto it or change anything, I am not able to change/add anything due to permissions problems.
    Not sure what to do. I have quite a few HDs with plenty of space, but if they are all in the wrong format I can't really wipe/reformat them easily with files on them, nor do I want to buy yet another HDD just to transfer some files....
    Any ideas/advice?

    https://discussions.apple.com/docs/DOC-3003

  • File Bundle with large number of files failed

    Hi!
    Well, I thought problems would appear. We have some apps that we distribute just by copying a large amount of files (not large in size) to Windows (XP Pro, usually) machines. These are programs which work from a directory without any special need for installation. A happy situation for an admin, from one side. In ZfD 4.0.1 we installed such an app on one of the machines, took a snapshot via the special app (who remembers it), copied the files to a (NetWare) server share, gave rights to the device (~ workstation), associated it with the ws via eDir and ... voila, on the next restart or whatsoever the app was there. Very nice, indeed; I miss this!
    So, I tried to make this happen on ZCM 10 (on SLES 11). Made the app, sorry, bundle, uploaded the files (the first time it got stuck; the second time it succeeded, around 7,500 files) and made the distribution/launch association to the ws (~ device). And ... got errors. Several entries from the log as examples below.
    Any ideas?
    More thanks, Alar.
    Error: [1/8/10 2:41:53 PM] BundleManager BUNDLE.UnknownExceptionOccurred An Unknown exception occurred trying to process task: Novell.Zenworks.AppModule.LaunchException: Exception of type 'Novell.Zenworks.AppModule.LaunchException' was thrown.
    at Novell.Zenworks.AppModule.AppActionItem.ProcessAct ion(APP_ACTION launchType, ActionContext context, ActionSetResult previousResults)
    Error: [1/8/10 2:41:54 PM] BundleManager ActionMan.FailureProcessingActionException Failed to process action: Information for id 51846d2388c028d8c471f1199b965859 has not been cached. Did you forget to call CacheContentInfo first?

    ZCM 10 is not efficient at handling that number of files in a single bundle when they are in the content repo.
    Suggestions include zipping the files and uploading the zip to the content repo, then downloading and extracting it as part of the bundle.
    Or use the "Copy Directory" option to copy the files from a network source directly, like you did in ZDM.
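    The zip suggestion can be scripted so the bundle carries one archive instead of ~7,500 individual content items. A minimal Python sketch (the directory and archive paths are placeholders):

```python
import zipfile
from pathlib import Path

def zip_payload(src_dir, zip_path):
    """Pack a directory tree into a single compressed zip,
    storing paths relative to the source directory."""
    src = Path(src_dir)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(src.rglob("*")):
            if f.is_file():
                zf.write(f, str(f.relative_to(src)))
```

    The bundle then only needs a download action plus an extract-to-directory action on the device.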

  • Redistributing Large Amounts of Media to Different Project Files

    Hello
    I'm working on a WWI documentary that requires an extremely large amount of media. I was just wondering if (besides exporting hundreds of QT files) I could perhaps just capture all of this media, spread out, into several FCP project files?
    In the screenshot below, note that there are 2 FCP projects open at one time. Why? Because I thought it might help to redistribute all the captured media into more than just one FCP project file. Is this advisable? Would creating more FCP files containing more media actually be better than capturing soooo many archival clips into just one main FCP project file?
    Or
    Should I be exporting 'Non-Self Contained' files to be re-imported back into the Project?
    Screen Shot;
    http://www.locationstudio.net/Redistribute-1.jpg
    Thanx
    Mike

    It is absolutely advisable. This keeps project sizes down to manageable levels. This is one of the tips I give in my tutorial DVD on Getting Organized with Final Cut Pro.
    Bear in mind that if your footage and sequences are in different projects, you cannot match back to the clip in the Browser. Once a clip is cut into a sequence that resides in a different project, the link between that media and the original clip in its original project is broken. That can be a pain, so I try not to break up projects unless I have to. And with LARGE projects I find that I have to.
    Although it can still be a pain. When you need to find the footage that you know is in the same bin as that clip... and you don't know what that bin is? Sorry, match back doesn't work. So, just a caveat.
    Shane

  • Why, each time I try to save my day's work, version WITH mark-ups and version without, does it say "you have placed a large amount of text on the clipboard; do you want to access this?" What is the clipboard, and how can I just save my work in its proper file?

    I don't get this clipboard thing! I have created two files, one for the MS with mark-ups and one for the unmarred version, each living in its own spot in my "house." It seems to be merging the two versions, and I have to go in and re-paste, and then it always says "you have placed a large amount of text on the clipboard; do you want to be able to access this?" and it prompts me to say no, but I am afraid I'll lose my day's work, so I just tap cancel. I highlight the day's revision and select "copy" to paste into the unmarked doc before I save the marked-up or working doc. It is when I try to close that one that the clipboard issue pops up. What can you tell me about saving a doc in two places and how to NOT get it on the clipboard? Thanks! I am not super computer savvy, so layperson's language much appreciated!

    Are you using Microsoft Word? Microsoft thinks the users are idiots. They put up a lot of pointless messages that annoy & worry users. I have seen this message from Microsoft Word. It's annoying.
    As BDaqua points out...
    When you copy information via Edit > Copy, Command + C, Edit > Cut, or Command + X, you place the information on the clipboard. When you paste information, Edit > Paste or Command + V, you copy information from the clipboard to your data file.
    If you Edit > Cut or Command + X and you do not paste the information and you quit Word, you could be losing information. Microsoft is very worried about this. When you quit Word, Microsoft checks if there is information on the clipboard and, if so, Microsoft puts out this message.
    You should be saving your work more than once a day. I'd save every 5 minutes. Command + S does a save.
    Robert
