Generated Duplicate Files

We are maintaining a very large WebHelp project (over 1000
files in the WebHelp folder) and keeping the files in ClearCase.
Every time the project is updated, all of the files are treated as
though they have changed (whether they have or not). It takes a
couple of hours to check the files out and into ClearCase. What I'm
wondering is what files are actually updated (other than the
specifically affected HTML files). Are all files in the whdata,
whgdata, whxdata folders updated, only HTM files, JavaScript? All
JavaScript files in the main project directory? The CAB file? XML?
Any help would be greatly appreciated.

When RH finishes publishing it opens a window reporting how
many files were published and listing them. You can save the list
to a text file and/or print it.
Have you compared the source code from the old and new
versions and verified that RH republished a file that didn't
change?
A republished file is different in some way from the one in
the target directory.
In the whxxxx directories, RH should republish only the ones
that changed.
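One way to check this yourself is to compare the freshly published tree against the previous one byte for byte and list only the files that really differ; anything the comparison doesn't flag never needs to go back into ClearCase. A minimal sketch in Java (Java 12+ for Files.mismatch; the two paths are placeholders for your old and new WebHelp output):

    import java.nio.file.*;

    public class DiffPublish {
        public static void main(String[] args) throws Exception {
            Path oldRoot = Paths.get("C:/help/old"); // last published output
            Path newRoot = Paths.get("C:/help/new"); // freshly published output
            // Walk the new tree; report files that are new or whose bytes differ
            // from the copy at the same relative path in the old tree.
            try (var files = Files.walk(newRoot)) {
                for (Path p : files.filter(Files::isRegularFile).toList()) {
                    Path rel = newRoot.relativize(p);
                    Path old = oldRoot.resolve(rel);
                    if (!Files.exists(old) || Files.mismatch(old, p) != -1)
                        System.out.println("changed: " + rel);
                }
            }
        }
    }

Anything the sketch does not print is byte-identical to the previous publish.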
Search and Index terms are recompiled, but often some output
files are untouched, so RH doesn't republish those.
RH revises topic.htm files in ways that are not immediately
apparent. For example, each topic refers to its TOC label,
including its place in the book/sub-book structure, so a TOC
revision creates an updated file even though you left the
displayable contents as they were.
Have you tried deleting the single-source output setup, which
might have been corrupted, and creating a new one?
Harvey

Similar Messages

  • Duplicate File Handling Issues - Sender File Adapter - SAP PO 7.31 - Single Stack

    Hi All,
We have a requirement to avoid processing of duplicate files. Our system is PI 7.31 Enh. Pack 1 SP 23. I tried using the 'Duplicate File Handling' feature in the Sender File Adapter, but things are not working out as expected. I processed the same file again and again, and PO creates successful messages every time rather than generating alerts/warnings or deactivating the channel.
I went through the link Michal's PI tips: Duplicate handling in file adapter - 7.31. I have maintained similar settings but am unable to get the functionality working. Is there anything I am missing, or any setting that is required apart from the Duplicate File Handling checkbox and a threshold count?
    Any help will be highly appreciated.
    Thanks,
    Abhishek

    Hello Sarvjeet,
I had to write a UDF in message mapping to identify duplicate files and throw an exception. In my case, I had to compare the file load directory (source directory) with the archive directory to identify whether the new file is a duplicate or not. I'm not sure if this is the same case for you. See if the below helps (I used parameterized mapping to pass the file locations in from the Integration Directory rather than hard-coding them in the mapping):
    // UDF sketch: scan the archive directory for a file whose trimmed name and
    // size match the incoming file, and throw once the duplicate is confirmed.
    // directoryName, newFile and newFileSize arrive as UDF parameters
    // (parameterized mapping); imports needed: java.io.File, java.util.Arrays,
    // java.util.Collections.
    AbstractTrace trace = container.getTrace();
    double newFileSizeDouble = Double.parseDouble(newFileSize);
    int duplicateCount = 0;
    File directory = new File(directoryName);
    File[] fList = directory.listFiles();
    Arrays.sort(fList, Collections.reverseOrder());
    // Traverse all entries in the archive directory
    for (File file : fList) {
        // Only plain files are of interest
        if (file.isFile()) {
            trace.addInfo("Filename: " + file.getName() + " :: Archive File Time: " + Long.toString(file.lastModified()));
            String archiveFile = file.getName();
            // Strip the 20-character timestamp prefix added when the file was archived
            String archiveFileTrimmed = archiveFile.substring(20);
            double archiveFileSize = file.length();
            if (archiveFileTrimmed.equals(newFile) && archiveFileSize == newFileSizeDouble) {
                duplicateCount = duplicateCount + 1;
                trace.addInfo("Duplicate File Found: " + newFile);
                if (duplicateCount == 2) {
                    break;
                }
            }
        }
    }
    if (duplicateCount == 2) {
        throw new StreamTransformationException("Duplicate File Found. Processing for the current file is stopped. File: " + newFile + ", File Size: " + newFileSize);
    }
    return Integer.toString(duplicateCount);
    Regards,
    Abhishek

  • Issues: missing files, duplicate files, and not enough screen space

    Hello everyone,
    I'm using a 1.67 GHz 15 in TiBook with 1 GB of RAM.
    I had an issue yesterday with missing files. I was in the rating/keywords panel scrolling through my images; one was a bit soft, so I deleted it using the Command-Delete combo. Somehow all of the images in the project disappeared. They hadn't been moved to the trash; I looked, and looked, and looked. I couldn't find them using a search, and it should have been easy, as they were the first images that had been imported into Aperture.
    So since I had copied the 70 files to my hard drive first, I re-imported them. I couldn't see those either.
    Deleted the project and re-imported again, worked fine, and I couldn't reproduce the problem.
    When I went to export five images to my desktop so I could copy them to our server (all the images had been renamed with a sequential suffix: Smith_01, Smith_02, Smith_03...), I remembered that I needed to update a caption on one image. (Could that caption box be any smaller?) Anyway, I exported again; I meant to export only the individual image but exported all five again. Instead of telling me there were duplicate files on the desktop, it increased the sequential number of each duplicate file by 1. So now, instead of one file each of Smith_01, Smith_08, Smith_33, etc., I have the original, and the Smith_01 file now has a matching image slugged Smith_02 on the desktop as well, along with all the others that had a match with a suffix one digit higher. And of course, back in the project, there is a Smith_02 that is different.
    If I were to copy different files with the same name to other folders or our archive server, one of those images could easily be discarded or copied over by the file with the duplicate name.
    Also, when importing I tried to rename my files and add a generic caption. I imported the same images a few times; the first time I couldn't scroll down to see the caption box and metadata, the second time I could scroll and add the metadata below, and the third time I couldn't scroll down. Very weird!
    And lastly, is there any way to show just the adjustments or just the metadata boxes on the Adjustments & Filters layout? The boxes are so small already that I'd like to be able to use at least the whole side of the screen for one box. I'd really like to be able to put them where I want them. The adjustment HUD works this way in full screen, but I want to get rid of the strip of thumbnails in the full screen and just have the HUD and the image, nothing else.
    Any insights would be great.
    Thanks,
    Craig

    Hello everyone,
    <...>
    I had an issue yesterday, missing files, I was in the
    rating/keywords panel scrolling through my images,
    one was a bit soft so I deleted it using the command
    delete combo. Somehow all of the images in the
    project disappeared. They hadn't been moved to the
    trash, I looked, and looked and looked, I couldn't
    find them using a search, and it should have been
    easy, they were the first images that had been
    imported into Aperture.
    I think they were in the trash - as far as I know search does not work within the trash itself.
    Nope, not in the trash; that's the first place I looked. I also found that deleting a project is the fastest way to export the masters: just pull them out of the trash can.
    They should have been moved to the trash folder, inside an Aperture directory, and then from there inside of a folder with the name of the project the image was in.
    Alternately (and I'm not sure how this might have happened) they may have been marked as "rejected". All views have a default search criterion that starts with "unrated or higher", but rejected photos are lower than this and are not seen. By default there is a smart folder at the top of your library that shows Unrated pictures; you might check to see if they are there.
    I'll double-check the shortcut for rejects; this is the only plausible explanation, but it doesn't explain why the second import of images couldn't be found either.
    So since I had copied the 70 files to my hard drive
    first I re-imported them. Couldn't see those
    either.
    Deleted the project and re-imported again, worked
    fine, and I couldn't reproduce the problem.
    Well, I'm sorry to say I have no theories to offer there. Just glad it worked eventually!
    When I went to export five images, (all the images
    had been re-named with a sequential suffix, Smith_01,
    Smith_02, Smith_03...), to my desktop so I could copy
    to our server, I remembered that I needed to update a
    caption on one image. (Could that caption box be any
    smaller?), anyway I exported again, I meant to only
    export the individual image but exported all five
    again. Instead of telling me there were duplicate
    files on the desktop it increased the sequential
    number of each duplicate file by 1. So now instead
    of one file each of Smith_01, Smith_08, Smith_33, etc.,
    <...>
    Yes, Aperture will refuse to overwrite any file, ever - and will always add those other numbers if it encounters a file with the same name as the one it is trying to generate!
    I agree that this is a good thing, but there should be an alert of some kind if you have a duplicate. The Finder certainly tells you if you are trying to copy a duplicate file to the same location. Adding an increment of one to the sequential suffix without an alert is very poor planning. On deadline, trying to get files into our system, I could have easily overlooked the duplicates and overwritten other files with the same name.
    The only way to work around that is either to delete all files from the directory before you export from Aperture, or to export to an empty directory and copy them over yourself.
    The work around is easily figured out, but like some other things I shouldn't have to work around it.
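    From that description, Aperture's export naming presumably behaves like the following sketch (hypothetical Java, not Apple's actual code): if the target name is taken, bump the numeric suffix until a free one is found, without ever warning the user that the name changed.

    import java.io.File;

    class ExportNaming {
        // If "Smith_01.jpg" exists, try "Smith_02.jpg", then "Smith_03.jpg",
        // and so on -- silently, which is exactly the complaint above.
        static File uniqueTarget(File dir, String base, int n, String ext) {
            File candidate = new File(dir, String.format("%s_%02d%s", base, n, ext));
            while (candidate.exists()) {
                n++;
                candidate = new File(dir, String.format("%s_%02d%s", base, n, ext));
            }
            return candidate;
        }
    }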
    <...>
    And lastly, is there any way to show just the
    adjustments or just the metadata boxes on the
    Adjustments & Filters layout? The boxes are so small
    already I'd like to be able to at least use the whole
    side of the screen for one box. I'd really like to
    be able to put them where I want them. The
    adjustment HUD works this way on a full screen but I
    want to get rid of the strip of thumbnails in the
    full screen and just have the HUD and the image,
    nothing else.
    Well, another option there would be to drag the divider between the metadata and the adjustments all the way up (as much as you can anyway; the histogram will still be visible).
    That would be wonderful, but I'm on a 15 in TiBook and there isn't enough space to do that. The divider shows up but you can't move it. Ideally you should be able to view just the metadata or just the adjustments if you so choose. Binding them together is stupid: if I'm making adjustments, I don't care what is happening in the metadata screen, and when I'm adding captions and changing metadata I don't need to see the adjustments screen. Give me a metadata template like the IPTC screen in Photomechanic that I can save and apply, and put it on the screen where I want it.
    Then to see adjustments just use the adjustment HUD - key "H" to bring it up or dismiss it. It will stay where you put it when you make it go away and come back, and you can have it wherever you like on the screen, unlike the inspector.
    I'd love to do that but it can't be done on a 15 in laptop screen.
    For maximum space value, you could arrange the Adjustments HUD to exactly overlay the space where the Inspector is - then you can press "I" to make the inspector come and go, and "H" for the Adjustments HUD, all in the same space.
    For maximum space on adjustments I'm using the full screen with the HUD. At least I can then move the HUD where I want it to be, not anchored on the right side.
    In general, look over the keyboard shortcuts, as there are a lot of keys that make panels come up and go away quickly (one key, no modifiers), which really helps to maximize use of screen space.
    I'm still learning them and I'm sure that will help, but the adjustments panel and the metadata panel have to be changed to make it more usable on a laptop, in my opinion. Maybe it's fine on a large monitor; I wouldn't know, as I can't load it on my G5 desktop without a hack. And so far I'd rather use Photomechanic for sorting and captioning; it's much, much faster for newspaper work.

  • Report Generates Duplicate Rows

    Each,
    I have a report which, when generated and run as a 9i report on Linux, generates duplicate rows, whereas the same report running on 6i client/server works perfectly fine, running against the exact same database.
    The report generates a delimited CSV file.
    On Client Server
    Reports 6.0.8.12.1
    Windows XP
    On 3 tier
    Reports 9.0.2.3.0
    Red Hat Linux
    Database 9.2.0.4.0
    Examples here. This line was generated in client/server:
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     0     0     0     0
    Yet on 3-tier 9i (also note the asterisks):
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     **     0     0     0
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     **     0     0     0
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     **     0     0     0
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     **     0     0     0
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     **     0     0     0
    Feb-06     ELDC     DATA TRANSFER     0     0     0     0     0     **     0     0     0
    Any ideas what is happening here?

    There are two ways you can work around this bug.
    1. In the Layout Model, open Main Section > Property Palette and set Report Height to 9999. Do this after you complete creating the layout; after finishing development of the report, just change the report height to 9999.
    2. Set the page height to 0 in the report .prt file and use that file name in the system parameters (desformat = *.prt file). You have to copy this file to the server if you are running web reports. This will take care of all your delimited reports.
    To get rid of the asterisks (which usually mean the value does not fit the field's format mask), try adding to_char for all the amount columns.
    Also, the font on Red Hat Linux may not be an exact match for the font in the report.

  • DME - Duplicate File

    Hi Guys
    While running the RFFOUS_T program, there was a memory issue in our production system and the run was cancelled. In the spool (through SM37) I see data for 2 employees with their bank and amount information.
    When we fixed the memory issue and re-ran the DME program (RFFOUS_T), it created the file and the bank also received the ACH file. But when I tried to see the spool through SM37, I see nothing and get a warning message that there may be duplicate files.
    My questions:
    1. When DME was run the 1st time, where does it create a flag that it was run the 1st time? I think I should have either unchecked the flag before running the DME the 2nd time so that the spool could have been created, or deleted the truncated file that was created when DME was run the 1st time - how could I have done that?
    2. Is there a way now to get the spool for the DME from when it ran successfully on the 2nd attempt?
    3. If DME does not run successfully on the 1st attempt, what actions should I have taken so that when DME was run the 2nd time, it would have created the spool (SM37)?
    I appreciate your suggestions / help.
    W.

    This may not have anything to do with DME being run more than once. Regardless of whether you write the file to the application server or to TemSe, the DME file name should be unique, as a date and time stamp are part of the file name.
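    In other words, each run produces a distinct name because the run's date and time are embedded in it, along the lines of this sketch (an illustrative naming scheme, not the actual RFFOUS_T logic):

    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;

    class DmeFileName {
        // Build a file name that is unique per run because the date and
        // time stamp are part of the name, e.g. DME_20090327_092032.txt
        static String build(String prefix) {
            String stamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyyMMdd_HHmmss"));
            return prefix + "_" + stamp + ".txt";
        }
    }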
    This might just be a BASIS issue where spool file names are being duplicated because the spool number range ran out of numbers (I remember running into duplicate spool files a year or so ago).

  • Application running in a Solaris 9 container generating core files. What to do?

    My Solaris 9 zone configuration in Solaris 10 looks like:
    zonecfg:sms> info
    zonename: sms
    zonepath: /zone/sms
    brand: solaris9
    autoboot: true
    bootargs:
    pool:
    limitpriv: default,proc_priocntl,proc_clock_highres,proc_lock_memory,sys_time,priv_proc_priocntl,priv_sys_time,net_rawaccess,sys_ipc_config,priv_proc_lock_memory
    scheduling-class:
    ip-type: exclusive
    hostid:
    [max-shm-memory: 4G]
    [max-shm-ids: 100]
    [max-sem-ids: 100]
    fs:
      dir: /var
      special: /dev/dsk/c1t0d0s5
      raw: /dev/rdsk/c1t0d0s5
      type: ufs
      options: []
    net:
      address not specified
      physical: bge0
      defrouter not specified
    device
      match: /dev/dsk/c1t0d0s5
    device
      match: /dev/rdsk/c1t0d0s5
    device
      match: /dev/dsk/c1t0d0s6
    device
      match: /dev/rdsk/c1t0d0s6
    device
      match: /dev/dsk/c1t0d0s7
    device
      match: /dev/rdsk/c1t0d0s7
    capped-cpu:
      [ncpus: 2.00]
    capped-memory:
      physical: 4G
      [swap: 8G]
      [locked: 2G]
    attr:
      name: hostid
      type: string
      value: 84b18f64
    attr:
      name: machine
      type: string
      value: sun4u
    rctl:
      name: zone.max-sem-ids
      value: (priv=privileged,limit=100,action=deny)
    rctl:
      name: zone.max-shm-ids
      value: (priv=privileged,limit=100,action=deny)
    rctl:
      name: zone.max-shm-memory
      value: (priv=privileged,limit=4294967296,action=deny)
    rctl:
      name: zone.max-swap
      value: (priv=privileged,limit=8589934592,action=deny)
    rctl:
      name: zone.max-locked-memory
      value: (priv=privileged,limit=2147483648,action=deny)
    rctl:
      name: zone.cpu-cap
      value: (priv=privileged,limit=200,action=deny)
    The Solaris 9 zone's /etc/system file looks like:
    * The directive below is not applicable in the virtualized environment.
    * The directive below is not applicable in the virtualized environment.
    * The directive below is not applicable in the virtualized environment.
    * The directive below is not applicable in the virtualized environment.
    * The directive below is not applicable in the virtualized environment.
    * The directive below is not applicable in the virtualized environment.
    set noexec_user_stack=1
    set semsys:seminfo_semmni=100
    set semsys:seminfo_semmns=1024
    set semsys:seminfo_semmsl=256
    set semsys:seminfo_semvmx=32767
    set shmsys:shminfo_shmmax=4294967295
    set shmsys:shminfo_shmmin=1
    set shmsys:shminfo_shmmni=100
    set shmsys:shminfo_shmseg=10
    set rlim_fd_max=65536
    set rlim_fd_cur=60000
    * The directive below is not applicable in the virtualized environment.
    My questions are:
    1. An application running in the Solaris 9 container is generating core files. What should I do?
    2. prstat -Z for the zone shows almost 95% CPU usage. What should I do?
    3. Could you share how to move Solaris 9 into Solaris 10 containers?

    Based on the new questions you added to the same post in other communities, some posts were removed as duplicates. Here is the answer:
    For point #3, please look at table 17-1 in the following URL:
    Zone Components - System Administration Guide: Oracle Solaris Containers-Resource Management and Oracle Solaris Zones
    You can also customize your container's /etc/system file, but it cannot exceed the global zone and zone configuration values.
    For the other point, #2, this can be complicated without a complete picture of what the whole system does.
    First, check that you do not have a busy process in your zone; then check whether a bottleneck exists on the I/O side. You may be using wrong parameters or a wrong configuration, or your system configuration may be insufficient in terms of resources.
    What I can see in the outputs you provided is that the S9 zone uses half of the swap space. This can impact your zone's performance and I/O activity, and can in this case have a side effect on some processes. Check why your zone uses the swap and how you can remedy this.

  • How to delete duplicate files and fix broken links

    Problem: Duplicate Files and Broken Links
    This involves 2 IBM ThinkPads: (1) one main partition (system drive C:) running Windows XP SP3; (2) running Windows 7, with two partitions - drive C: has the library while drive D: contains the "iTunes Media" folder.
    Issues:
    On both laptops, the iTunes media folder contains duplicate copies of purchased songs and movies (some purchased prior to iTunes 9), imported or ripped songs from CDs, and redeemed digital copies of movies. Everything I have tried results in broken links and duplicate files. Is there a method or a program which will allow me to disentangle this mess? I have spent hours deleting files, not using the iTunes "copy" function, and so on, but the problem continues. At times I simply run out of disc space and have to start the process all over again. I have spoken to someone in Apple's iTunes tech support and was told this was a "Windows" problem?!?
    Any suggestions or advice that would help me to untangle this mess would be greatly appreciated. Is taking the ThinkPads to a PC repair shop the only answer? How would they fix the problem?
    Thanks all, PrimeSequence.

    PrimeSequence wrote:
    Thanks 'turingtest2' for your suggestion. I know the location of the files, but iTunes keeps duplicating them as soon as I launch iTunes.
    You know the location of your files, but it sounded like iTunes doesn't. The FindTracks script is designed to fix the broken links.
    I doubt uninstalling and reinstalling will change anything. Uninstalling iTunes does not affect the contents of your media library or media folders; the files stay put.
    iTunes shouldn't generate much in the way of new files just through launching it... Exceptions include artwork files for tracks it knows it has yet to scan and any downloads related to podcast subscriptions.
    The only other thing I can think of would be some sort of error with the Automatically Add to iTunes folder. Any files that are in there should be moved to their normal locations within the media folder layout, not copied, and then added to the library. I suppose the processing order could be add, then move; if for some reason the move fails, then the next time you start up iTunes they could be added again. So do you have any files in the Automatically Add to iTunes folder? If not, is there any other pattern you can see to this duplication? Do you have any software that tries to add files to iTunes automatically?
    Actually, and in a similar vein, does this duplication happen on both machines or just the Windows 7 one? Vista introduced new "default" locations for user profiles plus a bunch of symbolic links that automagically redirect attempts by older software to look for things in the default XP locations. As a result your Music folder, for example, can appear at multiple locations within the tree of folders on your C: drive. I suppose it is just possible that some form of recursive nightmare could be going on there.
    FWIW it is much easier to manage a self-contained library where the media folder is inside the library folder.
    tt2

  • Problem with duplicate files being created

    I'm having a problem with PSE7 copying files from my camera card to my computer.
    It will copy over all new JPG files fine, but once it gets to the NEF (RAW) files it will not only copy the new files from today but also recopy all those NEF files from other days that are already downloaded, so that I end up with files like this:
    DSC_2200.NEF
    DSC_2200-1.NEF
    DSC_2200-2.NEF and so on. It's very annoying, and I have to repeatedly delete all the duplicate files, which takes time. I do use the global search function in Windows to delete them but would prefer they not be created in the first place. They do eat up a lot of space.
    I have the settings set to only copy over new files in PSE, so why is it copying over older files? Does anyone have any clues?

    You can set up a scan when you create a new device in System Preferences:
    go to System Preferences --> FCSVR Pref pane --> Devices pane, then choose create new device (the "+" sign; the Device Assistant will appear. You choose the type, and then you can set up a Full or Add Only scan here.)
    The other option is to set up a scan via the Administration window in the Java Client:
    This is not as easy, since there is no Device Assistant.
    go to the Admin window in the Java Client --> Response, and then create a new 'Scan' response on the device you want to scan. 'Scan Productions' is something different; maybe you want that instead, I don't know. Depending on how you set it up, it can create a Production in the FCSVR catalogue for each folder or subfolder, and that media will be scanned and placed into that Production in the FCSVR catalogue.
    Anyway, once you create the Scan response, go to the Admin pane --> Schedule. Create a new schedule and add the 'Scan' response you just created to the "Response List" section. Don't forget to check "Enabled"... I can't tell you how many times I created the response but then forgot to enable it.
    Once you scan the assets, they are in the FCSVR catalogue with clip proxies, thumbnails and poster frames. If you have existing FCP projects that use this media, you will want to make sure the media in the FCP project is connected to that same media that was scanned. When you upload the FCP project, it will not dup the assets, just add the FCP project to them, unless you didn't set the EIP device correctly. If you look in Search All Jobs, the only thing that should be generated at this point is Edit Proxies (if you enabled them) and Elements.
    Now my question to you is the same as Chris' question here http://discussions.apple.com/thread.jspa?messageID=9147105#9147105. How did you set things up so far? What is your workflow? Where is your media?

  • Generating Duplicate Column Names in CMPs

    Hi,
    We are having trouble with the automatic key generation of the CMP beans. We are using java.lang.Object as the primary key class, and the autogenerated column is ejb_pk, but it still doesn't work.
    The generated class files show the insert and update statements with duplicate column names.
    The exception trace is:
    Caused by: com.sap.engine.services.ejb.exceptions.BaseEJBException: SQLException while the data is being flushed.
    The persistent object is com.ejb.PriorityCMP220Persistent.
    at com.sap.engine.services.ejb.entity.pm.UpdatablePersistent.ejbFlush(UpdatablePersistent.java:106)
    at com.sap.engine.services.ejb.entity.pm.TransactionContext.flushAll(TransactionContext.java:380)
    at com.sap.engine.services.ejb.entity.pm.TransactionContext.flush(TransactionContext.java:343)
    at com.sap.engine.services.ejb.entity.pm.TransactionContext.beforeCompletion(TransactionContext.java:446)
    at com.sap.engine.services.ts.jta.impl.TransactionImpl.commit(TransactionImpl.java:230) ... 50 more
    Caused by: java.sql.SQLException: [SAP_Portals][SQLServer JDBC Driver][SQLServer]Column name 'PRIORITY_ID' appears more than once in the result column list.
    We are using the p9 SQL Server driver. Is it something related to a driver problem or some configuration issue?
    Please suggest.
    Thanks,
    Mukul.
    Edited by: Mukul Mangla on Mar 25, 2009 2:03 PM

    As stated by Ken, the Organizer does not consider duplicate file names to be duplicate files: the full path with folder hierarchy is taken into account. I have the same situation as you, with my camera limited to 10,000... I have been working with this fact since 1999, and I have no organization problem with it. There is absolutely no necessity to rename in my case, but you may have other requirements?
    If so, it's good to know that you can use the downloader to rename at the import stage (hence Ken's questions about the way you import).
    You mentioned batch renaming: that can only be done from within the Organizer, and I would seriously recommend thinking twice before deciding to do so; a full backup beforehand is a must.

  • Specific duplicate file issue

    Hi, I have many thousands of files on several hard drives, which have been backed up and duplicated over the years, sometimes to different directories and sometimes with file names changed.
    Is there an easy way, or a tool, that can assure me quickly that "all files in directory X are also present in directory Y"?
    Because there are so many files involved, I'd prefer NOT to use a program like Find Duplicate Files, for example, because it would just generate a huge long list of duplicates. I already KNOW there are many, many duplicate files, I just want to know, is there anything, say, on the disc that ISN'T a duplicate? (So then I could wipe the drive without worrying that I've lost a lone copy of one tiny file.)
    In a way, I guess what I'd like is a "reverse duplicate finder." Assume these are all duplicates, but what ISN'T?
    Thanks in advance for any tips!
    Steve

    If you're not afraid of the command line, rsync can do what you ask; it can even do "dry runs" to test the water. It can check files using checksum methods and "fuzzy" logic, allowing you to find renamed files.
    From the rsync man page:
    DESCRIPTION
    Rsync is a fast and extraordinarily versatile file copying tool. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. It offers a large number of options that control every aspect of its behavior and permit very flexible specification of the set of files to be copied. It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination. Rsync is widely used for backups and mirroring and as an improved copy command for everyday use.
    Rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file's data does not need to be updated.
    http://www.innovatingtomorrow.net/2008/01/10/using-rsync-backup-files-basics
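    If you'd rather script the check directly, here is a minimal Java sketch of a "reverse duplicate finder": it hashes everything in the directory you plan to keep, then reports any file on the suspect drive whose contents appear nowhere in the keeper, regardless of name or location. The two paths are placeholders.

    import java.nio.file.*;
    import java.security.MessageDigest;
    import java.util.HashSet;
    import java.util.Set;

    public class ReverseDupFinder {
        // Hex-encoded MD5 of a file's contents.
        static String md5(Path p) throws Exception {
            byte[] d = MessageDigest.getInstance("MD5").digest(Files.readAllBytes(p));
            StringBuilder sb = new StringBuilder();
            for (byte b : d) sb.append(String.format("%02x", b));
            return sb.toString();
        }

        public static void main(String[] args) throws Exception {
            Path keep = Paths.get("/Volumes/Archive");    // the copy you trust
            Path suspect = Paths.get("/Volumes/OldDisk"); // the drive to wipe
            // Collect the checksums of everything already in the keeper.
            Set<String> known = new HashSet<>();
            try (var ys = Files.walk(keep)) {
                for (Path p : ys.filter(Files::isRegularFile).toList())
                    known.add(md5(p));
            }
            // Anything on the suspect drive with an unknown checksum is NOT a duplicate.
            try (var xs = Files.walk(suspect)) {
                for (Path p : xs.filter(Files::isRegularFile).toList())
                    if (!known.contains(md5(p)))
                        System.out.println("lone copy: " + p);
            }
        }
    }

    If the sketch prints nothing, every file on the suspect drive exists somewhere in the keeper.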
    A GUI app that is very capable of doing advanced file backup chores is Chronosync. It has
    many rsync like features built in.
    http://www.econtechnologies.com/pages/cs/chrono_overview.html
    Kj

  • 8 GB touch updated to 4.2.1, problems syncing apps and music; gives multiple errors: device times out, internal device error, network times out, duplicate file, too many files open. iTunes diagnostics OK. What to do?

    I have an 8 GB iPod touch which has had problems trying to play games and downloading update 4.2.1. I went to the Apple Store, where they kindly downloaded the software update, but now it can't sync and generates multiple error messages: device times out, internal device error, network times out, duplicate file, too many files open, etc. I have the latest iTunes download and the diagnostic test is OK. What can I do?

    Does the ext directory have php_oci8.dll? In the original steps the PHP dir is renamed. In the given php.ini the extension_dir looks like it has been updated correctly. Since PHP distributes php_oci8.dll by default, I reckon there is a very good chance that the problem was somewhere else. Since this is an old thread, I don't think we'll get much value from speculation.
    -- cj

  • Duplicate File Format options in Save As Dialog

    I am getting duplicate file format options in my Save As dialog. I am running Photoshop 12.0.1 on Mac OS X 10.6.4.

    Thanks for the answer. I have the same problem; it happened after I uninstalled CS4 (I'm using CS5 now). However, I can't seem to find where the duplicates are located. Any tips as to where I should look?
    Thanks in advance!

  • Reports 6.0 and Parameter Lists and Generate to File

    I am using the run_product built-in from Forms 6.0 and opening up a report, passing it several parameters via a parameter list. Everything works great when previewing the report.
    There is an option in the report preview under File -> Generate to File. When I generate a report to file using any type of format, it appears that the report does not use the parameters that I passed in originally from the form. It appears that it loses all the parameters I passed in. This is most concerning to me. Am I doing something wrong, or is this a "feature" I didn't know about? I really would like users to have this ability.

    Yes, I guess this will work, but the option to generate to file is extremely misleading if you ask me. This option should generate the current report with the current parameters. This is unacceptable as far as I am concerned and should be considered a bug. Oracle needs to give us more control over FORMS and REPORTS; in all too many situations I have been frustrated because I am not able to do something that I want to do.
    I feel that, in general, the REPORTS object is very limited compared to Crystal Reports....
    Dan Paulsen (guest) wrote:
    : Give the user the option on the calling form whether to save the report to file or just view it. If they want to save to file, pass the parameter to save to file when you call the report and suppress the parameter form; this will eliminate the problem.
    : Spencer Tabbert (guest) wrote:
    : : I am using the run_product built-in from Forms 6.0 and opening up a report, passing it several parameters via a parameter list. Everything works great when previewing the report.
    : : There is an option in the report preview under File -> Generate to File. When I generate a report to file using any type of format, it appears that the report does not use the parameters that I passed in originally from the form. It appears that it loses all the parameters I passed in. This is most concerning to me. Am I doing something wrong, or is this a "feature" I didn't know about? I really would like users to have this ability.

  • Adogjf Unable to generate jar files under JAVA_TOP error while patching

    Hi,
    I'm upgrading my version of Java from 1.3.1_19 to 1.5.0_17. I downloaded the jdk-1_5_0_17-linux-i586.bin file and unpackaged it to /u0/<SID>/<SID>db/10.2.0. It unpackaged it to a new directory jdk1.5.0_17 (/u0/<SID>/<SID>db/10.2.0/jdk1.5.0_17), whereas the old version is in a directory jdk (/u0/<SID>/<SID>db/10.2.0/jdk).
    Having done that I had to run an interoperability patch (4372996). Upon running this patch I received an error at the end:
    adogjf() Unable to generate jar files under JAVA_TOP
    There were no other errors.
    I think the problem has to do with where my new Java version is located, but I'm not sure. I'm thinking that it's still looking under the old JAVA_TOP (/u0/<SID>/<SID>db/10.2.0/jdk) when I want it to look under the new JAVA_TOP (/u0/<SID>/<SID>db/10.2.0/jdk1.5.0_17). So, should I move the contents of the new directory (jdk1.5.0_17) to the old directory (jdk), or is there a way to 'point' it to the new JAVA_TOP?
    Please let me know if I'm totally off base and there's another solution. We're currently on EBS 11.5.10.2, DB 10.2.0.4 and RHEL 4 Update 5.
    Thanks,
    Lia.

    Here you go. Hope this helps you to help me :-). Thanks.
    ** Generating the product JAR files...
    STRT_TASK: [Generate JAR files] [] [Fri Mar 27 2009 09:20:32]
    STRT_TASK: [Generate JAR files under JAVA_TOP] [] [Fri Mar 27 2009 09:20:32]
    Signing product JAR files in JAVA_TOP -
    /u0/mary/marycomn/java
    using entity Customer and certificate 1.
    Calling /u0/mary/marycomn/util/jre/1.1.8/bin/jre ...
    Successfully created javaVersionFile.
    Generating product JAR files in JAVA_TOP -
    /u0/mary/marycomn/java with command:
    adjava -mx512m -nojit oracle.apps.ad.jri.adjmx @/u0/mary/maryappl/admin/mary/out/genjars.cmd
    Reading product information from file...
    Reading language and territory information from file...
    Reading language information from applUS.txt ...
    Temporarily resetting CLASSPATH to:
    "/u0/mary/maryappl/ad/11.5.0/java/adjri.zip:/u0/mary/marycomn/util/jre/1.1.8/lib/rt.jar:/u0/mary/marycomn/util/jre/1.1.8/lib/i18n.jar:/u0/mary/marycomn/util/jre/1.1.8/lib/tools.jar:/u0/mary/marycomn/java/appsborg.zip:/u0/mary/marycomn/java/apps.zip:/u0/mary/maryora/8.0.6/forms60/java:/u0/mary/marycomn/java"
    Calling /u0/mary/marycomn/util/jre/1.1.8/bin/jre ...
    The JDK version is 1.1.8
    Validating the files/directories specified for -areas option
    Validating the files/directories specified for -outputSpec option
    Validating the directory specified for -lstDir option
    About to Analyze the input areas : Fri Mar 27 2009 09:20:36
    WARNING: Will not load stale resource unit META-INF/services/javax.xml.parsers
    WARNING: Will not load stale resource unit META-INF/services/javax.xml.transform
    Done Analyzing the input areas : Fri Mar 27 2009 09:20:43
    About to Analyze/Generate jar files : Fri Mar 27 2009 09:20:43
    About to Analyze fndnetcharts.jar : Fri Mar 27 2009 09:20:43
    Up-to-date : fndnetcharts.jar
    Done Analyzing fndnetcharts.jar : Fri Mar 27 2009 09:20:44
    About to Analyze fndtdg.jar : Fri Mar 27 2009 09:20:44
    Up-to-date : fndtdg.jar
    Done Analyzing fndtdg.jar : Fri Mar 27 2009 09:20:44
    About to Analyze fndjgl.jar : Fri Mar 27 2009 09:20:44
    Up-to-date : fndjgl.jar
    Done Analyzing fndjgl.jar : Fri Mar 27 2009 09:20:45
    About to Analyze fndjle.jar : Fri Mar 27 2009 09:20:45
    Up-to-date : fndjle.jar
    Done Analyzing fndjle.jar : Fri Mar 27 2009 09:20:47
    About to Analyze fndlrucache.jar : Fri Mar 27 2009 09:20:47
    Up-to-date : fndlrucache.jar
    Done Analyzing fndlrucache.jar : Fri Mar 27 2009 09:20:47
    About to Analyze fndgantt.jar : Fri Mar 27 2009 09:20:47
    Up-to-date : fndgantt.jar
    Done Analyzing fndgantt.jar : Fri Mar 27 2009 09:20:47
    About to Analyze fndpromise.jar : Fri Mar 27 2009 09:20:47
    Up-to-date : fndpromise.jar
    Done Analyzing fndpromise.jar : Fri Mar 27 2009 09:20:48
    About to Analyze fndforms.jar : Fri Mar 27 2009 09:20:48
    About to Generate fndforms.jar : Fri Mar 27 2009 09:20:49
    Done Generating fndforms.jar : Fri Mar 27 2009 09:20:50
    About to Sign fndforms.jar : Fri Mar 27 2009 09:20:50
    ERROR: Javakey subcommand exited with status 1
    Javakey standard output:
    Adding entry: META-INF/MANIFEST.MF
    Copyright (c) 2002 Oracle Corporation
    Redwood Shores, California, USA
    AD Java Key Generation
    Version 11.5.0
    NOTE: You may not use this utility for custom development
    unless you have written permission from Oracle Corporation.
    Javakey error output:
    Reading product information from file...
    Reading language and territory information from file...
    Reading language information from applUS.txt ...
    Successfully created javaVersionFile.
    Customer not found in database.
    java key error:
    adjava -ms128m -mx256m sun.security.provider.Main -gs /u0/mary/marycomn/java/oracle/apps/fnd/jar/fndforms.jar.tmp /u0/mary/marycomn/java/oracle/apps/fnd/jar/fndforms.jar.uns
    The above Java program failed with error code 1.
    Done Analyzing/Generating jar files : Fri Mar 27 2009 09:20:51
    AD Run Java Command is complete.
    Copyright (c) 2002 Oracle Corporation
    Redwood Shores, California, USA
    AD Java
    Version 11.5.0
    NOTE: You may not use this utility for custom development
    unless you have written permission from Oracle Corporation.
    Failed to generate product JAR files in JAVA_TOP -
    /u0/mary/marycomn/java.
    adogjf() Unable to generate jar files under JAVA_TOP
    AutoPatch error:
    Failed to generate the product JAR files
    You should check the file
    /u0/mary/maryappl/admin/mary/log/4372996.log
    for errors.

  • Don't load duplicate files to a cube through DTP

    Hi gurus... I want to know if it is possible to avoid loading the same data from files into a cube when doing this through a DTP.
    I thought of doing this with a step in a process chain that eliminates duplicate entries in the cube, but the problem is that I don't have any selection in the DTP... this is a full load.
    Regards.

    Neo,
    Do you want to avoid loading duplicate files to a cube using DTP, or avoid loading duplicates from the file to the cube?
    If it is avoiding duplicate records from file to cube, then ideally you can use a DSO in between...
