File sizes are different?

Hi Experts,
I have implemented an ESR bypass scenario.
The problem is that we are getting different file sizes in the source folder and the target folder.
How can we resolve this issue?
Please guide me.
Thanks and Regards,
Ravi Teja.

Refer to question no. 2 in the wiki below and see if that configuration helps.
Sender File Adapter Frequently Asked Questions - Process Integration - SCN Wiki

Similar Messages

  • I have a few hundred duplicates in my iPhoto library, but the file sizes are different.  So one is 1.3mb and one is 567kb.  I want to delete the smaller ones, but short of comparing each duplicate, is there a way to do this?

    I have a few hundred duplicates in my iPhoto library, but the file sizes are different.  So one is 1.3mb and one is 567kb.  I want to delete the smaller ones, but short of comparing each duplicate, is there a way to do this?  I've been looking at Duplicate Annhilator but I don't think it can do it.
    Thanks!

    I just ran a test with iPhoto Library Manager, Duplicate Annihilator, iPhoto Duplicate Cleaner, Duplifinder and Photodedupo.  I imported a folder of 5 photos into a test library 3 times, allowing iPhoto to import duplicates.  I then ran the 5 photos through a resizer to reduce their JPEG compression while keeping all other aspects of the file the same.
    None of the duplicate removal apps found the set that was reduced in the file resizer. That's probably because the file creation date was being used as a criterion, and the resized photo would have a different file creation date even though the Image Capture date was the same.
    They all found the 3 regular duplicates, and some of them would mark two and leave the 3rd unmarked.  iPhoto Duplicate Cleaner can sort the found duplicates by file size, but if the file was edited to get the reduced file size it might not be found, as it would have a different file creation/modification date.
    iPhoto Library Manager was able to find all duplicates and mark them as such if the file names were the same and the filename option was selected.  Otherwise it also missed the modified, resized version.  It allowed one to select the one photo to save before going to work on the library.
    So if a photo has been reduced in image quality or pixel size it will not be considered a duplicate.
    OT
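    A quick way to confirm which copies are true byte-for-byte duplicates (as opposed to recompressed versions like the 567 KB file) is to hash file contents. This is a hedged sketch in Python, not one of the apps mentioned above: it groups only byte-identical files, and, consistent with OT's test, a resized or recompressed copy will not match.

    ```python
    import hashlib
    import os
    from collections import defaultdict

    def find_exact_duplicates(folder):
        """Group files by SHA-256 of their contents; byte-identical files land together."""
        groups = defaultdict(list)
        for root, _dirs, files in os.walk(folder):
            for name in files:
                path = os.path.join(root, name)
                h = hashlib.sha256()
                with open(path, "rb") as f:
                    # Hash in chunks so large photo libraries don't exhaust memory
                    for chunk in iter(lambda: f.read(65536), b""):
                        h.update(chunk)
                groups[h.hexdigest()].append(path)
        # Only groups with more than one file are duplicate sets
        return [paths for paths in groups.values() if len(paths) > 1]
    ```

    A recompressed 567 KB copy of a 1.3 MB original hashes differently, so a pixel-level comparison would still be needed for those.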

  • After duplicate operation, file sizes(Checkpoint file size) are different

    Hi,
    I have some questions.
    We are testing 4-way replication. After a duplicate operation, the checkpoint file sizes are different at the OS level (du -sh).
    Is this normal?
    TimesTen Version : TimesTen Release 7.0.5.0.0 (64 bit Solaris)
    OS Version : SunOS 5.10 Generic_141414-02 sun4u sparc SUNW,SPARC-Enterprise
    [TEST17A] side
    [TEST17A] /mmdb/DataStore # du -sh ./*
    6.3G ./SAMPLE
    410M ./SAMPLE_LOG
    [TEST17A] /mmdb/DataStore/SAMPLE # ls -lrt
    total 13259490
    -rw-rw-rw- 1 timesten other 501 Aug 14 2008 SAMPLE.inval
    -rw-rw-rw- 1 timesten other 4091428864 Jan 29 02:13 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4113014784 Jan 29 02:23 SAMPLE.ds0
    [TEST17A] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize ;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36991
    PERM_IN_USE_HIGH_WATER: 36991
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5864
    TEMP_IN_USE_HIGH_WATER: 6757
    [TEST17B] side
    [TEST17B] /mmdb/DataStore # du -sh ./*
    911M ./SAMPLE
    453M ./SAMPLE_LOG
    [TEST17B] /mmdb/DataStore/SAMPLE # ls -lrt
    total 1865410
    -rw-rw-rw- 1 timesten other 334 Dec 11 2008 SAMPLE.inval
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:25 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:25 SAMPLE.ds0
    [TEST17B] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 432128
    PERM_IN_USE_HIGH_WATER: 432128
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5422
    TEMP_IN_USE_HIGH_WATER: 6630
    [TEST18A] side
    [TEST18A] /mmdb/DataStore # du -sh ./*
    107M ./SAMPLE
    410M ./SAMPLE_LOG
    [TEST18A] /mmdb/DataStore/SAMPLE # ls -lrt
    total 218976
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:22 SAMPLE.ds0
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:32 SAMPLE.ds1
    [TEST18A] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36825
    PERM_IN_USE_HIGH_WATER: 37230
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 6117
    TEMP_IN_USE_HIGH_WATER: 7452
    [TEST18B] side
    [TEST18B] /mmdb/DataStore # du -sh ./*
    107M ./SAMPLE
    411M ./SAMPLE_LOG
    [TEST18B] /mmdb/DataStore/SAMPLE # ls -lrt
    total 218976
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:18 SAMPLE.ds1
    -rw-rw-rw- 1 timesten other 4091422064 Jan 29 02:28 SAMPLE.ds0
    [TEST18B] /mmdb/DataStore/SAMPLE # ttisql sample
    Command> dssize;
    PERM_ALLOCATED_SIZE: 8388608
    PERM_IN_USE_SIZE: 36785
    PERM_IN_USE_HIGH_WATER: 37140
    TEMP_ALLOCATED_SIZE: 524288
    TEMP_IN_USE_SIZE: 5927
    TEMP_IN_USE_HIGH_WATER: 7199
    Thank you very much.
    GooGyum

    You don't really give much detail on what operations were performed and in what sequence (e.g. duplicate from where to where...), nor whether there was any workload running when you did the duplicate. In general, checkpoint file sizes among replicas will not be the same / remain the same because:
    1. Replicas are logical replicas, not physical replicas. Replication transfers and applies logical operations, and even if you try to do exactly the same thing on both sides in exactly the same order, there are internal operations etc. that are not necessarily synchronised, which will cause the size of the files to vary somewhat.
    2. The size of the file as reported by 'ls -l' represents the maximum offset that has so far been written to in the file, but the current 'usage' of the file may be less than this at present.
    3. Checkpoint files are 'sparse' files (unless created with PreAllocate=1), so the space used as reported by 'du' will in general not correspond to the size of the file as reported by 'ls -l'.
    Unless you are seeing some kind of problem, I would not be concerned about an apparent difference in size.
    Chris
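    Points 2 and 3 can be demonstrated outside TimesTen. The sketch below (an illustration, not anything TimesTen-specific) creates a sparse file whose apparent size (what 'ls -l' shows) is 10 MiB while its allocated size (what 'du' measures) stays near zero, assuming a filesystem that supports sparse files.

    ```python
    import os
    import tempfile

    # Create a sparse file: truncate() sets the apparent size without
    # writing any data, so no data blocks are allocated yet.
    path = os.path.join(tempfile.mkdtemp(), "sparse.ds")
    with open(path, "wb") as f:
        f.truncate(10 * 1024 * 1024)   # extend to 10 MiB without writing

    st = os.stat(path)
    apparent = st.st_size            # what 'ls -l' reports
    allocated = st.st_blocks * 512   # what 'du' reports (POSIX 512-byte blocks)
    print(apparent, allocated)       # allocated is far below apparent here
    ```

    Writing data into scattered offsets of such a file would raise the 'du' number without changing 'ls -l', which is exactly the pattern in the listings above.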

  • Copied Files To HDD - 'Get Info' File Sizes are different?

    Hi All ---
    -- not sure how to put it any other way.....maybe this is a normal thing?
    I've copied 161.78Gigs of data from the P.B. to Lacie HDD but 'Get Info' on target disk is only reading 161.61Gigs - all files seem to be in place, seem to be copied, working etc --- but that's about 170megs of info missing somewhere?
    Uh, what gives?
    should i trash, and re-do the whole lot again? hmm.
    thnx for any help in Adv ----
    John W.

    Hi Joe ---
    Thanks for that - have been offline since yest. -- seems the main P.B. drive decided to go AWOL on me - but just overnight though? No clue whats going on there -- have repaired disk, permissions, run onyx, etc..... just didnt want to boot, occasionally would boot from Ext. HDD - Drive Icon vanished, all sorts of nonsense.... and then returned this a.m. as if nothing had ever happened.
    I reckon HD is on its way out.... hmm.
    I'm now getting lots of beachballs. v. boring. Everything is backed up externally TWICE now (& cloned Drive TWICE??) - onto two separate drives - am just going to erase, reformat P.B. before reinstalling 10.3 & 10.3.9 -- off to see a Doctor/Repairman tomorrow.
    thanks for the link -- will def. have a go with that in the future.
    Cheers,
    J.

  • Is it possible to have same file size for different dpi?

    I changed one .TIFF file (300dpi, 1024x1332) to .jpg files at four different DPI settings. But when I checked the four resulting jpg files, I found that they are all the same file size and quality. (I also checked the properties of the files in Windows.)
    I think more DPI means more data and a larger file size. Am I wrong?
    I use Photoshop CS 5.1(64bit, WINDOWS) - which is part of my Adobe Master Collection CS5.5.
    TIFF(300dpi, 1024X1332) ->
    1. JPG(72dpi, 1024X1332) : 306KB
    2. JPG(108dpi, 1024X1332) : 306KB
    3. JPG(144dpi, 1024X1332) : 306KB
    4. JPG(600dpi, 1024X1332) : 306KB
    I tested a few more times with different files, and the same result came out (same file size for different DPI).
    Thanks in advance.

    Yes, absolutely. Great observation.  PPI does not control the number of pixels, just how big they are.
    Now, if you change the PPI in the Image Size dialog with Resample checked... then that is a different story. In that case you will be changing the pixel dimensions of the image (i.e., changing the total number of pixels making up the image) while keeping its print size.
    In your test files, you will notice all the print sizes are different, because all you were telling Photoshop to do was change the size of the pixels (if or when the image is ever printed), which is really just a bit of metadata in the file.
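    The arithmetic behind this can be made concrete. In the sketch below (a hypothetical helper, not Photoshop code), the 1024x1332 pixel dimensions from the question are fixed; the DPI value only changes the implied print size, which is why all four JPEGs came out at the same 306KB.

    ```python
    def print_size_inches(pixels_w, pixels_h, dpi):
        """Physical print size implied by a DPI tag; the pixel data is unchanged."""
        return (pixels_w / dpi, pixels_h / dpi)

    # The same 1024x1332 image at the four DPI values from the question:
    for dpi in (72, 108, 144, 600):
        w, h = print_size_inches(1024, 1332, dpi)
        print(f"{dpi:>3} dpi -> {w:.2f} x {h:.2f} inches")
    # The file size stays the same in every case: only this one metadata
    # number changed, not the pixels.
    ```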

  • Why the files my program create are created twice each file double ? And why sometimes the files size are too small ?

    My program uses a Queue of type Uri to build a queue of URLs, and then uses a WebBrowser to navigate to each Uri from the queue in turn, get the URL's (HTML) source content, and save it to the hard disk.
    The problem is that sometimes the text files on the hard disk are small, like 90KB or 60KB, and sometimes they are as they are supposed to be, 300KB or 200KB.
    This is a button click event where i'm calling two methods:
    private void toolStripButton3_Click(object sender, EventArgs e)
    {
        GetHtmls();
        CheckQueue();
    }
    This is the GetHtmls method code:
    private Queue<Uri> myUrls = new Queue<Uri>();
    private bool isBusy = false;

    private void GetHtmls()
    {
        for (int i = 1; i < 49; i++)
        {
            adrBarTextBox.Text = sourceUrl + i;
            targetHtmls = (combinedHtmlsDir + "\\Html" + i + ".txt");
            Uri targetUri = new Uri(sourceUrl + i);
            myUrls.Enqueue(targetUri);
        }
    }
    sourceUrl contains the website address: http://www.tapuz.co.il/forums2008/forumpage.aspx?forumid=393&pagenumber=
    I'm appending the page numbers to it to create the page URLs
    and add them to the queue.
    THen the CheckQueue method:
    Uri uri;

    private void CheckQueue()
    {
        if (isBusy)
            return; // We're downloading some page right now, don't disturb
        isBusy = true; // OK, let's get started
        if (myUrls.Count == 0) // No more pages to download, we're done
        {
            isBusy = false;
            return;
        }
        uri = myUrls.Dequeue(); // Get one URL from queue
        getCurrentBrowser().Navigate(uri);
    }
    It is supposed to navigate to each Uri (HTML address) in the queue.
    And the browser document completed event:
    private void Form1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
    {
        // If the page loaded completely then save it
        int urlnumber = uri.ToString().IndexOf("pagenumber=");
        string number = uri.ToString().Substring(urlnumber + 11);
        int num = Int32.Parse(number);
        targetHtmls = (combinedHtmlsDir + "\\Html" + num + ".txt");
        StreamWriter writer = File.CreateText(targetHtmls);
        writer.Write(getCurrentBrowser().DocumentText);
        writer.Close();
        isBusy = false; // We're done
        CheckQueue(); // Check next page in queue
    }
    In the completed event I get the page number, build the path for the text file, and then write the HTML source content to the text file.
    In the end I have 48 text files on my hard disk.
    The problems are:
    1. Sometimes it seems like it doesn't finish navigating to the current uri (or maybe it's a server-side problem) and it creates small files containing only part of the source content. Sometimes the text files are each 99KB or 70KB, and sometimes they are about 300KB and 200KB, which are the right sizes.
    2. The 48 text files on my hard disk are supposed to be different, each containing the source of one of the 48 HTML pages. But on my hard disk the 48 text files are duplicated for some reason.
    These are the files on my hard disk:
    Some of the files are 205KB 350KB 175KB and some of the files are 85KB 94KB 35KB
    Why did it not navigate some of the files to the end, or maybe not get all the source?
    And why does it make each second file the same as the one before it? It is supposed to create 48 different files, but I'm getting two identical files for each navigation.

    I solved it now.
    This is what i did:
    It's a bit of a slow process, since I'm waiting for each page to be loaded into the WebBrowser, but it does the work.
    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Drawing;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;
    using System.Windows.Forms;
    using System.Collections;
    using System.IO;
    using System.Net;
    namespace WindowsFormsApplication1
    {
        public partial class Form1 : Form
        {
            private string sourceUrl = "http://test.test";
            private string htmlsTargetDirectory = "Test Htmls";
            private string appDir = Path.GetDirectoryName(@"C:\Users\chocolade1972\AppData\Local\Test_Images\Test Images\Test Htmls");
            private string combinedHtmlsDir;
            private String targetHtmls;
            private int counter = 1;
            private StreamWriter w;
            private string uri;
            private bool htmlloaded = false;
            private List<string> myurls = new List<string>(); // was missing a declaration

            public Form1()
            {
                InitializeComponent();
                webBrowser1.ScriptErrorsSuppressed = true;
                combinedHtmlsDir = Path.Combine(appDir, htmlsTargetDirectory);
                if (!Directory.Exists(combinedHtmlsDir))
                    Directory.CreateDirectory(combinedHtmlsDir);
            }

            private void Form1_Load(object sender, EventArgs e)
            {
                GetHtmls();
                timer1.Enabled = true;
            }

            private void GetHtmls()
            {
                uri = sourceUrl + counter;
                targetHtmls = (combinedHtmlsDir + "\\Html" + counter + ".txt");
                webBrowser1.Navigate(uri);
            }

            private void webBrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
            {
                // DocumentCompleted also fires for frames; only save when the
                // completed URL is the page we actually asked for.
                if (e.Url.ToString() == uri)
                {
                    targetHtmls = (combinedHtmlsDir + "\\Html" + counter + ".txt");
                    htmlloaded = true;
                    StreamWriter writer = File.CreateText(targetHtmls);
                    writer.Write(webBrowser1.DocumentText);
                    writer.Close();
                    FileInfo fi = new FileInfo(targetHtmls);
                    var size = fi.Length;
                    w = new StreamWriter(combinedHtmlsDir + "\\test.txt", true);
                    w.WriteLine("File Size " + size);
                    w.Close();
                }
            }

            private void timer1_Tick(object sender, EventArgs e)
            {
                if (htmlloaded == true)
                {
                    // Advance to the next page before building the URL;
                    // navigating the same counter twice produced duplicate files.
                    counter += 1;
                    uri = sourceUrl + counter;
                    myurls.Add(uri);
                    webBrowser1.Navigate(uri);
                    htmlloaded = false;
                }
            }
        }
    }
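    For comparison, the same fetch-and-save loop can be sketched without a browser control at all, using Python's standard urllib. This is a hedged sketch, not the poster's code: it only retrieves the raw HTML (no JavaScript-rendered content, which may be why a WebBrowser was used), and `save_pages` is a hypothetical helper name.

    ```python
    import os
    import urllib.request

    BASE = "http://www.tapuz.co.il/forums2008/forumpage.aspx?forumid=393&pagenumber="

    def page_url(n):
        """Build the URL for page n, mirroring sourceUrl + i in the C# code."""
        return BASE + str(n)

    def save_pages(out_dir, first=1, last=48):
        """Fetch each page exactly once and write it to HtmlN.txt, one file per page."""
        os.makedirs(out_dir, exist_ok=True)
        for n in range(first, last + 1):
            with urllib.request.urlopen(page_url(n), timeout=30) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            with open(os.path.join(out_dir, f"Html{n}.txt"), "w", encoding="utf-8") as f:
                f.write(html)

    # save_pages("htmls") would fetch all 48 pages into ./htmls
    ```

    Because each page is fetched and written exactly once, the duplicate-file problem cannot occur by construction.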

  • Spooling of a query generates different file sizes for different databases

    Please help me with a problem with spooling. I spooled a query's output to a file from two different databases. In both databases the table structure is the same, and the output produced only one row. But the file size is different between the databases. How can this happen? Is there a database parameter that needs to be checked? Both databases are on the same version, 10.2.0.1.0.
    Before running the spool I did a
    sql> set head off feedback off echo off verify off linesize 10000 pages 0 trims on colsep ' '
    in both sessions.
    In one database the file size is *1463 bytes* and in the other it is *4205 bytes*.
    Please help me find the cause of this discrepancy.

    Hi Mario,
    I think you are not getting my point. Both files contain the same output, but their sizes are different. This is due to the number of blank spaces between columns. I wanted to clarify why there is a difference between the two file sizes when the query output is the same.
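    The blank-space explanation is easy to verify: the same one-row output padded to two different column widths produces very different byte counts. The sketch below is an illustration in Python, not SQL*Plus itself, and the sample row is made up; in practice the display width SQL*Plus pads each column to may differ between the two databases (e.g. different column definitions), which alone can explain 1463 vs 4205 bytes.

    ```python
    # Same single-row "query output", padded to two different column widths,
    # the way SQL*Plus pads each column out to its display width.
    row = ("SMITH", "CLERK", "7369")

    narrow = " ".join(col.ljust(10) for col in row)   # small column widths
    wide   = " ".join(col.ljust(100) for col in row)  # large column widths

    print(len(narrow), len(wide))  # identical data, very different line lengths
    ```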

  • FCP file sizes are twice the size of Quicktime's - any advantage to FCP?

    When I capture footage from my Canon HV40 MiniDV using FCP, the file sizes are almost twice the size as when I capture using Quicktime's Device Native setting. Is there any advantage to capturing with FCP, as the quality seems the same in QT? Or is there a way to reduce the file size being captured in FCP?
    I'd like to be able to capture in FCP because if there is a dropped frame in the recording, Quicktime stops recording, whereas FCP keeps going and starts a new file. But the larger file size in FCP puts me off.

    Not all codecs are editable... meaning, not all work in an editing application. I'm not sure what "device native" captures as in QT. Does it say in Get Info? But FCP captures HDV and DV natively, in a native format that it can work with. HDV comes in as HDV... DV as DV.
    I wouldn't trust what QT is capturing as Device Native. And 13GB/hour of footage is not that much. That's the lowest data rate format out there.
    Shane

  • FCE file sizes are twice the size of Quicktime's - any advantage to FCE?

    When I capture footage from my Canon HV40 MiniDV using FCE, the file sizes are almost twice the size as when I capture using Quicktime's Device Native setting. Is there any advantage to capturing with FCE, as the quality seems the same? Or is there a way to reduce the file size being captured in FCE?
    I'd like to be able to capture in FCE because if there is a dropped frame in the recording, Quicktime stops recording, whereas FCE keeps going and starts a new file. But the larger file size in FCE puts me off.

    There is no difference in the two files. They are both DV Stream (.dv).
    If you look in the Movie Properties window you'll notice that both tracks (audio and video) have the same file size.
    Obviously, this can't mean that the audio track is really as large as the video track in data size. It means that the only way QuickTime can "separate" the stream into two editable tracks is to duplicate them (but ignore the video playback of the audio track portion).
    I don't use Final Cut but I would suspect it might make two files (audio and video) instead of a single file made by QuickTime Player Pro.
    Confusing, isn't it?

  • Trying to download update to CoPilot Live and CoPilot GPS with maps.  files sizes are large and taking hours to download on wireless connection.  How can I download App updates and new maps while connected to PC and Itunes through hard wire internet link?

    Trying to download update to CoPilot Live and CoPilot GPS with maps.  Files sizes are large and taking hours to download on wireless connection.  How can I download updates and new maps while connected to PC and Itunes through hard wire internet link?

    I'm on my iPad, so I don't know if this is the page with an actual download. I don't see a button, but assume that is because I  am on an iPad. It is in the DL section of Apple downloads.
    http://support.apple.com/kb/DL1708

  • File sizes are coming about a bit big

    Okay, so I've been having this problem for a few days now, and I can't figure out how to fix it. I am rendering a 30-second clip recorded in 1080p to a QuickTime format with H.264 compression and no effects added, and the file size is coming out pretty large. I also tried the H.264 format with the same codec, and the file size is pretty much the same. Any help is appreciated. Thanks in advance.

    You say you've been using the same render settings you've always used and the same workflow you've always used, but the result is different. Clearly, something has changed. Computers aren't whimsy-driven, drama-loving, capricious machines. They only do what they are told. Now, it could be that something outside of your control changed, but it would help us all try to help you figure it out if you gave more information.
    Have you installed any software recently? Have you installed any codec packs recently (such as k-lite)? Has your computer updated itself recently (Adobe software, Windows, etc.)?
    Screenshots of your AE render settings would be useful. Are you running version 11.0.4 of AE? What OS (I'm guessing Windows, but which one)? Etc.

  • LMS3.0.1 Log files' size are not purged.

    There is an LMS 3.0.1 installation.
    Every day the log files only grow in size.
    I have a 12 GB partition for log files, but it fills up every two weeks.
    In this case I have to stop LMS, delete all log files, and start LMS manually every two weeks.
    Also, there is the following message:
    CiscoWorks Folder Utilization is 8%. Processes DFMCTMStartup, DataPurge are down.
    I tried to start these services several times, but it failed every time.
    How can I make LMS prevent the log files from growing over the recommended size limit?

    These two processes should be down in normal operation.  Do not try to start them manually.
    To rotate log files periodically, you will need to configure logrot.  Search the Common Services online help for "logrot", and you will get the instructions for configuring it.  Essentially, you will need to run NMSROOT/bin/perl NMSROOT/bin/logrot.pl -c, then walk through the menu prompts to configure the files you want to rotate.

  • Adobe Illustrator CS5 File sizes are huge!

    Try this:
    open a new Illustrator document
    create a simple box, black stroke, white fill
    save the file
    you're looking at something like 1 Mb for these 4 anchor points and the fill and stroke definition!
    now, back in Illustrator, downsave to Illustrator 3 format
    file size is in around 6Kb - much better!
    Okay, I know we can disable ICC profile embedding, PDF compatibility, etc., but none of that comes even remotely close to the "correct" file size of < 10Kb.
    C'mon Adobe, what are you putting in that CS5 file that's so darned important? This isn't a complex illustration with gradient masks, 3D transforms and thousands of points.
    I get it, everyone has terabyte drives and high-speed internet. Is that an excuse to make really inefficient file formats?

    Hmmmm......
    As Monika pointed out... deleting the default library items will make file sizes drastically smaller.
    I'd guess that version 3, since it lacked the ability to work in preview mode, didn't save file previews with the file either... that could easily account for 50kb.

  • My jpeg file sizes are much larger than they used to be.

    Just recently the file sizes of my .jpeg files have become excessively larger than they were just a week or so ago. Where I used to go almost a week before needing to archive a batch of saved images to DVD, I now have to save to DVD almost daily.
    For example, a 300x300 pixel image which should be about 32 Kb reads 383 Kb in the Finder. I'm not doing anything to these images, just dragging and dropping.
    Where I used to get 7 K files on a DVD, I'm now getting fewer than 2 K files on each disc.
    Any thoughts on what's changed and how to fix it?

    What app and JPEG compression setting are you using? Some apps can reduce file size by dropping colors, for instance...

  • Master database logical file names are different - Please Help!

    Today I noticed that on one of the servers in our environment,
    when I run
    SELECT *
    FROM sys.database_files
    I get these results:
    data_space_id name
    1 master2
    0 master_Log2
    Please note the logical names of the master database files.
    But in SSMS, if I right-click and choose Properties for the master database, it shows the logical file names as master and mastlog.
    How can this have happened? How can I fix this issue and make the names consistent for the master database?
    Please help!

    It's nothing to worry about; most likely master was restored from a backup which had the logical file name master2.
