File length different for a copied file. Or use checksum

Hi
I am making a backup of a file before doing some writes to the original.
I first check it out of source control, then make a copy using:
public static void backupFile(File f) {
    try {
        File backup = new File(f.getPath() + "_BACKUP");
        if (!backup.exists() && !backup.createNewFile()) {
            Logger.getLogger().log("Could not create " + backup.getPath());
            return;
        }
        SystemTools.copyFile(f, backup);
    } catch (Exception e) {
        Logger.getLogger().log("Error backing up " + f.getPath() + ": " + e);
    }
}
public static synchronized void copyFile(File from, File to) throws Exception {
    if (!from.exists() || !from.isFile())
        throw new Exception("copyFile Error: checking 'from' file!");
    if (!to.exists() && !to.createNewFile())
        throw new Exception("copyFile Error: creating 'to' file!");
    FileInputStream in = new FileInputStream(from);
    FileOutputStream out = new FileOutputStream(to);
    try {
        byte[] buffer = new byte[256];
        int length;
        while ((length = in.read(buffer)) != -1) {
            out.write(buffer, 0, length);
        }
    } finally {
        in.close();
        out.close();
    }
}
After writing has finished, I need to see if the file is different from the backup.
If so, I need to check it into source control.
I wanted to use a checksum, but couldn't find an example that actually worked! So I wrote a quick tool:
public static boolean isBackupIdenticalToOrig(File f) throws Exception {
    File bu = new File(f.getPath() + "_BACKUP");
    Logger.getLogger().log("Lengths: New/Backup" + f.length() + "/" + bu.length());
    if (bu.length() != f.length())
        return false;
    // Have the same lengths, so we can compare byte by byte.
    BufferedInputStream f_in = new BufferedInputStream(new FileInputStream(f));
    BufferedInputStream bu_in = new BufferedInputStream(new FileInputStream(bu));
    try {
        for (long i = 0; i < f.length(); i++) {
            int c = f_in.read();
            int d = bu_in.read();
            if (c != d) {
                Logger.getLogger().log(f.getName() + " has been modified");
                return false;
            }
        }
    } finally {
        f_in.close();
        bu_in.close();
    }
    Logger.getLogger().log(f.getName() + " has not been modified");
    return true;
}
The problem is that the File.length() method returns different values for the backup file than for the original, even when the contents appear identical!
For example:
10/15/2002 10:22:05: Lengths: New/Backup413/402
10/15/2002 10:22:06: Lengths: New/Backup397/386
10/15/2002 10:22:07: Lengths: New/Backup191/185
All the new files are longer than the backups, but the contents look exactly the same! Are there some Win32 'extras' in the file causing a problem here?
In each case, if I open the new (longer) file in a good editor, I can see that the length is correct, but no extra characters appear in the new file compared to the backup!
Any ideas would be most appreciated
Cheers
Chris

Bytes 13 and 10 are CR (carriage return) and LF (line feed); a CR/LF pair at each line end is normal for a Windows text file, and most editors hide the difference, which explains why the files differ in length but look identical. Use this copy routine; it works.
   // copyFile -  input: inFile  - path to source file
   //                    outFile - path to copy file to (including filename)
   //                    bRemoveSource - true removes the source file, false leaves it intact
   //             returns: void
   public static void copyFile(String inFile, String outFile, boolean bRemoveSource) throws IOException {
      FileInputStream fin   = null;
      FileOutputStream fout = null;
      // To preserve the date/time stamp
      File fTimeIn = new File(inFile);
      File fTimeOut = new File(outFile);
      long lTimeIn = fTimeIn.lastModified();
      try {
        fin  = new FileInputStream(inFile);
        fout = new FileOutputStream(outFile);
        copyStream(fin, fout);
      } finally {
        try {
          if (fin != null) fin.close();
        } catch (IOException e) {}
        try {
          if (fout != null) fout.close();
        } catch (IOException e) {}
      }
      // Set the copy's time to the source's time
      fTimeOut.setLastModified(lTimeIn);
      if (bRemoveSource) {
         if (fTimeIn.canWrite()) {
            fTimeIn.delete();
         }
      }
   }

   // copyStream (a helper function for copyFile) -  input: in  - stream of source file
   //                                                       out - stream of destination file
   //                                                returns: void
   // *** NOTE: This function is thread safe ***
   public static void copyStream(InputStream in, OutputStream out) throws IOException {
      // do not allow other threads to read from the
      // input or write to the output while copying is
      // taking place
      synchronized (in)  {
         synchronized (out)  {
            byte[] buffer = new byte[256];
            while (true)  {
               int bytesRead = in.read(buffer);
               if (bytesRead == -1)
                  break;
               out.write(buffer, 0, bytesRead);
            }
         }
      }
   }
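Since the original question mentioned wanting a checksum: here is a minimal sketch of a digest-based comparison using java.security.MessageDigest. The class and method names are illustrative, and MD5 is used purely for change detection, not security.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;

public class ChecksumCompare {

    // Compute an MD5 digest of a file's contents.
    static byte[] digest(String path) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        InputStream in = new FileInputStream(path);
        try {
            byte[] buffer = new byte[4096];
            int n;
            while ((n = in.read(buffer)) != -1) {
                md.update(buffer, 0, n);
            }
        } finally {
            in.close();
        }
        return md.digest();
    }

    // Two files have identical contents iff their digests match.
    // Note: files differing only in CR/LF line endings WILL differ here,
    // because the bytes really are different on disk.
    static boolean sameContents(String a, String b)
            throws IOException, NoSuchAlgorithmException {
        return Arrays.equals(digest(a), digest(b));
    }
}
```

Comparing lengths first is still a cheap shortcut, but the digest alone is sufficient to decide whether a check-in is needed.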

Similar Messages

  • Automatic file length setting for continuous recording?

    In iMovie 6 I was able to do long-duration (all-night) recordings with automatic file splits at ten minutes, so I would have 6 manageable files per hour.
    I can't find a way to set file lengths in iMovie 9.
    Is there some place in the preference files where I can set the length?
    Thank you in advance.
    Thomas

    Is there a length/size limit that GB can import?
    http://www.bulletsandbones.com/GB/GBFAQ.html#recordlength
    I made an 85-minute mp3 recording (with an Olympus LS-10) that does not successfully import
    http://www.bulletsandbones.com/GB/GBFAQ.html#PortableRecorderMp3s

  • Archiving  file in different folder in sender file adapter

    Hi,
    I have a requirement in which I have to pick 7 different files from seven different folders.
    For this I am using the advanced selection tab and defining all the other folder paths.
    But my problem is that I have to archive the different files in different folders.
    For example:
    /folder1/file1.txt needs to be archived as /archive_filder1/file1.txt
    /folder2/file2.txt needs to be archived as /archive_filder2/file2.txt
    /folder3/file3.txt needs to be archived as /archive_filder3/file3.txt
    Can anyone tell me how to define a dynamic archive folder path?
    regards,
    Navneet

    Hi Navneet
    In File Access parameters --> Target Directory
    Use this type of Directory name
    \XIDEV_STORES\%store%\sapoutbound
    here %store% will be changed accordingly
    e.g. if a file comes for store 001, %store% will be replaced by 001 and the file will be placed in the 001 folder; if a file comes for store 002, %store% will be replaced by 002 and the file will be placed in the 002 folder over FTP.
    To achieve this, go to the Advanced tab.
    For Variable Substitution, check Enable.
    Now specify the parameter values for Variable Name and Reference:
    Under Variable name specify = store
    Under Reference specify = payload:WP_PLU02,1,IDOC,1,EDI_DC40,1,RCVPRN,1
    You have to give the full path name here, i.e. where this store variable is located in the IDoc.
    Hope this example will help you
    Regards
    Dheeraj Kumar

  • File Adapter: Different lines in one file

    Hi,
    I receive a flat file where I have to do content conversion.
    In this file, I have different structures which can appear in any order.
    That is, each structure can appear in a non-contiguous way:
    File:
    struc1
    struc2
    struc3
    struc3
    struc3
    struc1
    struc1
    struc2
    struc3
    How to handle this in file adapter?
    Regards
    Chris

    Hi,
    do you mean like this:
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    How do I identify the different structures? Each structure has different fields and a different
    number of fields.
    Every struc1 to struc3 should be a single message.
    And how to specifiy line break in file adapter?
    thanks
    chris

  • XI30 File Adapter - check for (NOT) incoming file

    Hi!
    We use XI30 SPS15 and have the following challenge: every week, on a
    certain day, we expect a file from a customer. So far so good.
    But now we want to check whether such a file has really arrived within a
    given period of time, and if not, we want to take some action (e.g.
    trigger an alert or something like that).
    The business background is: if the customer does not send the file, or
    sends it too late, we cannot bill him in time, thus increasing our DSO.
    Is there any mechanism within the XI to check things like these and to
    trigger an alert? I've read the File Adapter Documentation as well as
    the BPM documentation but I could not find any hint.
    Any help would be great. Thanx!
    Regards,
    Volker kolberg

    hi volker,
    this is not supported in standard but...
    you can do it very easily with standard ABAP job scheduling
    - you schedule a job (every week, day etc.)
    which starts a raport that send a RFC (or abap proxy call to the XI)
    - then the RFC starts a BPM and inside it triggers a
    java proxy that checks for a file
    - if the proxy finds the file then if copies it to some other folder (which is monitored by another flow)
    - if it doesn't find the file it triggers an error - sends mail or anything
    the only thing you need to code is a few lines in java
    to check the existance of a file + copying nothing else:)
    this is the easiest way I believe till now <= Sp15
    Regards,
    michal
    <a href="/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions">XI FAQ - Frequently Asked Questions</a>
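    The "few lines in java" the reply mentions might look something like this sketch; the class and method names here are illustrative, not part of any XI API:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class FileCheck {

    // Returns false if the expected file is missing (the alert path);
    // otherwise copies it to the monitored folder and returns true.
    static boolean copyIfPresent(File source, File target) throws IOException {
        if (!source.exists() || !source.isFile()) {
            return false;
        }
        InputStream in = new FileInputStream(source);
        OutputStream out = new FileOutputStream(target);
        try {
            byte[] buffer = new byte[4096];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        } finally {
            in.close();
            out.close();
        }
        return true;
    }
}
```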

  • The first binary file write operation for a new file takes progressively longer.

    I have an application in which I am acquiring analog data from multiple
    PXI-6031E DAQ boards and then writing that data to FireWire hard disks
    over an extended time period (14 days).  I am using a PXI-8145RT
    controller, a PXI-8252 FireWire interface board and compatible FireWire
    hard drive enclosures.  When I start acquiring data to an empty
    hard disk, creating files on the fly as well as the actual file I/O
    operations are both very quick.  As the number of files on the
    hard drive increases, it begins to take considerably longer to complete
    the first write to a new binary file.  After the first write,
    subsequent writes of the same data size to that same file are very
    fast.  It is only the first write operation to a new file that
    takes progressively longer.  To clarify, it currently takes 1 to 2
    milliseconds to complete the first binary write of a new file when the
    hard drive is almost empty.  After writing 32, 150 MByte files,
    the first binary write to file 33 takes about 5 seconds!  This
    behavior is repeatable and continues to get worse as the number of
    files increases.  I am using the FAT32 file system, required for
    the Real-Time controller, and 80GB laptop hard drives.   The
    system works flawlessly until asked to create a new file and write the
    first set of binary data to that file.  I am forced to buffer lots
    of data from the DAQ boards while the system hangs at this point. 
    The requirements for this data acquisition system do not allow for a
    single data file so I can not simply write to one large file.  
    Any help or suggestions as to why I am seeing this behavior would be
    greatly appreciated.

    I am experiencing the same problem. Our program periodically monitors data and eventually save it for post-processing. While it's searching for suitable data, it creates one file for every channel (32 in total) and starts streaming data to these files. If it finds data is not suitable, it deletes the files and creates new ones.
    In our lab, we tested the program on Windows and then on RT and we did not find any problems.
    Unfortunately, when it was time to install the PXI in the field (an electromechanical shovel at a copper mine) and test it, we found that saving was taking too long and the program failed. Specifically when creating files (i.e. the "New File" function). It could take 5 or more seconds to create a single file.
    As you can see, the field startup failed and we will have to modify our programs to work around this problem and return next week to try again, with the additional time and cost involved. Not to mention the bad image we are giving to our customer.
    I really like LabVIEW, but I am particularly upset because of this problem. LV RT is supposed to run as if it were LV Win32, with the obvious and expected differences, but a developer cannot expect things like this to happen. I remember a few months ago I had another problem: on RT the Time/Date function gives a wrong value as your program runs, when using timed loops. Can you expect something like that when evaluating your development platform? Fortunately, we found the problem before giving the system to our customer and there was a relatively easy workaround. Unfortunately, now we had to hit the wall to find the problem.
    On this particular problem I also found that it gets worse when there are more files in the directory. Create a new dir every N hours? I really think that's not a solution. I would not expect this answer from NI.
    I would really appreciate someone from NI giving us a technical explanation of why this problem happens, and not just "trial and error" "solutions".
    By the way, we are using a PXI RT controller with the solid-state drive option.
    Thank you.
    Daniel R.
    Message Edited by Daniel_Chile on 06-29-2006 03:05 PM

  • CCMS File Monitoring - Autoreaction for every new File

    Hi Experts,
    I have a folder with XML files. I monitor these files with the sapccmsr agent; this works fine. Now I want to set up alerting for the files. If one of these files contains the pattern "error", I want to generate an alert. I have created a logfile template like this:
    LOGFILE_TEMPLATE
    DIRECTORY="E:/usr/sap/prfclog/sapccmsr/Test/"
    FILENAME="*.xml"
    MONITOR_FILESIZE_KB=1
    PATTERN_0="Error"
    VALUE_0=RED
    If the pattern "Error" exists in a file, the colour in RZ20 is changed to red. This is working...
    Now I come to my problem:
    The names of the monitored files change. For example, there is a file with the name "file_0.xml"; the next day a new file with the name "file_x.xml" exists. So, for alerting with autoreaction, it is necessary that the system applies the autoreaction to each file in the monitor automatically. Is this possible? How can I solve this problem?
    best regards
    christopher

    Hello Christopher,
    > the names of the monitored files are changing. for example there is a file with the name "file_0.xml". the next day a new file with the name "file_x.xml" exists.
    you monitor all these files. The wildcard in the parameter "FILENAME" ensures this:
    > FILENAME="*.xml"
    > so, for the alerting with autoreaction it is necessary that the system gives the autoreaction to each file in the monitor automaticly. is it possible? how can i solve this problem?
    A short look into the [online documentation|http://help.sap.com/saphelp_nw70/helpdata/EN/cb/0f8897c23b414ab6e562aa913971bc/frameset.htm] returns the parameters AUTOREACTION_LOC and AUTOREACTION_CEN.
    Maybe they will be helpful?
    Regards, Michael

  • D600 RAW files look different from D300 RAW files

    Please help me understand what's going on and more importantly, how to fix it.
    RAW files from my D300 and my D600 look completely different. These are both NEF files, imported to Lightroom 4.2 using exactly the same import preset. I did NOTHING to either file, except that the D300 one was at 1/160 second and the D600 one was at 1/200 second, so I changed exposure on the D300 one by 1/3 stop so they would match. What I did was to shoot the D300 picture, then carefully changed the lens over to the D600 without moving anything, changed the D600 to DX mode to match the field of view and shot the same picture. The files should be essentially identical.
    [b]This is from the D300:[/b]
    [b]This is from the D600:[/b]
    You will notice that the saturation on the D600 image is much higher. And it's much more yellow. The pictures were shot 2 minutes apart, from the same spot. Same lens, same camera settings.
    First thing I did was to change the color temperature and tint to match: from  4150/+6 to 4200/+2. There was no visible change (I'm looking at them in compare mode in Lightroom on a calibrated monitor).
    Next, I tried to change the D600 image to match the D300 one. I had to make substantial changes to both color balance and saturation of various colors, as well as contrast and black levels to get close. I'd give you absolute numbers but it varies from image to image. In this case, I had to drop the color temperature to 3500, for instance. The easiest way to do this was to use the white balance eyedropper on a grey area in the road.
    I'm wondering if this is related to how LR handles the D600 NEF files. LR 4.2 says the D600 algorithm is 'preliminary'. Could this be part of it?
    Anyone else here have a D600 and noticed the same thing?
    Message title was edited by: Brett N

    I'm not really good at using the right technical terms. Let me go back to basics.
    Adobe came out with Lightroom 4.2 which has (preliminary) support for the D600. I installed it. I put the memory card from the camera into the card reader and Lightroom popped up because that's what I had set for the default for uploading photos. I told it which folder to store the image in (J:photos/2012-10/2012-10-05) and where to put a second copy (D:/px/lightroom import copies). Then I told LR to "save Current Settings as New Preset..." and I named it "D600 imports". Then I clicked "Import".
    When I looked at the files afterwards in Lightroom, the sliders were all at zero except for the ones I noted before. I don't know why Lightroom chose 25 for the sharpening, but it did. Coincidentally, that's the same number it chooses for my D300 imports. I also don't know why it chose a color temp of 4200 and a tint of +4 (I was shooting on auto white balance).
    Then I plugged in the card from the D300. I decided not to touch the preset (which still said "D600 imports") and I clicked "Import". The files went to the same folders and presumably, had the same sharpening and other slider levels applied.
    Now when I look at the camera calibration in the develop module, under profile it says "Adobe Standard" on the D300 images and "Beta" on the D600 images, so Lightroom somehow knows which camera I used and sets itself accordingly.
    At no time did my fingers ever leave my hand, and I did not modify, click on or even breathe on any of the Lightroom default settings. FWIW, my programming days are 35 years behind me and I'm a total non-tekkie user now. I wouldn't know how to 'hack' the D300 files if my life depended on it.
    I know that conditions could have changed in the 98 seconds between the D600 shot and the D300 shot. Trust me when I say they didn't.
    Regarding Noise Reduction: as far as I know, there are only two user accessible settings in the cameras: "Long Exposure Noise Reduction" and "High ISO Noise Reduction". Generally I have both of those on... but since these shots were at 1/200 sec and ISO 400, they wouldn't have kicked in anyway.
    For WB, I used exactly the same spot on the road in both images. But you're right, I should use a grey target and I will next time.
    To make the two pictures match as best I could tell onscreen (I changed the D600 image to match the D300 one), I had to change the temp from 4200 to 3500, the tint from +4 to +18, the exposure to -.33, shadows to -48, blacks to -19 (these are huge changes) and it still didn't look as good as the D300 image. It was 'harder' or 'crisper'. There was no mood, no softness to it. It's like looking at a "vivid" Jpeg vs. a "Standard" Jpeg, or a cooked HDR, if that makes any sense.
    This isn't so much a complaint as a request for understanding, so that I know when I bring up an image, how to fix it. I shot a landscape and blew it up onscreen to 200%. I could see every leaf, every twig, every shadow's hard edge. That's not a bad thing, that's amazing: I just want to be able to control it.
    Glenn

  • Move file or mark for system move file - How?

    Hello,
    I finally bought Lightroom. Searching these posts I saw someone commenting about why a person would want to use LR to move a file. Seriously? Because I Want it to be a full on DAM solution. And the first mess I must de-tangle involves some years of very bad habits, both in iphoto and in my own head.
    Therefore, there is something that I don't think Lightroom can do yet, but I would love to be proven wrong. I don't think LR is much help in marking files for the system to find later outside of LR, and then moving the files. Right? Wrong?
    I don't care if it marks the file in a way that the system can later find it and move it, or, if LR moves a selected file set already in LR. Either way what I am trying to do is organize correctly and that must mean moving found sets of files to where they belong.
    It seems like overkill to purchase iView just for this one task. Qpict is a lot cheaper than iView ($35 vs $199) and can move files around. I was just wondering why LR can't yet do this. Or am I wrong? Does anyone but me think this is important? Is anyone else requesting this? (LR is missing AppleScript support on the Mac side as well.)
    Thanks
    David

    levelbest,
    If I understand your question, you've got many images in iPhoto that you want to reorganize and import into Lr. Is that the issue?
    If so, there are various ways to tackle it. One easy way is to create the new folders as you go.
    On your Desktop, make a new folder (Shift-Cmd-N). Name it "For Lr Import."
    Now open iPhoto. Scroll back to your first image in iPhoto. Let's say it's from the year 2000. Select that first image plus the photos you want to group with it. Maybe they are all shots of Niagara Falls. Export only those photos (full-size) to that Desktop folder you just made: "For Lr Import."
    Now switch to Lightroom. Go to File: Import Photos (or Shift-Cmd-I). In the window that opens, navigate to Desktop>For Lr Import. Select all the photos in that folder (click on the the first and then Cmd-A), and click Choose.
    That window will close and the Lr Import dialog will open. Your choices here will depend on what kind of files you're working with - jpeg or raw, and what you want to do with them. The options are easy to figure out. This method works faster if you choose a "move" option rather than a "copy" option because it will move the images out of the desktop folder and into whatever folder you create for them. "Copy" leaves them sitting in that Desktop folder, and you've gotta clear 'em out manually before the next export/import -- an extra step each time.
    So here's one way to go (in the Lr Import dialog):
    Under File Handling, select "Move Photos to a new location and import."
    Under Move To, navigate to Pictures. Create a new folder. Name it Photos 2000. Inside that folder, create another new folder. Name it Niagara Falls.
    Under Organize, select By Date and pick a date format.
    Don't miss taking advantage of the backup option. Check the "Backup To" box and create a similar hierarchy on a different drive. You could call the top-level folder Photo Originals.
    Now you've created a hierarchy you can continue using with each export from iPhoto. Just keep adding folders to Photos 2000 until you exhaust that year. Then create Photos 2001 the same way. All your photos will eventually be organized by year>topic>date (or however you set up the hierarchy).
    If you are converting raw to dng during import, the process is the same. You can use the backup function to preserve your raw files.
    Good luck,
    Jack

  • SID S-1-5-18 trying to copy a file - JAVA Service trying to copy files

    I have a JAVA.EXE Service, a IBM/MAXIMO application server.
    The Service is running on ServerA and the shared folder is on ServerB
    The service is trying to copy a file from the remote system (SERVERB) using a UNC path (it could use a drive letter too), and the file has to be stored on ServerA.
    But the result is: "Access Denied".
    I've tried configuring SYSTEM, NETWORK, and NETWORK SERVICE in the NTFS permissions, but nothing works. Everyone does not work either.
    PROCMON states that the local attempt is using the well-known SID S-1-5-18 (SYSTEM, see https://support.microsoft.com/KB/243330?wa=wsignin1.0)
    "FAST IO DISALLOWED",""
    "OBJECT PATH INVALID","Desired Access: Read Attributes, Dis, Options: Open For Backup, Open Reparse Point, Attributes: n/a, ShareMode: Read, Write, Delete, AllocationSize: n/a"
    The NetMon shows Access Denied too
    2293 11:08:16 23/12/2014
    1118.8269458 System
    SRV-DTC-137 BRDC1-SRV0024
    SMB2 SMB2:R  - NT Status: System - Error, Code = (34) STATUS_ACCESS_DENIED  CREATE (0x5) , File=NULL@#2292
    {SMB2:1648, SMBOverTCP:1645, TCP:1644, IPv4:71}
    2454 11:08:59 23/12/2014
    1161.8779369 BRDC1-SRV0024
    SRV-DTC-137 SMB2
    SMB2:R  - NT Status: System - Error, Code = (34) STATUS_ACCESS_DENIED  TREE CONNECT (0x3)  
    {SMBOverTCP:1708, TCP:1707, IPv4:71}

    Hi Kay,
    Glad to hear that the issue is solved!
    Thank you very much for sharing, your solution is very beneficial to other people who have similar issues.
    Please feel free to let us know if you encounter any issues in the future.
    Best Regards,
    Amy

  • Software Update File Size Different than Apple Downloads File Size

    Background:
    I am running software update and it retrieves the update for "MacBook, MacBook Pro Software update 1.1" File size is listed as 768K. When I go to the Apple Downloads page for this update, the file size is 979K.
    Download ID is 16618 and the URL is http://www.apple.com/downloads/macosx/apple/macosx_updates/macbookmacbookprosoftwareupdate11.html
    Question:
    Is this discrepancy normal based on the way that Software Update calculates file size or should I be concerned? There is no hash to confirm integrity of the file.

    File size and the volume of space it takes up are two different things. Space used will vary from computer to computer; chances are, the larger the drive, the more space a file takes up. The reason is the file system (HFS+ on Macs with Leopard). The file system divides the drive into allocation blocks of a fixed size, so even a 4 KB thumbnail still occupies at least one full block. Make sense? There is a lot of wasted space on very large drives.
    Note: Leopard is set up to support a new file system that Apple may soon adopt. It will not use the allocation method HFS+ uses, and so will save a lot of space, if adopted.
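    The rounding the reply describes can be sketched as follows; the 4 KB block size used below is an assumed example, since the real allocation unit depends on the volume:

```java
public class AllocatedSpace {

    // On-disk footprint: the file length rounded up to the next
    // multiple of the file system's allocation block size.
    static long allocated(long fileLength, long blockSize) {
        if (fileLength == 0) {
            return 0;
        }
        return ((fileLength + blockSize - 1) / blockSize) * blockSize;
    }
}
```

    So a 1-byte file on a volume with 4 KB blocks still consumes 4096 bytes of disk space, which is why reported size and space used disagree.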

  • File content conversion for Pipe delimited file

    Hi
    I have a scenario (file-XI-proxy) in which the file comes in pipe-delimited.
    My data type is like:
    DT_ XXXXX
    AwardInfo
    Header contains some fields
    DetailRecord contains some fields
    Trailer contains some fields
    What content conversion parameters do I have to use?
    venkat

    Sedamkar,
    Expecting you have one header, multiple details and one trailer then give recordset structure in sender file communication channel as:
    Header,1,DetailRecord ,*,trailer,1
    In content conversion you should give parameters:
    Header.fieldSeparator : |
    Header.endSeparator : 'nl'
    DetailRecord.fieldSeparator : |
    DetailRecord.endSeparator : 'nl'
    trailer.fieldSeparator : |
    trailer.endSeparator : 'nl'
    You may need to change the parameters according to your structure and the file layout. See this SAP help for file content conversion:
    http://help.sap.com/saphelp_nw04/helpdata/en/e3/94007075cae04f930cc4c034e411e1/content.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/content.htm
    Regards,
    ---Satish

  • DMS file opening problem for docx & xlsx files.

    Hi All,
    While opening docx and xlsx files, data is getting truncated when they are transmitted as external attachments from SRM with the ERP PO mail through DMS.
    I feel there must be some problem with DMS itself, as the doc and xls file formats open correctly since they were mapped to the workstation types DOC and XLS.
    Need your inputs to rectify the problem.
    Thanks,
    Ramakrishna

    Hi Ramakrishna,
    Kindly maintain the settings in SPRO -> Document Management -> General Data -> Define Workstation Application (DC30).
    For the XLS application define the file format as *.xlsx, and for the DOC application maintain the file format as *.docx.
    Hope this resolves your issue.
    Regards
    Bhanu

  • I am having problems converting raw files to jpeg for printing. I am using 90 quality, 300 resolution, sRGB, no sharpening and no resizing. When I print at Meijer the pics are a little grainy. Can someone please help me understand what settings to use?

    When I take my CD to Meijer to print, the pics are a little grainy, not as crisp as they look on the computer. When I convert raw to jpeg I am using 90 quality, 300 resolution, no resizing, no sharpening, sRGB. Can someone please help me understand what settings to use for printing? I want to be able to put my pics on a CD and take them to get prints.

    kann527 wrote:
    I am new to this so I am not sure what the file size is since I'm not resizing it.
    The question asked about exported image size (height and width) in pixels, not file size. Your operating system can tell you, or if you import the exported image into Lightroom, then Lightroom can tell you in the Metadata Panel, set the dropdown to EXIF.
    I am printing a 5x7. I know I have tried to print with the long edge set at 1024 and set to 300 ppi and had the same result with the graininess.
    If the exported file has a long edge of 1024, then you did resize the image; there is no other way to turn a RAW into a JPG with a long edge of 1024. In any event, 1024 pixels is probably not enough to fill 7 inches: that's really 1024/7 = 146 pixels per inch (rounded off), and could be the cause of the "graininess" that you see, although I would use the word "pixelization" to describe it. So I'm not sure we are talking about the same thing, but let's go with it for now.
    The solution depends on your answer to the original question, and probably on a clarification of how you got the image to 1024 on the long edge even though you claim you didn't resize it.
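    The arithmetic in the reply (1024 pixels over a 7-inch edge is about 146 ppi) can be sketched as:

```java
public class PrintPpi {

    // Effective pixels per inch when printing an image of the given
    // pixel dimension at the given physical size in inches.
    static long effectivePpi(long pixels, double inches) {
        return Math.round(pixels / inches);
    }
}
```

    Working backwards, a 7-inch long edge at 300 ppi needs 2100 pixels, roughly double what the exported file has.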

  • Jar file name Required for IApplicationPropertiesService class file

    Hi
       I am using EP6 SP15, Please let me know the jar file name for IApplicationPropertiesService.class
    Regards
    Ganesan S

    Hi,
    you should have it in your NWDS plugins directory. I have it here:
    <i>\Program Files\SAP\JDT\eclipse\plugins\com.sap.km.rfwizard\lib\bc.rf.global.service.appproperties_api.jar</i>
    Romano
    PS: you can try to find it yourself - not so hard: just search in the filesystem for XXXXXX.class including searching archives...
