Compress size

File-to-file scenario in PI 7.1: we are sending multiple large files from file to file, but we need to compress them. Is there any way to compress files in SAP PI 7.1?

Hi Jhon,
As far as I know, this is possible with the PayloadZipBean module, and if you use the AAE (a new concept in PI 7.1) it will also improve performance.
1. Use the AAE, because SAP PI 7.1 supports adapter-to-adapter messaging, bypassing the ABAP stack of the Integration Server altogether. This is known to improve performance.
Go through the links below:
http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/70b8adc3-728c-2a10-7fad-d43f29074ef8
Local Processing in the Advanced Adapter Engine using PI 7.1
2. Your problem is solved with the "PayloadZipBean" module.
You use this module to compress one or more payloads or to extract payloads from a compressed file, and it can be used in any adapter (a sample module configuration is sketched after the links below).
Go through the links below:
http://help.sap.com/saphelp_nw04/helpdata/en/45/da9358a1772e97e10000000a155369/frameset.htm
http://www.****************/Tutorials/XI/ZippingFiles/demo.htm
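For reference, adding the module in the channel's Module tab looks roughly like this (a sketch only; the parameter names are taken from the PayloadZipBean documentation linked above, so please verify them against your release):
Processing Sequence (insert before the standard adapter module, which stays as delivered):
   Module Name: AF_Modules/PayloadZipBean
   Module Type: Local Enterprise Bean
   Module Key:  zip
Module Configuration:
   zip   zip.mode   zip
On the decompressing side you would use zip.mode = unzip instead.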
thanks,

Similar Messages

  • Zip files over 30 megabytes quarantined with "Exceedingly compressed size" message

    I get this message when I receive a zip file over 30 megabytes.
    FILE QUARANTINED
    The original contents of this file have been replaced with
    this message because of its characteristics.
    File name: '172-16-1-4-04SEP14.ZIP'
    Malware name: 'Exceedingly compressed size'
    I have 3 questions:
    1. How do I turn off the function in Exchange 2013 that blocks ZIPs over 30 megabytes?
    2. Where is the "quarantine" located that the error message refers to?
    3. If I want to, how can I raise the value to 50 megabytes?
    Please do not post information about Forefront here; this is not a Forefront problem.
    Moses Hull of Alexant Systems

    Hi,
    Please check whether the information below helps.
    Transport rules that apply to all Mailbox servers in the organization can enforce a maximum attachment size; such rules can be created, set, and viewed using the cmdlets below (an example follows the reference link).
    Cmdlets to set: New-TransportRule, Set-TransportRule
    Cmdlets to Get: Get-TransportRule
    Parameter: AttachmentSizeOver
    Or in the EAC:
    Mail flow > Rules > Add or Edit
    Use the predicate: Apply this rule if > Any attachment > is greater than or equal to
    Use the predicate: Apply this rule if > The message > size is greater than or equal to
    Refer
    http://technet.microsoft.com/en-us/library/bb124345(v=exchg.150).aspx
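    For example, a rough sketch of such a rule created in the Exchange Management Shell (the rule name, threshold, and rejection text are placeholders, not values from this thread):
    New-TransportRule -Name "Reject attachments over 50 MB" -AttachmentSizeOver 50MB -RejectMessageReasonText "Attachment too large"
    Get-TransportRule "Reject attachments over 50 MB" | Format-List Name,AttachmentSizeOver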

  • Invalid entry compressed size

    I am writing a program to edit the web.xml and weblogic.xml files of a WAR file and create a new WAR file. Below is my code. I am getting the error: invalid entry compressed size (expected 320 but got 0 bytes).
    // Needed imports: java.io.*, java.util.jar.*, javax.xml.parsers.*,
    // javax.xml.transform.*, javax.xml.transform.dom.*, javax.xml.transform.stream.*,
    // org.w3c.dom.*, org.xml.sax.*
    public void modifyWar(String inputWar, String outputWar, String filterJar)
            throws FileNotFoundException, IOException {
        FileInputStream fis = new FileInputStream(inputWar);
        FileOutputStream fos = new FileOutputStream(outputWar);
        BufferedInputStream bis = new BufferedInputStream(fis);
        BufferedOutputStream bos = new BufferedOutputStream(fos);
        JarInputStream warsrc = new JarInputStream(bis);
        JarOutputStream wardest = new JarOutputStream(bos);
        int offset = 0;
        // Copy the war file into the new one
        for (JarEntry zipentry = warsrc.getNextJarEntry(); zipentry != null;
                zipentry = warsrc.getNextJarEntry()) {
            byte[] buf = new byte[4096 * 16];
            byte[] buf1 = new byte[4096 * 16];
            int nBytes = 0;
            System.out.println("zipentry " + zipentry.getName());
            wardest.putNextEntry(zipentry);
            String webChange = zipentry.getName();
            System.out.println("Size of " + zipentry + " is " + zipentry.getCompressedSize());
            try {
                while ((nBytes = warsrc.read(buf, offset, buf.length)) != -1) {
                    if (webChange.endsWith("web.xml") || webChange.endsWith("weblogic.xml")) {
                        ByteArrayInputStream istr = new ByteArrayInputStream(buf, offset, nBytes);
                        InputSource is = new InputSource(istr);
                        buf1 = writetowebxml(is);
                        wardest.write(buf1, offset, nBytes);
                        offset += buf1.length;
                        if (nBytes < buf1.length)
                            break;
                    } else {
                        wardest.write(buf, offset, nBytes);
                        offset += nBytes;
                        if (nBytes < buf.length)
                            break;
                    }
                }
            } catch (IndexOutOfBoundsException e) {
                wardest.close();
            }
        }
    }

    public static byte[] writetowebxml(InputSource isnew) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try {
            DocumentBuilderFactory docFactory = DocumentBuilderFactory.newInstance();
            DocumentBuilder docBuilder = docFactory.newDocumentBuilder();
            Document doc = docBuilder.parse(isnew);
            Element root = doc.getDocumentElement();
            Element childElement = doc.createElement("Number");
            root.appendChild(childElement);
            Element numberName = doc.createElement("Number-name");
            numberName.appendChild(doc.createTextNode("Counter"));
            childElement.appendChild(numberName);
            DOMSource source = new DOMSource(doc);
            Result result = new StreamResult(out);
            TransformerFactory factory = TransformerFactory.newInstance();
            Transformer transformer = factory.newTransformer();
            transformer.transform(source, result);
        } catch (SAXException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (ParserConfigurationException e) {
            e.printStackTrace();
        } catch (TransformerConfigurationException e) {
            e.printStackTrace();
        } catch (TransformerException e) {
            e.printStackTrace();
        }
        return out.toByteArray();
    }
    Thanks,
    Kevin.
    Edited by: user13461628 on Dec 12, 2010 7:06 AM

    Moderator advice: Please read the announcement(s) at the top of the forum listings and the FAQ linked from every page. They are there for a purpose.
    Then edit your post and format the code correctly.
    db
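    For what it's worth, this particular exception usually appears when the JarEntry read from the JarInputStream (which still carries the original size, compressed size and CRC) is passed straight to putNextEntry and different bytes are then written for it. A minimal sketch of the usual workaround, not the poster's code and with illustrative names, is to buffer each entry and write it under a fresh JarEntry:
    // Copy a jar, replacing selected entries; a fresh JarEntry carries no
    // pre-set size/CRC, so the "invalid entry compressed size" check cannot fire.
    for (JarEntry src = warsrc.getNextJarEntry(); src != null; src = warsrc.getNextJarEntry()) {
        ByteArrayOutputStream entryData = new ByteArrayOutputStream();
        byte[] tmp = new byte[8192];
        for (int n; (n = warsrc.read(tmp)) != -1; ) {
            entryData.write(tmp, 0, n);
        }
        byte[] data = entryData.toByteArray();
        if (src.getName().endsWith("web.xml") || src.getName().endsWith("weblogic.xml")) {
            data = writetowebxml(new InputSource(new ByteArrayInputStream(data)));
        }
        wardest.putNextEntry(new JarEntry(src.getName()));
        wardest.write(data);
        wardest.closeEntry();
    }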

  • Rendering a file at a compressed size without sacrificing quality?

    As the title suggests, I am trying to render out videos in AE in HD quality (720p at least), but a simple 25-minute video (a simple video with an audio track overlaid) is rendering out at over 500GB. I need a way to render the video in a format that will compress it to a smaller file size but won't lose the desired quality. I'm using AE with Creative Cloud on Windows 8.1. Even if the solution requires Adobe's Media Encoder or something similar, I am simply looking for a way to shrink file size while maintaining quality; any help anyone can give is greatly appreciated. Thanks in advance.

    I usually render an intermediate file (as it sounds like you did) and then use that file in the Adobe Media Encoder to create my deliverable (h.264, etc.)
    The great thing about this is that I can try different compression settings to try to dial in the quality vs. file size balance without having to re-render a complex AE composition over and over.
    However, if you want to, you can just toss your AE comp straight into AME without the intermediate step. Some people I know do that so they can keep working on the next project in AE while the AME renders the previous one in the background. This is only useful once you've figured out what settings in AME work for your workflow though.

  • Best Compression/Size setting for posting movie on Blog

    Newbie here - Thanks for the help BTW.
    I created a little movie from still pics in iMovie '08. I exported it a few different ways to my desktop in order to get it on my iWeb blog page, but many of my friends and family were not able to open the file.
    I believe it's a QT file now. What should I be doing to make this work for everybody? Is there a certain file type that is best for this situation?
    Again - Thank you for any direction you can give me.

    Welcome to the Apple Discussions. What I've done is use the File->Export for Web->Desktop option in Quicktime. Then one way to get it online and have a page load without waiting for the movie to completely load is to place the movie in your iDisk's Movie folder and add a button/link to a poster frame image of the movie via an HTML Snippet that will open the movie in a new window, one that is sized to the movie's size. This demo page is an example of such a button and has the code used in the snippet.
    If the Export for Web file is still too large, post in the QuickTime forum to see what you might find there. You can still use the button snippet to make the page load quicker.
    OT

  • Compressed and uncompressed data size

    Hi,
    I have checked the total used size from dba_segments, but I need to check the compressed and uncompressed data sizes separately.
    DB is 10.2.0.4.

    I have checked the total used size from dba_segments, but I need to check the compressed and uncompressed data sizes separately.
    DB is 10.2.0.4.
    Unless you have actually performed BULK inserts of data, NONE of your data is compressed.
    You haven't posted ANYTHING that suggests that ANY of your data might be compressed. In 10g, compression will only be performed on NEW data, and ONLY when that data is inserted using BULK INSERTS. See this white paper:
    http://www.oracle.com/technetwork/database/options/partitioning/twp-data-compression-10gr2-0505-128172.pdf
    However, data which is modified without using bulk insertion or bulk loading techniques will not be compressed.
    1. Who compressed the data?
    2. How was it compressed?
    3. Have you actually performed any BULK INSERTS of data?
    SB already gave you the answer - if data is currently 'uncompressed' it will NOT have a 'compressed size'. And if data is currently 'compressed' it will NOT have an 'uncompressed size'.
    Now our management wants to know how much of the data is compressed and how much is uncompressed.
    1. Did that 'management' approve the use of compression?
    2. Did 'management' review the tests that were performed BEFORE compression was done? Those tests would have reported the expected compression and any expected DML performance changes that compression might cause.
    The time for testing the possible benefits of compression is BEFORE you actually implement it. Shame on management if they did not do that testing already.
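    As a starting point (a sketch only; this shows the dictionary flag, not how much of the data is physically stored compressed), you can check which tables and partitions have compression enabled in the 10.2 data dictionary:
    SELECT owner, table_name, compression FROM dba_tables WHERE compression = 'ENABLED';
    SELECT table_owner, table_name, partition_name, compression FROM dba_tab_partitions WHERE compression = 'ENABLED';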

  • Document sizes and jpeg compression / quality

    In relation to Photoshop: in the status bar at the bottom it displays what it calls “document sizes”. Can this be used to determine the quality of a JPEG file?
    For example, if I open a JPEG with no compression (file size on disk is 4.57MB) it displays Doc: 34.5M/34.5M; however, if I open the same file saved with compression set at 5 (file size on disk is 748KB), the ‘document size’ doesn't change. How does the document size relate to JPEG compression?
    Thank you

    Open a file, say a tiff or psd, zoom in close to the image, about 400% on a recognizable detail. Use the Save As... command, select JPEG, and click Save.
    When you get to the dialog box, run the Quality slider to 0 and observe what happens to the pixels. It's like they are clumped together in large blocks.
    That's how it saves on disk space when it is written back to the file.
    477k/477k is the uncompressed size (the size in RAM), and 31.5k is the compressed size saved to file, achieved by clumping all those pixels together. Of course you trash the file that way, but that's where compression saves space: not by reducing the pixel count, but by consolidating pixels.
    My point is that you adjust that slider by eye and from there decide what optimal quality number is worth the space saved.
    With broadband connections and terabyte drives, I would not see any point in less than quality 12 compression these days.

  • Problems uncompressing compressed jar files

    Hi all,
    I am seeing an unusual issue where some entries don't have full header info. By this I mean that, according to the ZIP spec, the header info may be stored before the file data, so you CAN get the right info. But sometimes it can be stored as an EXT header, located some place AFTER the compressed file data.
    So most of the time we get good size/compressed size info and can properly get the entry out of the zip file. But sometimes we get -1 values and are not able to get the entry at all.
    We are working on allowing a classloader to get classes out of an "embedded" zip/jar file. In other words, a jar file can contain embedded zip/jar files (only 1 level deep), and our classloader not only finds the classes of the main jar file (using the URLClassLoader capabilities), but if a class being looked for is not found in the main jar file, we then try to find it in any embedded jar/zip files. Finding the classes is fine; trying to create .class files out of the compressed data from an embedded jar/zip is the problem.
    We are getting NegativeArraySizeExceptions thrown.
    URLConnection connection = resURL.openConnection();
    byte[] classData = new byte[connection.getContentLength()];
    InputStream input = connection.getInputStream();
    The resURL points to an embedded jar/zip within the outer jar file. The exception is thrown in the 2nd line above, when we try to create the byte[] array to hold the class data; getContentLength() returns -1.
    Any help would be appreciated. Thanks.

    ZipInputStream only reads the local file header (LFH) of an entry when getNextEntry() is called. When the size and compressed size are not in the LFH, because the sizes were unknown when the LFH was written, the sizes are set to -1 and a flag is set to indicate that the entry has an EXT header after the data which contains the sizes.
    So I think the only thing you could do at the moment is to write an extension of ZipInputStream that checks whether the flag is set and then looks for the EXT header to get the sizes. The difficult part is getting the data: ZipInputStream uses a PushbackInputStream to read the data, but that InputStream doesn't support the mark method.
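    If all you need are the bytes of each class, a common workaround is to ignore the reported sizes entirely and buffer the entry until read() returns -1; a minimal sketch (the method name is illustrative):
    // Buffer a zip entry's bytes without trusting getSize()/getContentLength().
    static byte[] readEntry(ZipInputStream zin) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        for (int n; (n = zin.read(buf)) != -1; ) {
            bos.write(buf, 0, n);
        }
        return bos.toByteArray();  // correct length even when the entry reported -1
    }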
    Good luck.

  • Save for Web: "the image exceeds the size..." issue with AI on iMac i7

    Hi,
    I just upgraded from a MacBook Pro to an iMac Intel Core i7 with 4GB. I use Illustrator CS3 for high-resolution documents, e.g. saving a design for an iPhone & iPod skin: a quality JPG that is 1300px (width) by 2000px (height).
    General information
    The issue is I keep getting the pop up with:
    The image exceeds the size Save for Web was designed for. You may experience out of memory errors and slow performance. Are you sure you want to continue?
    I click yes, but get another pop up with:
    Could not complete this operation. The rasterized image exceeded the maximum bounds for Save for Web.
    And last weekend, after NOT being able to save anything, I went and bought 16GB of memory thinking that this issue would be solved, but nada.
    I have researched around here and on Google, only to find advice about the scratch disk: I went to Preferences, then to Plug-ins and Scratch Disks, and I seem NOT to get to the bottom of this at all.
    And this morning I was trying to create a pattern design and could not even get the design into the color swatches as I have done before on the MacBook Pro with 4GB. IT'S NOT FUN, because I am NOT doing it right. Do you care to help me out?
    Thanks a whole bunch in advance
    //VfromB

    Are you including a color profile?  What metadata are you including with it?
    How many pixels (h x v)?
    50 kb is not all that big.  Does your image have a lot of detail in it?  Content can affect final compressed size.
    -Noel

  • Dbms_redefinition package and COMPRESS attribute of the target table

    Hi experts,
    we have an already partitioned and compressed table under Oracle 10g R2 which we want to move to another tablespace using online redefinition.
    The table should keep the Partitions and compressed data after the move.
    My question is: How much storage we must have in place for the move of the table ?
    Example:
    tab (compressed size) : 1000 MB
    tab (uncompressed size) : 4000 MB
    It seems to depend on how redefinition handles the move of the compressed data.
    So if redefinition uses INSERT /*+ APPEND */ ..., it should be around 1000 MB ("compress during write").
    Is this assumption correct?
    Can anybody shed some light on which kind of compression-conserving strategy redefinition uses?
    bye
    BB

    From the 11.2 admin guide:
    Create an empty interim table (in the same schema as the table to be redefined) with all of the desired logical and physical attributes. If columns are to be dropped, do not include them in the definition of the interim table. If a column is to be added, then add the column definition to the interim table. If a column is to be modified, create it in the interim table with the properties that you want. The table being redefined remains available for queries and DML during the entire process.
    Execute the FINISH_REDEF_TABLE procedure to complete the redefinition of the table. During this procedure, the original table is locked in exclusive mode for a very short time, independent of the amount of data in the original table. However, FINISH_REDEF_TABLE will wait for all pending DML to commit before completing the redefinition.
    If you did not want to create an interim table, then this approach is not going to work for you. There is no requirement for you to create anything other than the interim table, and any dependent objects can be done automatically, including materialized views. Where did you see that you have to create mview logs?
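    For illustration only (schema, table, and tablespace names are placeholders, and the calls should be checked against the 10gR2 DBMS_REDEFINITION documentation), the redefinition into a compressed interim table in the new tablespace would look roughly like this:
    -- Interim table with the desired physical attributes: target tablespace, COMPRESS,
    -- and the same partitioning scheme as the original (partition clause omitted here).
    CREATE TABLE scott.tab_interim TABLESPACE new_ts COMPRESS
      AS SELECT * FROM scott.tab WHERE 1 = 0;
    BEGIN
      DBMS_REDEFINITION.can_redef_table('SCOTT', 'TAB');
      DBMS_REDEFINITION.start_redef_table('SCOTT', 'TAB', 'TAB_INTERIM');
      -- indexes, constraints, triggers, grants: DBMS_REDEFINITION.copy_table_dependents
      DBMS_REDEFINITION.finish_redef_table('SCOTT', 'TAB', 'TAB_INTERIM');
    END;
    /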

  • Disk Utility: after creating the .DMG (compressed image), when "scan image for restore" finishes it displays "unable to scan (internal error)". Since the process was scanning for about 1 hr, is this a fake warning?

    The external 500GB drive is partitioned with 400GB as Mac OS Extended (Journaled) and a 2nd partition formatted today as ExFAT.
    Yesterday I tried to create a disk image of my HD, to no avail; this was compressed onto a FAT partition. I got a warning after running the <scan image for restore> saying "unable to scan (internal error)".
    I rang Apple support, who suggested formatting a partition as Mac OS Extended (Journaled) and choosing to save the image as "read/write". The image process stopped writing at about 200GB and then the file size jumped to 499GB (i.e. the size of the HD). But the process did nothing for about an hour, so I force quit Disk Utility (as suggested by Apple support).
    My current attempt was to format the external drive again, this time with a 400GB Mac OS Extended (Journaled) partition plus a 100GB ExFAT partition. The new image was created after about 2.5hrs with a compressed size of 149GB. However, after running the "scan image for restore" facility I got the warning message "unable to scan filename" (internal error). The process was scanning the image file according to the progress bar, which took about an hour, so the warning is a puzzle.
    If anyone can say more about this error I would really appreciate knowing tips for using disk utility. The screen grab is below.
    "unable to scan filename " (internal error)
    thanks

    Thanks for a quick reply <baltwo>. You’ve answered that it is not a fake message!
    FYI: I thought that I would use my external Buffalo drive for two purposes; namely the smaller ExFAT partition for copying stuff to my Win7 netbook, while the larger partition was to be used for my iMac image of SL. So this was my first attempt at using Disk Utility, the idea being to hold an image backup at another location. Plus I now have the Lion thumb drive ready to upgrade from SL, and so, being cautious, I wanted to have an SL image backup.
    Anyway, I followed the Apple instructions using my Mac 10.6.3 install DVD, but it would appear from other discussions that <unable to scan (internal error)> is not unique to me.
    I'll get back to Apple telephone support for further advice.
    I would, though, be grateful for an "idiots' guide" to making an image backup of my internal HD that runs SL 10.6.8,
    e.g. should I have just one partition on the external USB drive? Format it as Mac OS Extended? Create the image as compressed or read/write? Basically I'll do whatever will work!!
    thanks again

  • Photo compression for iWeb

    A question that I can't find the answer to...
    I subscribe to a training library called lynda.com which has fantastic training on iWeb.
    The one area they don't cover is this.
    If I want to create a photo page from within iPhoto and just put it on iWeb, I have photos that I have not manually compressed size-wise for web compatibility. If I then either make a slideshow for iWeb, or just use the media browser in iWeb and drag them into placeholders, will the end user find the site slow because I haven't compressed the photos for web use, or does either iPhoto or iWeb do that for me?
    In other words, do I need to take the photos into Photoshop, crop and Save for Web first, and then put them in iPhoto, or is that just not necessary?
    David Tobin

    60 pics is way too many for one page. 15 or less should be your goal.
    This site will tell you how long your page will take to load in the different internet connections. If it says over 15 seconds in 56k you have overloaded your page and should break it up.
    http://www.websiteoptimization.com/services/analyze/
    Kurt

  • Emailing Photos not at Full and Original Size? Why...

    I am trying to send photos to friends and family, but the photo quality and size get reduced by soooo much.
    An iPhone gives an option whether to send the photo at compressed size or original size.
    Why does the Windows Phone 8 Nokia 920 compress photos, and how can you turn it off?
    Uploading pictures over USB to a computer and then emailing the photos from the computer to someone is a bit of a hassle, considering phones can skip that and send photos themselves.

    I believe that the natural limit for the MMS standard (1.2 or 1.3) that most phones have in use these days is 600KB, so if you are sending it directly then it will downscale the image to the smaller size in order to comply with the data size limit. If you are using SkyDrive app on your 920, there are Settings for photo upload and photo download sizes.
    For now, there is not much that can be done except upload it to SkyDrive and send people a link to that file...which can be cumbersome I know. Reasons for compressing it may be to lessen the load on data plans that users have.
    Anyways, as this is a user to user forum only for Nokia products and services, and what you are asking about refers to core WP8 behaviour, you are always welcome to leave suggestions for them at: http://windowsphone.uservoice.com
    I don't see why it can't

  • [Solved] Compression Problem with 7z + Peazip

    I am trying to compress a file with 7-Zip, which doesn't succeed.
    It has a size of 4.4GB, and after compression it is about 3.8GB.
    If I choose to split the file to 1 DVD size, it succeeds with a *.7z.001 file, which is how I know the compressed size. But if I choose a single volume, it gives me this error:
              7-Zip [64] 9.20  Copyright (c) 1999-2010 Igor Pavlov  2010-11-18
              p7zip Version 9.20 (locale=en_US.UTF-8,Utf16=on,HugeFiles=on,4 CPUs)
              Scanning
              Creating archive /home/xxx/xxx/file.7z
              Compressing  file.iso
    System error:
    E_FAIL               
    The command is:
    /usr/lib/peazip/res/7z/7z a -t7z -m0=LZMA -mmt=on -mx5 -md=16m -mfb=32 -ms=2g -w '/home/xxx/xxx/file.7z' '/home/xxx/xxx/file.iso'
    The command that succeeds with a split file is:
    /usr/lib/peazip/res/7z/7z a -t7z -m0=LZMA -mmt=on -mx5 -md=16m -mfb=32 -ms=2g -v4699324416 -w '/home/xxx/xxx/file.7z' '/home/xxx/xxx/file.iso'
    The Ark compressor also crashes when trying to compress this file, and so does File Roller. This is why I guess the problem lies in 7z, but I don't actually know for sure, because if I use PeaZip and try to compress the file to a single-volume *.zip, it also crashes. But on the other hand, I think zip compression is also done by 7z.
    The compression ratio is not relevant; it crashes even on store.
    It doesn't crash if I use the compressor to create *.pea, which I don't want to use.
    The compression succeeds with other files, but not with all: of the 11 files I tried to compress, it succeeded with 6 and failed with 5. All files are 4.4GB before compression.
    My file system is ext4; I tried copying the files to a different HDD, also ext4, and to another one with NTFS, with no change.
    I am out of ideas about what I should do; I can only guess the problem is with 7zip somewhere.
    Last edited by alocacoc (2014-03-06 06:39:50)

    OK, when I compress with PeaZip to *.7z, it fills up the directory /tmp, which is 2GB (half of my RAM size, I guess), until it is full. But when I do a split file (producing a *.7z.001 file with a split limit of 4.4GB, which also produces only one file), it does not use /tmp.
    So maybe the problem is that for a plain *.7z it wants to cache to /tmp, while for a split file it caches to the destination directory.
    This was presumably also the reason why Ark and File Roller crashed before, since I kept the PeaZip error window open and it doesn't release its temp file in /tmp until it is closed, and the whole system reacted strangely during that time.
    Maybe I have to wait for an update to PeaZip or 7zip to see whether the problem will be fixed.
    Last edited by alocacoc (2013-11-24 22:38:51)
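    One thing that might be worth trying (hedged; this relies on 7-Zip's documented -w work-directory switch, and the path here is a placeholder): point the work directory at a disk with enough free space instead of the default temporary directory, e.g.:
    /usr/lib/peazip/res/7z/7z a -t7z -m0=LZMA -mmt=on -mx5 -md=16m -mfb=32 -ms=2g -w/home/xxx/worktmp '/home/xxx/xxx/file.7z' '/home/xxx/xxx/file.iso'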

  • Do I need to reduce the image size in pixels before uploading them to M.Me?

    Do I need to reduce the image size in pixels (actual size: JPEG, 5616 × 3744 pixels / 766KB) before uploading them from iPhoto to a MobileMe Gallery?
    Otherwise will they be too heavy, or does the upload process take care of that?
    It would be nice if iPhoto took care of optimizing the images for online display without any more work, because I have 1200 images to upload and they are JPEGs of 5616 × 3744 pixels. It's just for viewing purposes, not for download or print.
    Thanks.

    In fact, no matter whether I use a High or Medium compression JPEG, the size after upload is the same.
    It starts as a 1.5MB/5600px (high) or 780KB/5600px (medium) JPEG and ends up after upload at 115KB for a 1024px image.
    This means that iPhoto auto-compresses in order to publish to MobileMe, even if we start with a bigger file.
    I hope this info is useful to others.
    Thanks.
