Compress data through a pipe

Hi all,
I have a command that is going to spew a whole lot of output (>1 GB). Rather than writing all of that to disk and then compressing it with tar, is there a way to compress straight through a pipe?
I'd like to be able to do something like:
command | tar -cvzf -
but of course that doesn't work because tar is looking for a file name, not a stream of data.
Any ideas?
Thanks!
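
For a single stream of output, a stream compressor such as gzip or bzip2 reads stdin and writes stdout, so it can sit directly on the pipe. A minimal sketch, with "command" and the output file names as placeholders:

    # compress the command's stdout as it streams; nothing hits disk uncompressed
    command | gzip > output.gz

    # bzip2 works the same way (usually smaller output, more CPU time)
    command | bzip2 > output.bz2

Reading it back is the mirror image, e.g. gunzip -c output.gz | less.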

Thanks! I got gzip to work. Strange, I had tried it before but was doing something wrong, I guess.
But I'm still not sure about tar and streams. If tar can work with a stream, then what's wrong with any of this?
mbp:~ me$ ls -al | tar -cvz > tared8
tar: Cowardly refusing to create an empty archive
Try `tar --help' or `tar --usage' for more information.
mbp:~ me$ ls -al | tar -cvzf tared8
tar: Cowardly refusing to create an empty archive
Try `tar --help' or `tar --usage' for more information.
mbp:~ me$ ls -al | tar -cvzf tared8 -
tar: -: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
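
Those failures are expected: tar archives the files named on its command line; it does not treat data arriving on stdin as something to archive. In the first two runs no file operands are given, so tar refuses to create an empty archive; in the third, the trailing - is taken as a literal file name that tar cannot stat. With -f - tar writes the archive it creates to stdout, which is how tar is normally combined with a pipe. A sketch, assuming GNU tar (the error messages above are GNU tar's) and placeholder host/path names:

    # -f - makes tar write the archive to stdout, so it can feed a pipe
    tar -czf - somedir | ssh otherhost 'cat > somedir.tgz'

    # piping *into* tar works for a list of file names (-T -), not for raw data
    find . -name '*.log' | tar -czf logs.tgz -T -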

Similar Messages

  • Compressing data through URLConnection

    I was looking into URLConnection, trying to find a way to configure the connection (set my own sockets so that I can compress data going back and forth), similar to the way RMI handles this by letting you pass a clientSocketFactory and serverSocketFactory to UnicastRemoteObject. It seems there is no way to do that. I know I can specify a URLStreamHandlerFactory, but that does not seem to do what I am looking for.
    I am looking for a way to control the underlying communication mechanism that the connection returned by URL.openConnection() uses. That would be possible if the API provided a way to pass <mechanism>Factories to the URL. If anyone has a solution to this, please email it to me.
    Here is some code to show what I am talking about.
    //Servlet
    import java.io.*;
    import javax.servlet.*;
    import javax.servlet.http.*;

    public class DataCruncherServlet extends HttpServlet {

        public void init(ServletConfig config) throws ServletException {
            super.init(config);
        }

        public void doGet(HttpServletRequest req, HttpServletResponse res)
                throws ServletException, IOException {
            doPost(req, res);
        }

        public void doPost(HttpServletRequest req, HttpServletResponse res)
                throws ServletException, IOException {
            ServletInputStream in = req.getInputStream();
            InputStreamReader inr = new InputStreamReader(in);
            StringBuffer sb = new StringBuffer();
            char data[] = new char[1024];
            int n;
            // append only the characters actually read on each pass
            while ((n = inr.read(data)) != -1) {
                sb.append(data, 0, n);
            }
            OutputStream out = res.getOutputStream();
            out.write(sb.toString().getBytes());
            in.close();
            inr.close();
            out.close();
        }
    }
    //For the client
    import java.net.*;
    import java.io.*;

    public class DataCruncherClient {

        public static void main(String[] args) {
            URL fileURL = null;
            URLConnection con = null;
            StringBuffer buffer = null;
            OutputStream out = null;
            BufferedReader br = null;
            InputStreamReader in = null;
            try {
                fileURL = new URL("http://localhost:8000/myContext/DataCruncherServlet");
                // There is no way to control the underlying communication mechanism
                // (sockets, rmi, ...) that the connection we get uses.
                con = fileURL.openConnection();
                con.setDoOutput(true);
                con.setDoInput(true);
                out = con.getOutputStream();
                br = new BufferedReader(new FileReader("test.txt"));
                StringBuffer sb = new StringBuffer();
                String line = null;
                while ((line = br.readLine()) != null) {
                    sb.append(line);
                }
                // send the whole file once it has been read
                out.write(sb.toString().getBytes());
                in = new InputStreamReader(con.getInputStream());
                char data[] = new char[1024];
                buffer = new StringBuffer();
                int n;
                while ((n = in.read(data)) != -1) {
                    buffer.append(data, 0, n);
                }
            } catch (MalformedURLException me) {
                System.out.println("MalFormed URLException: " + me.getMessage());
            } catch (FileNotFoundException fe) {
                System.out.println("File Not Found: " + fe.getMessage());
            } catch (IOException ioex) {
                System.out.println("IOEXception: " + ioex.getMessage());
            } finally {
                try {
                    if (out != null) out.close();
                    if (br != null) br.close();
                    if (in != null) in.close();
                } catch (IOException ioex) {
                    System.out.println("can not close stream: " + ioex.getMessage());
                }
            }
            System.out.println("Returned from Servlet is: ");
            System.out.println(buffer.toString());
        }
    }
    email me :[email protected]

    s.append(char[]) and s.append(char[], int, int) are similar: each argument is converted into a String through String.valueOf(char[]) and String.valueOf(char[], int, int) respectively and then appended to the StringBuffer, so I do not see why one is more efficient than the other. Please explain.

    From the implementation of StringBuffer:
        public synchronized StringBuffer append(char str[], int offset, int len) {
            int newcount = count + len;
            if (newcount > value.length)
                expandCapacity(newcount);
            System.arraycopy(str, offset, value, count, len);
            count = newcount;
            return this;
        }
    Where do you see a conversion to String? Typically you do multiple appends and then one StringBuffer.toString(); this is different from creating a String on every append.
    Using Zip streams can fix the particular problem I outlined, but I was thinking about a way to control the underlying communication mechanism that the connection (URLConnection) depends on.

    So you wanted to hide the compression inside of URL.openStream()?
    robert

  • How to Set Basic compression attribute through Oracle ILM

    Hi,
    I logged an SR with Oracle about this, but they redirected me to this forum.
    We have configured Oracle ILM in our environment. We have a requirement that after 3 years the data be moved to low-cost storage. We tested this through Oracle ILM and it works fine for that scenario.
    We have another requirement: after the data is moved to the low-cost storage tier through Oracle ILM, we also want to compress it.
    When we set the compression attribute through Oracle ILM, it always generates a script like the one below:
    alter table test_user.range_part
    move partition year4
    tablespace part3
    compress for all operations   -- Oracle Advanced Compression
    update indexes;
    But we want something like the following through Oracle ILM:
    alter table test_user.range_part
    move partition year4
    tablespace part3
    compress   -- Basic Compression
    update indexes;
    Can you please help us set the Basic compression attribute through Oracle ILM?
    Thanks and Regards
    Ganesan Sivaraman

    Oracle support referred you here?
    Please post the SR number here or send it to me by email for follow-up (damorgan12c (at) gmail.com).
    Thank you.

  • Loading data through rule files in Essbase

    Hi everyone,
    I really need help, because something I don't understand is happening when I load data through a rule file that I created in Essbase. In the "Field Properties" of that rule file, under "Global Properties", I added a few accounts that I wanted to be replaced by others when I perform the load, since I had trouble with the hierarchy of my accounts. My problem is that there are 3 accounts I cannot add when loading my data, and I don't understand why. For example, my data extract file contains the account 601820SN60005, but since it doesn't exist in my Dimension Library I changed my rule file and added the property: Replace 601820SN60005 with 601820, so the system should put the data in that account instead. But that's not what's happening. When I perform my load, the log tells me that the member 601820SN60005 does not exist in the database, which is true, but it seems the property I added is not working. The load should work properly. Is it possible that my rule file is corrupted? Or is it something else?
    Please give me a clue of what's happening here because I really don't understand!
    Thanks a lot!

    I will just pipe in with the mostly worthless comment that doing any kind of ETL within a load rule is really not the world's best idea. Could you do your transformations in the source? There is this fantastic language called "SQL" that allows all kinds of cool data manipulations. Oracle even sells this product called ODI that I hear is just dandy for ETL work. :)
    A slightly more useful suggestion -- it's really tough to view all of the various transformations in a load rule, as you have found to your sorrow. Did you know you can print the load rule and get all of the transformations? I use it all the time when a client hasn't listened to my whine about not doing ETL in a load rule.
    Regards,
    Cameron Lackpour

  • Lowering Compression Data Rate

    I hope someone can help. I'm getting a message that reads "Warning - Dropped frames", then suggests I turn off RT Unlimited (which I've done), then "Lowering Compression Data Rate". I haven't a clue what that means or how to do it. I'd appreciate any insight.
    TAB

    Final Cut Express HD 3.0
    Old VHS footage run through a Directors Cut converter. I'm on a G4 OS ver. 10.4.6.
    Disk Description : IBM-IC35L060AVVA07-0 Total Capacity : 57.3 GB (61,492,838,400 Bytes)
    Connection Bus : ATA Write Status : Read/Write
    Connection Type : Internal S.M.A.R.T. status : Verified
    Connection ID : Device 0
    Disk Description : WDC WD2000BB-00DWA0 Total Capacity : 186.3 GB (200,049,647,616 Bytes)
    Connection Bus : ATA Write Status : Read/Write
    Connection Type : Internal
    Connection ID : Device 0

  • Compression Data Rate.

    Forum,
    I am editing some footage and receive a warning, among others, to lower the "Compression Data Rate".
    Well, I have spent two hours going through Help and cannot find how to do this.
    Would someone direct me to the area where this option is available?
    Thanks,
    Michael.

    Kevan,
    Thank you for your reply.
    I have a Mac Pro 8-core.
    The Mac HD and the drive with the video files are both 1.5 TB, 7200 rpm, with plenty of spare space, and recently defragged.
    The footage is from a book titled FCP 7 "Advanced Editing".
    I presume it is NTSC, but there is no indication that is so.
    Your reply prompted me to search further, and at last I found "data rate" in Help.
    Everything I need is there.
    I will mark this solved.
    Michael.

  • Can't load data through smart view (ad hoc analysis)

    Hi,
    There is an EPM application where I want to give planners the ability to load data through Smart View (ad hoc analysis). In Shared Services there are four options in
    EssbaseCluster-1: Administrator, Create/Delete Application, Server Access, Provisioning Manager. Only Administrator can submit data in Smart View (ad hoc analysis), but I don't want to grant Essbase Administrator to planners; I'm just interested in giving them the ability to load data through ad hoc analysis. Please suggest!

    I take it that you refreshed the Planning security; if not, refresh the security for those users (see "Managing Security Filters").
    Check in EAS whether those filters are created with "Write" permissions.
    Regards
    Celvin
    http://www.orahyplabs.com

  • I am receiving data through RS-232 in LabVIEW and have to store it into a Word file only if there is a change in the data, scanning continuously; how can I do that?

    I am receiving data through RS-232 in LabVIEW, and I have to store it into a Word or text file only if there is a change in the data, while scanning continuously. I was able to store the data into a text or Word file, but not only on change. I am getting the data from RS-232 as 0 or 1, and I have to print it only if there is a change from 0 to 1. If I use an if structure, the value gets printed every time a 0 or 1 arrives. I don't know how to write this program; please help me if anybody knows the answer.

    I have attached the VI. It receives the data from RS-232 as a string, converts it to binary, and also indicates it on LEDs; normally, if a 1 comes in, the LEDs are off. If a 0 comes in, the corresponding data status is written to the text file. But the problem is that the same data gets written many times, so I need it to write only once, when there is a transition from 1 to 0. How do I do that? I have been working on this for a few weeks; please reply if you know the answer.
    Thank you.
    Attachments:
    MOTORTESTJIG.vi ‏729 KB

  • How do I access session data through an EJB?

    Hi
    How do I access session data through an EJB?
    I am currently developing a Web service (using EJBs, JBoss.net and Apache Axis). A client making a call to this Web service expects a business object in return. My problem is that this business object is stored in a user's session data. How do I retrieve this business object from the user's session?
    I have read that this does not work with HttpSessions; is this true? If it is, is it possible to store the business object in a JavaBean, e.g.:
    <jsp:useBean id="userContextWebImpl" scope="session" class="com.ac.march.client.UserContextWebImpl">
    <%
    String key = "test";
    String value = "This is the value";
    userContextWebImpl.setValue( key, value );
    %>
    </jsp:useBean>
    and then retrieve this information through the EJB? Or is it possible to do this using stateful session beans? Or can this be done some other way?
    Please help!

    I have created a JavaBean with scope="application" to store some data. The data is stored when a user performs a specific task.
    A different person then makes a call to a Web service on the server. The Web service then asks an EJB to retrieve the data stored in the JavaBean (servlet context). In other words: how do I retrieve this data from the EJB?
    I have tried this code, but with no luck.
    (ApplicationContextWebImpl is the JavaBean)
    public static String getBookingResult( String key )
    {
         String myResult = null;
         String myKey = key;
         ApplicationContextWebImpl applicationContextWebImpl = null;
         try
         {
              applicationContextWebImpl = new ApplicationContextWebImpl();
              myResult = (String) applicationContextWebImpl.getValue( key );
         }
         catch ( java.rmi.RemoteException e )
         {
              // ignored; myResult stays null
         }
         return myResult;
    }

  • Report and data coming out wrong after compressing data with full optimization

    In SAP BPC 5.1, to increase system performance, we did a full optimization with compress data.
    The process ended with an error, and after logging into the system the report and its values come out wrong.
    What is wrong, and how do we rectify it?
    Regards
    prakash J

    This issue is resolved.

  • Importing of BPPayment Methods data through DTW

    Hi All,
        Please explain how to upload BPPaymentMethods data through DTW.
    Kind Regards
      Silpa.N

    You can use the oBPPaymentMethods template to reach your goal through DTW. There are only 3 columns needed to fill in the template.
    RecordKey is the identifier. It has to be unique.
    LineNum is (line number - 1) within that record.
    PaymentMethodCode is Incoming or Outgoing.
    Hope this info is helpful to you.
    Thanks,
    Gordon

  • How to pass delivery date through BAPI while creating a sale order

    Dear friends,
    I am using 'BAPI_SALESORDER_CREATEFROMDAT1' to create a sales order.
    I don't have any problem with that, but I have to pass the schedule line delivery date through this BAPI.
    I used REQ_DATE in structure BAPISCHDL, but I can't get it to work.
    Through which parameter can I achieve this? The sales order should be created line item by line item, along with my delivery date.
    Any suggestions?
    regards.
    siva

    Dear friend,
    Thanks for your response. I can't use DLV_DATE for this requirement, but I used REQ_DATE in the structure BAPISCHEDULE.
    I came to know that the problem I faced previously was only an internal data conversion issue.
    Now I am able to pass my delivery date, so I am closing the thread.
    Regards.
    siva

  • Standard transaction CG02 is not saving the data through F4 help.

    Hi Gurus,
    I have the standard transaction CG02, for which I need to maintain data in a few fields like Heat sensitive, Thermal energy hazard, etc., in order to update the Derived C-Alpha code. I can maintain the data in two ways: either directly or through F4 help. When I supply data through F4 help, I provide the value in the F4 help window and click the Continue button. When I press Continue, it should update the Derived C-Alpha code if all the required fields are filled; if I do not fill the required fields, it should show an error message on the status bar like 'Invalid Data Combination'. The problem is that after I click Continue on the F4 help window and the error message is displayed, if I then press F4 to change the value or fill the required fields, the transaction exits completely. The same thing happens if I press any other key, but it does not happen when I enter the value directly and save. Please let me know the reason and help me with a solution. If anything more is required, please ask and I will provide it.
    Here we are using the user exit EXIT_SAPLC107_001, which is called in FM C107_CUSTOMER_FUNCTION_CALL.
    Thanks in advance.
    Yours,
    Somu.


  • Error while Loading data through .csv file

    Hi,
    I am getting the date error below when loading data into OLAP tables through a .csv file.
    The data stored in the .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

    1) Wrong format; you won't get much support loading OLAP cubes in here, I think.
    2) Has your CSV file been anywhere near Excel by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
    *** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target column is set up with a date datatype and the source is String(19).
    The expression in Informatica is set up as below:
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak

  • SAP BPC 7.5 SP3 and Citrix Entering Data through BPC Client and Citrix

    Hello BPC experts,
    my client is using SAP BPC 7.5 SP3 in combination with Citrix. We are having problems now, while entering data through the BPC Client (installed on BPC Server) and the BPC Client (running on Citrix).
    When we enter data through the BPC Client (Server) and expand the sheet, we can only see the data that we entered through the BPC client (Server). When entering data through the BPC Client (Citrix), we can see only the data we enter through the BPC Client (Citrix).
    The database, however, saves both entries correctly. For test purposes we created a test report which shows both entries correctly in the BPC Client on the server and on Citrix.
    Does anyone have an idea what can be wrong with our system? We tried to create an EvDRE log File for this BPC version as well, but failed. Maybe one of you already did that and can help me with this.
    Thanks in advance!

    You can greatly improve your chance of receiving a helpful answer to your question if you state the version (MS or NW) of BPC which you are using.
    Also notice the sticky note "Please do not post BPC, SSM or FI/CO questions here!" at the top of this forum, where we announced new dedicated forums for BPC, which are the proper place to post your BPC questions in the future so that they reach the right audience.
    Thanks and best regards,
    Jeffrey Holdeman (http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman)
    SAP Labs, LLC
    BusinessObjects Division
    Americas Applications Regional Implementation Group (RIG)
