Compressing data through URLConnection

I have been looking into URLConnection, trying to find a way to configure the connection (set my own sockets so that I can compress data going back and forth), similar to the way RMI handles this by letting you pass a clientSocketFactory and serverSocketFactory to UnicastRemoteObject. It seems there is no way to do that. I know I can specify a URLStreamHandlerFactory, but that does not seem to do what I am looking for.
I am looking for a way to control the underlying communication mechanism that the connection returned by URL.openConnection() uses. That would be possible if the API provided a way to pass <mechanism>Factories to the URL. If anyone has a solution to this, please email it to me.
Here is some code to show what I am talking about.
//Servlet
import java.io.*;
import javax.servlet.*;
import javax.servlet.http.*;

public class DataCruncherServlet extends HttpServlet {

    public void init(ServletConfig config) throws ServletException {
        super.init(config);
    }

    public void doGet(HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException {
        doPost(req, res);
    }

    public void doPost(HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException {
        // Read the request body and echo it back to the client.
        ServletInputStream in = req.getInputStream();
        InputStreamReader inr = new InputStreamReader(in);
        StringBuffer sb = new StringBuffer();
        char[] data = new char[1024];
        int count;
        while ((count = inr.read(data)) != -1) {
            sb.append(data, 0, count); // append only the characters actually read
        }
        OutputStream out = res.getOutputStream();
        out.write(sb.toString().getBytes());
        inr.close(); // also closes the underlying ServletInputStream
        out.close();
    }
}
//For the client
import java.net.*;
import java.io.*;

public class DataCruncherClient {

    public static void main(String[] args) {
        URL fileURL = null;
        URLConnection con = null;
        StringBuffer buffer = null;
        OutputStream out = null;
        BufferedReader br = null;
        InputStreamReader in = null;
        try {
            fileURL = new URL("http://localhost:8000/myContext/DataCruncherServlet");
            // There is no way to control the underlying communication mechanism
            // (sockets, RMI, ...) that the connection we get here uses.
            con = fileURL.openConnection();
            con.setDoOutput(true);
            con.setDoInput(true);
            out = con.getOutputStream();
            br = new BufferedReader(new FileReader("test.txt"));
            StringBuffer sb = new StringBuffer();
            String line = null;
            // Read the local file and send its contents to the servlet.
            while ((line = br.readLine()) != null) {
                sb.append(line);
            }
            out.write(sb.toString().getBytes());
            // Read back what the servlet returns.
            in = new InputStreamReader(con.getInputStream());
            char[] data = new char[1024];
            int count;
            buffer = new StringBuffer();
            while ((count = in.read(data)) != -1) {
                buffer.append(data, 0, count); // append only the characters actually read
            }
        } catch (MalformedURLException me) {
            System.out.println("Malformed URL: " + me.getMessage());
        } catch (FileNotFoundException fe) {
            System.out.println("File not found: " + fe.getMessage());
        } catch (IOException ioex) {
            System.out.println("IOException: " + ioex.getMessage());
        } finally {
            try {
                if (out != null) out.close();
                if (br != null) br.close();
                if (in != null) in.close();
            } catch (IOException ioex) {
                System.out.println("Cannot close stream: " + ioex.getMessage());
            }
        }
        System.out.println("Returned from servlet is: ");
        System.out.println(buffer.toString());
    }
}
Email me: [email protected]

s.append(char[]) and s.append(char[], int, int) are similar: each one is converted into a String through String.valueOf(char[]) and String.valueOf(char[], int, int) respectively, and the result is then appended to s (the StringBuffer), so I do not see why one should be more efficient than the other. Please explain.

From the implementation of StringBuffer:

    public synchronized StringBuffer append(char str[], int offset, int len) {
        int newcount = count + len;
        if (newcount > value.length)
            expandCapacity(newcount);
        System.arraycopy(str, offset, value, count, len);
        count = newcount;
        return this;
    }

Where do you see a conversion to String? Typically you do multiple appends and then one StringBuffer.toString(); this is different from creating a String on every append.
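To make the difference concrete, here is a small sketch (not from the original thread; it reuses the test.txt file from the client example above) contrasting the two ways of appending a partially filled char buffer:

    import java.io.*;

    public class AppendDemo {
        public static void main(String[] args) throws IOException {
            Reader reader = new FileReader("test.txt"); // assumed to exist, as in the client example
            char[] data = new char[1024];
            int count;
            StringBuffer direct = new StringBuffer();
            StringBuffer viaString = new StringBuffer();
            while ((count = reader.read(data)) != -1) {
                // Copies the characters straight into the buffer's internal char array.
                direct.append(data, 0, count);
                // Allocates a throwaway String on every pass, then copies its characters.
                viaString.append(new String(data, 0, count));
            }
            reader.close();
            // Same contents either way; the second form just pays for an extra copy per append.
            System.out.println(direct.toString().equals(viaString.toString()));
        }
    }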
Using Zip streams can fix the particular problem I outlined, but I was thinking about a way to control the underlying communication mechanism that the connection (URLConnection) depends on.

So you wanted to hide the compression inside of URL.openStream()?
robert
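For reference, here is a minimal sketch (not from the original thread) of the Zip-stream approach mentioned above. It assumes the servlet on the other end wraps req.getInputStream() in a GZIPInputStream and res.getOutputStream() in a GZIPOutputStream, so both sides speak gzip:

    import java.io.*;
    import java.net.*;
    import java.util.zip.*;

    public class GzipClientSketch {
        public static void main(String[] args) throws IOException {
            URL url = new URL("http://localhost:8000/myContext/DataCruncherServlet");
            URLConnection con = url.openConnection();
            con.setDoOutput(true);
            con.setDoInput(true);

            // Compress the request body before it goes over the wire.
            OutputStream out = new GZIPOutputStream(con.getOutputStream());
            out.write("some data to crunch".getBytes());
            out.close(); // finishes the gzip stream and flushes the request

            // Decompress the response body coming back.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(new GZIPInputStream(con.getInputStream())));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }

Note that this compresses the payload in application code rather than at the socket layer, which is why it sidesteps, rather than answers, the original question about passing socket factories to the URL.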

Similar Messages

  • Compress data through a pipe

    Hi all,
    I have a command which is going to spew a whole lot of output (>1 G). Rather than writing that all to disk and then compressing it with tar, is there a way to compress straight thru a pipe?
    I'd like to be able to do something like:
    command | tar -cvzf -
    but of course that doesn't work because tar is looking for a file name, not a stream of data.
    Any ideas?
    Thanks!

    Thanks! I got gzip to work. Strange, I had tried it before but was doing something wrong I guess.
    But I'm not so sure about tar only working with streams. If that's so, then what's wrong with any of this?
    mbp:~ me$ ls -al | tar -cvz > tared8
    tar: Cowardly refusing to create an empty archive
    Try `tar --help' or `tar --usage' for more information.
    mbp:~ me$ ls -al | tar -cvzf tared8
    tar: Cowardly refusing to create an empty archive
    Try `tar --help' or `tar --usage' for more information.
    mbp:~ me$ ls -al | tar -cvzf tared8 -
    tar: -: Cannot stat: No such file or directory
    tar: Error exit delayed from previous errors

  • How to Set Basic compression attribute through Oracle ILM

    Hi,
    I have log a SR with Oracle with regard to this but they redirect to this forum.
    We have configured Oracle ILM in our environment. We have a requirement that after 3 years the data to be moved to a Low Cost Storage. We tested the same through Oracle ILM and it works fine for the above scenario.
    We have another requirement that after moved to Low Cost Storage Tier through Oracle ILM we also want to compress the data.
    When we set the compressed attribute through Oracle ILM, it always generates a script
    like the one below:
    alter table test_user.range_part
    move partition year4
    tablespace part3
    compress for all operations -----------------Oracle Advanced Compression--------------
    update indexes;
    But we want something like the below through Oracle ILM:
    alter table test_user.range_part
    move partition year4
    tablespace part3
    compress -------------------Basic Compression----------------
    update indexes;
    Can you please help us how to set the Basic compression attribute through Oracle ILM.
    Thanks and Regards
    Ganesan Sivaraman

    Oracle support referred you here?
    Please post the SR number here or send it to me by email for follow-up (damorgan12c (at) gmail.com).
    Thank you.

  • Lowering Compression Data Rate

    I hope someone can help. I am getting a message that reads "Warning - Dropped frames"; it then suggests I turn off RT Unlimited (which I've done), then says "Lowering Compression Data Rate". I haven't a clue what that means or how to do it. I'd appreciate any insight.
    TAB

    Final Cut Express HD 3.0
    Old VHS footage run through a Directors Cut converter. I'm on a G4 OS ver. 10.4.6.
    Disk Description : IBM-IC35L060AVVA07-0 Total Capacity : 57.3 GB (61,492,838,400 Bytes)
    Connection Bus : ATA Write Status : Read/Write
    Connection Type : Internal S.M.A.R.T. status : Verified
    Connection ID : Device 0
    Disk Description : WDC WD2000BB-00DWA0 Total Capacity : 186.3 GB (200,049,647,616 Bytes)
    Connection Bus : ATA Write Status : Read/Write
    Connection Type : Internal
    Connection ID : Device 0

  • Compression Data Rate.

    Forum,
    I am editing some footage and receive the warning, among others,
    to lower the "Compression Data Rate".
    Well, I have spent two hours going through "Help" and cannot find how to do this.
    Would some one direct me to the area where this option is available?
    Thanks,
    Michael.

    Kevan,
    Thank you for your reply.
    I have a MAc Pro 8 core.
    The Mac HD and drive with video files are both 1.5 TB and have a speed of 7200rpm. with plenty of spare space and recently defragged.
    The footage is from a book titled FCP 7 "Advanced Editing".
    I presume it is NTSC, but there is no indication that this is so.
    Your reply prompted me to search further, and at last I found "data rate" in "Help".
    Every thing I need is there.
    I will mark solved.
    Michael.

  • Can't load data through smart view (ad hoc analysis)

    Hi,
    There is an EPM application where I want to give planners the ability to load data through Smart View (ad hoc analysis). In Shared Services there are four options in
    EssbaseCluster-1: Administrator, Create/Delete Application, Server Access, Provisioning Manager. Only Administrator can submit data in Smart View (ad hoc analysis), but I don't want to grant Essbase Administrator to planners; I'm just interested in giving them the ability to load data through ad hoc analysis. Please suggest!

    I take it that you have refreshed the Planning security; if not, refresh the security for those users (see Managing Security Filters).
    Check in EAS whether those filters are created with "Write" permissions.
    Regards
    Celvin
    http://www.orahyplabs.com

  • I am receiving data through RS-232 in LabVIEW and have to store it to a Word file only if the data changes, while scanning continuously. How can I do that?

    I am receiving data through RS-232 in LabVIEW and I have to store the data into a Word or text file only if there is a change in the data, and I have to scan the data continuously. How can I do that? I was able to store the data into the text or Word file, but I could not make it happen only on a change. I am getting the data from RS-232 in terms of 0 or 1, and I have to print it only if there is a change in the data from 0 to 1. If I use an if-loop, the data gets printed every time a 0 or 1 is present. I don't know how to do this program; please help me if anybody knows the answer.

    I have attached the VI. Here it receives the data from RS-232 as a string, converts it to binary, and also indicates it on LEDs; normally if the data 1 comes, the LEDs will be off. If 0 comes, the corresponding data status is written into the text file. But the problem is that the same data gets printed many times, so I have to make it print only once when there is a transition from 1 to 0. How do I do that? I have been working on this for a few weeks; please reply if you know the answer.
    thanking you 
    Attachments:
    MOTORTESTJIG.vi 729 KB
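    The question is about LabVIEW, but the logic being asked for is just edge detection: remember the previous sample and write only when the value changes. A minimal sketch of that logic in Java (file name and sample values invented for illustration):

        import java.io.*;

        public class EdgeLogger {
            public static void main(String[] args) throws IOException {
                int[] samples = {0, 0, 1, 1, 1, 0, 0, 1};  // stand-in for values read from RS-232
                PrintWriter log = new PrintWriter(new FileWriter("status.txt", true));
                int previous = samples[0];
                for (int current : samples) {
                    if (current != previous) {              // write once per transition, not per sample
                        log.println("transition to " + current);
                    }
                    previous = current;
                }
                log.close();
            }
        }

    In the VI, the usual equivalent is to keep the previous value in a shift register and write to the file only in a case that fires when the current value differs from it.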

  • How do I access session data through an EJB?

    Hi
    How do I access session data through an EJB?
    I am currently developing a Web service (using EJBs, JBoss.net and Apache Axis). A client making a call to this Web service expects a business object in return. My problem is that this business object is stored in a user's session data. How do I retrieve this business object from the user's session?
    I have read that this does not work with HttpSessions; is this true? If it is, is it possible to store the business object in a JavaBean, e.g.:
    <jsp:useBean id="userContextWebImpl" scope="session" class="com.ac.march.client.UserContextWebImpl">
    <%
    String key = "test";
    String value = "This is the value";
    userContextWebImpl.setValue( key, value );
    %>
    </jsp:useBean>
    and then retrieve this information through the EJB? Or is it possible to do this using stateful JavaBeans? Or can this be done through another solution?
    Please help!

    I have created a JavaBean with scope="application" to store some data. The data is stored when a user performs a specific task.
    A different person then makes a call to a Web service on the server. The Web service then asks an EJB to retrieve the data stored in the JavaBean (servlet context). In other words: how do I retrieve this data from the EJB?
    I have tried with this code, but with no luck.
    (ApplicationContextWebImpl is the JavaBean)
    public static String getBookingResult( String key ) {
        String myResult = null;
        ApplicationContextWebImpl applicationContextWebImpl = null;
        try {
            // Note: this creates a brand-new bean instance inside the EJB,
            // not the instance the web application populated.
            applicationContextWebImpl = new ApplicationContextWebImpl();
            myResult = (String) applicationContextWebImpl.getValue( key );
        } catch ( java.rmi.RemoteException e ) {
            // error handling omitted in the original post
        }
        return myResult;
    }
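    Since the code above constructs its own ApplicationContextWebImpl, it never sees the values the web tier stored. One common workaround, sketched below with invented names (BookingFacade and the "applicationContextWebImpl" attribute key are hypothetical), is to read the value in the web tier, where the scope actually lives, and pass it to the EJB as an ordinary method argument:

        // Hypothetical sketch: the web tier owns the application scope, so it reads the
        // value itself and hands plain data to the EJB instead of having the EJB look it up.
        public class BookingHelper {
            public static String fetchBookingResult(javax.servlet.ServletContext ctx,
                                                    BookingFacade facade, String key) {
                ApplicationContextWebImpl appContext =
                        (ApplicationContextWebImpl) ctx.getAttribute("applicationContextWebImpl");
                String value = (String) appContext.getValue(key); // read in the web tier
                return facade.getBookingResult(value);            // EJB receives the data directly
            }
        }

        // Invented interface, shown only so the sketch is self-contained.
        interface BookingFacade {
            String getBookingResult(String value);
        }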

  • Report and data comming wrong after compress data with full optimization

    In SAP BPC 5.1, to increase system performance we ran a full optimization with compress data.
    The process ended with an error; after logging into the system, the reports and values are coming out wrong.
    What is wrong, and how do we rectify it?
    Regards
    prakash J

    This issue is resolved.

  • Importing of BPPayment Methods data through DTW

    HI All,
        Please explain how to upload BPPaymentMethods data through DTW?
    Kind Regards
      Silpa.N

    You can use the oBPPaymentMethods template to reach your goal through DTW. There are only 3 columns needed to fill in the template.
    RecordKey is the identifier. It has to be unique.
    LineNum is (line number - 1) within that record.
    PaymentMethodCode is Incoming or Outgoing.
    Hope this info is helpful to you.
    Thanks,
    Gordon

  • How to pass delivery date through BAPI while creating a sale order

    Dear frndz,
         I am using 'BAPI_SALESORDER_CREATEFROMDAT1'
    to create a sales order.
        I don't have any problem creating the order,
        but I have to pass the schedule line delivery date through this BAPI.
       I used REQ_DATE in structure BAPISCHDL,
       but I can't get it to work.
       Through which parameter can I achieve this?
       The sales order should be created line-item-wise along with my delivery date.
      Any suggestions?
    regards.
    siva

    Dear friend,
        Thanks for your response. I can't use DLV_DATE for this requirement,
        but I used REQ_DATE in the structure BAPISCHEDULE.
       I came to know that the problem I faced previously was only an
    internal data conversion issue.
        Now I am able to pass my delivery date,
        so I am closing the thread.
    Regards.
    siva

  • Standard transaction CG02 is not saving the data through F4 help.

    Hi Guru's,
    I have the standard transaction CG02, in which I need to maintain data for a few fields such as Heat sensitive and Thermal energy hazard in order to update the Derived C-Alpha code. I can maintain the data in two ways: either directly or through the F4 help. When I supply data through the F4 help, I provide the value in the F4 help window and click the Continue button. When I press Continue, it should update the Derived C-Alpha code if all the required fields are filled; if I do not fill the required fields, it should show an error message in the status bar such as 'Invalid Data Combination'. The problem is that when I click Continue in the F4 help window, the error message is displayed, and when I then try to change the value or fill the required fields by pressing F4, it takes me straight out of the transaction. The same thing happens if I press any key, but it does not happen when I supply the value directly and save. Please let me know the reason and help me with a solution; if anything more is required, please ask and I will provide it.
    Here we are using the user exit EXIT_SAPLC107_001, which is called in FM C107_CUSTOMER_FUNCTION_CALL.
    Thanks in advance.
    Yours,
    Somu.

  • Error while Loading data through .csv file

    Hi,
    I am getting the date error below when loading data into OLAP tables through a .csv file.
    Data stored in .csv is 20071113121100.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
    ... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
    ... nl:ERROR(u:'transformation error')).
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
    TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
    TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
    Any help is greatly appreciated.
    Thanks,
    Poojak

    1) Wrong format; you won't get much support loading OLAP cubes in here, I think.
    2) Has your CSV file been anywhere near Excel by any chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
    *** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target table column is set up as a date datatype and the source is String(19).
    Expression in Informatica is setup as below.
    IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
    Thanks,
    Poojak
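    To make the Excel explanation above concrete, here is a small sketch (not from the thread) showing how little of the original timestamp survives once the cell has been rewritten in scientific notation:

        public class DateMangleDemo {
            public static void main(String[] args) {
                String fromCsv = "2.00711E+13"; // the value as it arrives after Excel rewrites the cell
                String expanded = new java.math.BigDecimal(fromCsv).toPlainString();
                System.out.println(expanded);   // 20071100000000 -- no longer 20071113121100
                // TO_DATE(..., 'YYYYMMDDHH24MISS') fails because the literal string contains
                // 'E+13' rather than digits, and even the expanded number has lost the
                // day-of-month and time digits, so there is nothing valid left to parse.
            }
        }

    The usual fix is to keep the column formatted as text (or regenerate the CSV without opening it in Excel) so the full 14-digit value reaches the TO_DATE expression intact.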

  • SAP BPC 7.5 SP3 and Citrix Entering Data through BPC Client and Citrix

    Hello BPC experts,
    my client is using SAP BPC 7.5 SP3 in combination with Citrix. We are having problems now, while entering data through the BPC Client (installed on BPC Server) and the BPC Client (running on Citrix).
    When we enter data through the BPC Client (Server) and expand the sheet, we can only see the data that we entered through the BPC client (Server). When entering data through the BPC Client (Citrix), we can see only the data we enter through the BPC Client (Citrix).
    The database, however, saves both entries correctly. For test purposes we created a test report which shows both entries correctly in the BPC Client on the server and on Citrix.
    Does anyone have an idea what can be wrong with our system? We tried to create an EvDRE log File for this BPC version as well, but failed. Maybe one of you already did that and can help me with this.
    Thanks in advance!

    You can greatly improve your chance of receiving a helpful answer to your question if you state the version (MS or NW) of BPC which you are using.
    Also notice the sticky note "Please do not post BPC, SSM or FI/CO questions here!" at the top of this forum, whereby we announced new dedicated forums for BPC, which are the proper place to post your questions regarding BPC in the future in order to reach the right audience.
    Thanks and best regards,
    [Jeffrey Holdeman|http://wiki.sdn.sap.com/wiki/display/profile/Jeffrey+Holdeman]
    SAP Labs, LLC
    BusinessObjects Division
    Americas Applications Regional Implementation Group (RIG)

  • Error while importing Item Master data through DTW

    Hello Expert,
      I am trying to import item master data through DTW, but it gives an error while importing, as shown in the attached file.
      Please help me.
      I am using SAP 9.0 PL 6.
    Regards,
    Sandy

    Hi Sandy,
    Kindly follow the check list
    1. Right-click DTW and run it as Administrator.
    2. Is your DTW version the same as your SAP B1 version?
    3. Uninstall and re-install DTW.
    4. If you are using the 64-bit DTW, try the 32-bit one.
    5. Check the template: is it for the same DTW version?
    6. Remove all the unnecessary columns.
    7. Lastly, try a different template extension (e.g. CSV (comma delimited) or Text (tab delimited)).
    Hope you find this helpful.
    Regards,
    Syed Adnan
