Tuxedo log file data (using the -r switch to log services)

Hi,
I'm trying to convert the SDATE and EDATE parameters in the standard errors file into dates.
Does anyone out there have any ideas on how to convert the numbers in these columns into a date using SQL? This would enable me to store the data and perform data analysis in a SQL Server environment rather than MS Access.
Any suggestions would be appreciated.
Cheers,
Steve.

I've attached a Java program we use to read the SDATE and RDATE values, convert them into java.util.Date objects, and generate a .csv file for use in Excel. You should be able to modify this to put the dates into a database.
Cheers,
Anthony
\"Stephen Loughborough" <[email protected]> wrote:
>
Hi,
I'm trying to convert the SDATE and EDATE parameters in the standard errors
file
into dates.
Does anyone out there have any ideas on how to convert the numbers in these
columns
into a date using SQL. This would enable me to store data and perform data
analysis
in an SQL server environment rather than MS Access...
Any suggestions would be appreciated...
Cheers,
Steve.[svcrpt.tar.gz]
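
For reference, here is a minimal sketch of the kind of conversion the attached Java program presumably performs, assuming the raw SDATE and EDATE values are seconds since the Unix epoch (adjust the scaling if your log uses a different unit). Under the same assumption, SQL Server could do the conversion directly with something like DATEADD(second, SDATE, '1970-01-01').

    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class TuxedoDateDemo {
        // Converts a raw log timestamp to a java.util.Date, assuming the value
        // counts seconds since 1970-01-01 00:00:00 UTC.
        static Date toDate(long rawSeconds) {
            return new Date(rawSeconds * 1000L);
        }

        public static void main(String[] args) {
            long sdate = 1134555978L; // sample raw SDATE value
            long edate = 1134555990L; // sample raw EDATE value
            SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            // One CSV row, ready for Excel or a bulk load into SQL Server.
            System.out.println(fmt.format(toDate(sdate)) + "," + fmt.format(toDate(edate)));
        }
    }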

Similar Messages

  • Type of error in the log file while using call transaction mode 'E'

    Hi Gurus,
    Please answer this question urgently:
    What type of error exactly will you see in the log file while using call transaction mode 'E'?
    Thanks,
    Radha.

    Hi,
    Could you be more specific?
    In CALL TRANSACTION, no error logs are created; you have to handle the errors explicitly using the structure BDCMSGCOLL.
    Whenever you use mode 'E', if the transaction encounters any error (e.g. a data type mismatch or invalid values), it stops at that screen.
    You can handle the errors in CALL TRANSACTION as follows.
    Create an internal table using the structure BDCMSGCOLL, then:

    LOOP AT ......
      " Collect all messages raised by the transaction.
      CALL TRANSACTION 'XK01' USING I_BDCDATA MODE 'N' UPDATE 'S' MESSAGES INTO I_MESGTAB.
    ENDLOOP.

    SORT I_MESGTAB BY MSGID MSGV1 ASCENDING.
    DELETE ADJACENT DUPLICATES FROM I_MESGTAB.

    LOOP AT I_MESGTAB.
      " Convert each raw BDC message into readable text.
      CALL FUNCTION 'FORMAT_MESSAGE'
        EXPORTING
          ID   = I_MESGTAB-MSGID
          LANG = I_MESGTAB-MSGSPRA
          NO   = I_MESGTAB-MSGNR
          V1   = I_MESGTAB-MSGV1
          V2   = I_MESGTAB-MSGV2
          V3   = I_MESGTAB-MSGV3
          V4   = I_MESGTAB-MSGV4
        IMPORTING
          MSG  = MESG1.
      " Split the formatted messages into success and error tables.
      IF I_MESGTAB-MSGTYP = 'S'.
        WA_SUCCMESG-MESG = MESG1.
        APPEND WA_SUCCMESG TO I_SUCCMESG.
      ELSEIF I_MESGTAB-MSGTYP = 'E'.
        WA_ERRMESG-MESG = MESG1.
        APPEND WA_ERRMESG TO I_ERRMESG.
      ENDIF.
    ENDLOOP.
    Hope this is clear.
    Thanks and Regards.

  • How to encrypt Excel file data using the triple DES algorithm in Oracle

    Hi,
    I would like to know the process or script to encrypt/decrypt Excel file data using the triple DES algorithm in Oracle.

    I'm not quite sure about your requirement... do you mean when uploading files to be stored in the database?
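
    For what it's worth, inside the database this is normally done with Oracle's DBMS_CRYPTO package. Purely to illustrate the triple DES step itself, here is a minimal Java sketch; the file names are placeholders, the key is generated on the fly for the demo, and in practice you would prefer CBC with an IV and proper key management over ECB and a throwaway key.

    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class TripleDesDemo {
        public static void main(String[] args) throws Exception {
            // Generate a Triple DES (DESede) key just for this demo.
            KeyGenerator kg = KeyGenerator.getInstance("DESede");
            SecretKey key = kg.generateKey();

            // Read the spreadsheet bytes (placeholder file name).
            byte[] plain = Files.readAllBytes(Paths.get("data.xls"));

            // Encrypt...
            Cipher cipher = Cipher.getInstance("DESede/ECB/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, key);
            Files.write(Paths.get("data.xls.enc"), cipher.doFinal(plain));

            // ...and decrypt again to verify the round trip.
            cipher.init(Cipher.DECRYPT_MODE, key);
            byte[] decrypted = cipher.doFinal(Files.readAllBytes(Paths.get("data.xls.enc")));
            Files.write(Paths.get("data_decrypted.xls"), decrypted);
        }
    }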

  • How to store data into a log file (.doc) using CVI

    The purpose is to store the captured data (from my CVI program) in a log file (.doc), so that at the end of the program run I can retrieve the document containing the data and do my analysis.
    Why this approach: I want to build my program into an .exe file, which is why I need this method. Or do you have another method to suggest?
    Any ideas, notes, or examples to share? Please help.

    If you are explicitly trying to create a Microsoft Word document, then CVI comes with a Word Report instrument that can be used to generate such files. The instrument is located in toolslib\activex\word\wordreport.fp, while a sample program can be found in samples\activex\word\wordrpt.cws.
    I suggest you take a look at the example program, which illustrates the fundamentals of this instrument; then you can start designing your own application using it.
    Proud to use LW/CVI from 3.1 on.
    My contributions to the Developer Zone Community
    If I have helped you, why not give me a kudos?

  • CR2008  Cannot report on IIS Log file data source

    I have CR2008 (SP0) and CRXIR2 installed on a Vista desktop.   I can create reports against IIS Log files using XIR2.   When I attempt to make the data connection using CR2008, I go through the same dialog to select log files and dates but at the end it displays "no items found" and I have no table connection that I can add to the report.
    My primary source is IIS 6 log files on a server using a mapped drive.  I have also tried the same using local IIS 7 on the same Vista pc that Crystal is installed on and neither work using CR2008;  both work fine from CRXIR2.
    I also opened from CR2008 an IIS log report created in CRXIR2 and oddly the report runs from 2008 but I cannot modify the data connection as I again just receive "no items found" if I attempt to establish another connection.
    I've tried a re-install, and I've confirmed in the Crystal setup that Web Activity Logs is installed as a data source.

    Hi there,
    Try the following:
    1.  Connect to the directory on your IIS server that contains the log files. It should be C:\Windows\system32\LogFiles\W3SVC1
    (Alternatively, I would suggest copying one of these files locally to your workstation to test. The files should be in the format ex*.log.)
    2.  Open up CR2008, and do a "Create New Connection"
    3.  Then choose "More Data Sources" -> "MS IIS/Proxy Log Files"
    4.  Point to where your logfile is, whether locally or remotely on the server. 
    5.  A "Select Log Files and Dates" window should appear
    6.  Under "Enter Log File Format and Location" panel (at the top), choose "Extend (ex*.log)" format
    7.  Browse to the file that you would like to report off.

  • Log file creation using km api

    Hi,
    How do I create a log file using the KM API? Please provide sample code if any is available.
    Thanks and Regards,
    Nari.

    Thanks for your quick reply, but there is one more requirement: I am able to create a text file in KM and add content to it, but new content is appended on the same line. I want each new entry to go on a new line. Please see the code below and correct it.
     Date dt = new Date(Calendar.getInstance().getTimeInMillis());
     com.sapportals.portal.security.usermanagement.IUser iuser =
         WPUMFactory.getServiceUserFactory().getServiceUser("cmadmin_service");
     IResourceContext irCtx = new ResourceContext(iuser);
     RID docsResource = RID.getRID(filepath);
     IContent initCont = new Content(new ByteArrayInputStream("".getBytes()), "text/plain", -1, null);

     // Create the log file only if it does not exist yet.
     if (ResourceFactory.getInstance().getResource(RID.getRID(filepath + "/" + filename), irCtx) == null) {
         ICollection docsColl = (ICollection) com.sapportals.wcm.repository.ResourceFactory.getInstance().getResource(docsResource, irCtx);
         docsColl.createResource(filename, null, initCont);
     }

     String inputData = "..."; // the text to append (e.g. an exception message)
     RID sugg_html = RID.getRID(filepath + "/" + filename);
     IResource resource = com.sapportals.wcm.repository.ResourceFactory.getInstance().getResource(sugg_html, irCtx);

     // Read the ENTIRE existing content (not just the first line) so nothing is lost,
     // then append the new entry on its own line.
     IContent cont = resource.getContent();
     BufferedReader bufIn = new BufferedReader(new InputStreamReader(cont.getInputStream()));
     StringBuilder existingComments = new StringBuilder();
     String line;
     while ((line = bufIn.readLine()) != null) {
         existingComments.append(line).append("\n");
     }
     bufIn.close();
     existingComments.append(dt).append(" ").append(inputData);

     cont = new Content(new ByteArrayInputStream(existingComments.toString().getBytes()), "text/plain", -1, null);
     resource.updateContent(cont);
     cont.close();

  • Problem specifying SQL Loader Log file destination using EM

    Good evening,
    I am following the example given in the 2 Day DBA document chapter 8 section 16.
    In step 5 of 7, EM does not allow me to specify the destination of the SQL Loader log file to be on a mapped network drive.
    The question: Does SQL Loader have a limitation that I am not aware of, that prevents placing the log file on a network share or am I getting this error because of something else I am inadvertently doing wrong ?
    Note: I have placed the DDL, load file data and steps I follow in EM at the bottom of this post to facilitate reproducing the problem (drive Z is a mapped drive).
    Thank you for your help,
    John.
    DDL (generated using SQL Developer; you may want to reduce the allocated space)
    CREATE TABLE "NICK"."PURCHASE_ORDERS"
        "PO_NUMBER"      NUMBER NOT NULL ENABLE,
        "PO_DESCRIPTION" VARCHAR2(200 BYTE),
        "PO_DATE" DATE NOT NULL ENABLE,
        "PO_VENDOR" NUMBER NOT NULL ENABLE,
        "PO_DATE_RECEIVED" DATE,
        PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      SEGMENT CREATION DEFERRED PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING STORAGE
        INITIAL 67108864
      TABLESPACE "USERS" ;
    Load.dat file contents
    1, Office Equipment, 25-MAY-2006, 1201, 13-JUN-2006
    2, Computer System, 18-JUN-2006, 1201, 27-JUN-2006
    3, Travel Expense, 26-JUN-2006, 1340, 11-JUL-2006
    Steps I am carrying out in EM
    log in, select data movement -> Load Data from User Files
    Automatically generate control file
    (enter host credentials that work on your machine)
    continue
    Step 1 of 7 ->
      Data file is located on your browser machine
      "Z:\Documentation\Oracle\2DayDBA\Scripts\Load.dat"
       click next
    step 2 of 7 ->
      Table Name
      nick.purchase_orders
      click next
    step 3 of 7 ->
      click next
    step 4 of 7 ->
      click next
    step 5 of 7 ->
      Generate log file where logging information is to be stored
      Z:\Documentation\Oracle\2DayDBA\Scripts\Load.LOG
      Validation Error
      Examine and correct the following errors, then retry the operation:
      LogFile - The directory does not exist.

    Hi John,
    But I didn't get any error when I did the same as you.
    My Oracle version is 10.2.0.1 on Windows XP. Here is what I did, and it worked:
    1. I created one table in the SCOTT schema:
    SCOTT@orcl> CREATE TABLE "PURCHASE_ORDERS"
      2  (
      3      "PO_NUMBER"      NUMBER NOT NULL ENABLE,
      4      "PO_DESCRIPTION" VARCHAR2(200 BYTE),
      5      "PO_DATE" DATE NOT NULL ENABLE,
      6      "PO_VENDOR" NUMBER NOT NULL ENABLE,
      7      "PO_DATE_RECEIVED" DATE,
      8      PRIMARY KEY ("PO_NUMBER") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS NOCOMPRESS LOGGING TABLESPACE "USERS" ENABLE
      9  )
    10  TABLESPACE "USERS";
    Table created.
    I logged into EM: Maintenance --> Data Movement --> Load Data from User Files --> My Host Credentials.
    Here there are a total of 3 text boxes:
    1. Server Data File : C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\USERS01.DBF
    2. Data File is Located on Your Browser Machine : z:\load.dat <--- Here z:\ is another machine's shared docs folder; I selected this option (radio button) and created the same load.dat as you mentioned.
    3. Temporary File Location : C:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\ <--- I didn't specify anything here.
    Step 2 of 7 Table Name : scott.PURCHASE_ORDERS
    Step 3 of 7 I just clicked Next
    Step 4 of 7 I just clicked Next
    Step 5 of 7 I just clicked Next
    Step 6 of 7 I just clicked Next
    Step 7 of 7: Here are the Control File Contents:
    LOAD DATA
    APPEND
    INTO TABLE scott.PURCHASE_ORDERS
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (
      PO_NUMBER INTEGER EXTERNAL,
      PO_DESCRIPTION CHAR,
      PO_DATE DATE,
      PO_VENDOR INTEGER EXTERNAL,
      PO_DATE_RECEIVED DATE
    )
    And I just clicked on Submit Job.
    Now I got all 3 rows in purchase_orders:
    SCOTT@orcl> select count(*) from purchase_orders;
      COUNT(*)
             3
    So, there is no bug; it worked for me. Please retry, and post again if you still get an error or issue.
    HTH
    Girish Sharma

  • How to easily export/download "Debugging & Logging Log Files" data?

    In the Debugging & Logging > Log Files screen/page, is there a workaround to easily export/download the data that's listed on that screen? I do not want to download the log itself. Basically, all I want is the table that shows the File Name, Type, Size, and Last Modified information.
    Thank you,
    Charlie

    Not really, but if you copy and paste the table and stick it in MS Word or something it will retain its layout; you can then just save it, or simply take a screenshot. It all depends on what you're planning to use the export for. Why do you need that information specifically? The best approach depends on that. If it's something you want often and automatically, you can generate the information yourself with a little ColdFusion pointed at the logs directory, as sketched below.
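
    A minimal sketch of that last idea, written here in plain Java (a ColdFusion <cfdirectory action="list"> query would give the same name/size/date columns); the log directory path is an assumption, so adjust it to your installation.

    import java.io.File;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class LogFileInventory {
        public static void main(String[] args) {
            // Hypothetical ColdFusion log directory; change to match your server.
            File logDir = new File("C:/ColdFusion/cfusion/logs");
            SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

            // Emit the same columns the admin page shows, as CSV.
            System.out.println("File Name,Size (bytes),Last Modified");
            File[] files = logDir.listFiles();
            if (files != null) {
                for (File f : files) {
                    if (f.isFile()) {
                        System.out.println(f.getName() + "," + f.length() + ","
                                + fmt.format(new Date(f.lastModified())));
                    }
                }
            }
        }
    }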

  • Delay in event driven log file data writing? Please help!!!

    System Information
    Operating System: XP
    Labview: 8.2
    Force sensor data acquisition via DAQPad-6070E
    Actuator: Actuator via MCS-3D controller
    Programming Information
    Number of events: 13
    Position read: Reads the position and the force sensor data every second.
    Move I &  Forward:  Moves the actuator forward with a define step size
    Two actuators are made to travel certain distance. A force sensor is attached to the system. The aim here is to acquire continuous data as per the defined time wait (1 sec). The data is logged in a text file which gives the position travelled from the actuator, the force sensor data with a time stamp.
    The issues I am encountering is during writing a file.
    For ex: When even is activated ( Move actuator at defined stepsize) the event are logged into the log file but the positions are updated into the log file only when the next event is activated. So it means that the positions and the force values are updated into the logfile after the consecutive event is executed. If you see the logfile ex inside the attachment the red block explains the event executed but the position are updated in the next line (event). This file is just for example.
    Please help here I am going wrong!
    Thanks in advanced
    Attachments:
    EventMoveex.PNG ‏582 KB
    Logfile.PNG ‏64 KB

    Dear Method M
    I found out what was going on. As you mentioned, I was writing the values before the actuators had reached their final position, so I introduced a delay between the execution of the two SubVIs. It isn't a clean method, but it works.
    Thank you very much!
    Regards
    Itz

  • The log file behavior does not follow the logging preferences I set

    I set my log file parameters to capture a large amount of information.
    Specifically, I wanted to capture log files as big as 1GB and keep them
    for 3 sets of backups. The settings I used are as follows:
    logfile.http.maxlogfilesize 1073741824
    logfile.http.maxlogsize 4294967296
    However, after setting these values, I can see only two log files, the file
    for today and the file for yesterday.
    (See attachment)

    I've given full read and write privileges
    To whom? And as whom are you connecting?

  • OC4J log file: many java.lang.NullPointerException in log file

    I use OC4J 10.1.3 Developer Preview 4.
    I found many java.lang.NullPointerException entries in the log file.
    I don't know if this is normal or not, but it makes me a little suspicious.
    PEC Barnes

    As an example :
    <MESSAGE>
    <HEADER>
    <TSTZ_ORIGINATING>2005-12-14T11:26:19.330+01:00</TSTZ_ORIGINATING>
    <COMPONENT_ID>oracle</COMPONENT_ID>
    <MSG_TYPE TYPE="WARNING"></MSG_TYPE>
    <MSG_LEVEL>1</MSG_LEVEL>
    <HOST_ID>yale.domain.com</HOST_ID>
    <HOST_NWADDR>129.233.33.12</HOST_NWADDR>
    <THREAD_ID>11</THREAD_ID>
    <USER_ID>barnes</USER_ID>
    </HEADER>
    <CORRELATION_DATA>
    <EXEC_CONTEXT_ID><UNIQUE_ID>129.233.33.12:6613:1134555978780:1</UNIQUE_ID><SEQ>0</SEQ></EXEC_CONTEXT_ID>
    </CORRELATION_DATA>
    <PAYLOAD>
    <MSG_TEXT>Caught exception: java.lang.NullPointerException.</MSG_TEXT>
    <SUPPL_DETAIL><![CDATA[java.lang.NullPointerException
         at oracle.oc4j.admin.management.mbeans.J2EELogging.setLoggerLevel(J2EELogging.java:282)
         at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at javax.management.modelmbean.RequiredModelMBean.invoke(RequiredModelMBean.java:1079)
         at oracle.oc4j.admin.jmx.server.mbeans.model.DefaultModelMBeanImpl.invoke(DefaultModelMBeanImpl.java:604)
         at com.sun.jmx.mbeanserver.DynamicMetaDataImpl.invoke(DynamicMetaDataImpl.java:221)
         at com.sun.jmx.mbeanserver.MetaDataImpl.invoke(MetaDataImpl.java:228)
         at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:822)
         at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:792)
         at oracle.oc4j.admin.jmx.ejb.MBeanServerEjbBean.invoke(MBeanServerEjbBean.java:343)
         at oracle.oc4j.admin.jmx.ejb.MBeanServerEjbBean.invoke(MBeanServerEjbBean.java:310)
         at sun.reflect.GeneratedMethodAccessor10.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.evermind.server.ejb.interceptor.EJBJoinPointImpl.invoke(EJBJoinPointImpl.java:39)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:45)
         at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:62)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:43)
         at com.evermind.server.ejb.interceptor.system.JAASInterceptor$1.run(JAASInterceptor.java:32)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:379)
         at com.evermind.server.ThreadState.runAs(ThreadState.java:637)
         at com.evermind.server.ejb.interceptor.system.JAASInterceptor.invoke(JAASInterceptor.java:36)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:43)
         at com.evermind.server.ejb.interceptor.system.TxSupportsInterceptor.invoke(TxSupportsInterceptor.java:37)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:43)
         at com.evermind.server.ejb.interceptor.system.SecurityRoleInterceptor.invoke(SecurityRoleInterceptor.java:46)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:43)
         at com.evermind.server.ejb.interceptor.system.DMSInterceptor.invoke(DMSInterceptor.java:62)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:43)
         at com.evermind.server.ejb.interceptor.system.RunningStateInterceptor.invoke(RunningStateInterceptor.java:28)
         at com.evermind.server.ejb.interceptor.InvocationContextImpl.proceed(InvocationContextImpl.java:43)
         at com.evermind.server.ejb.StatefulSessionEJBObject.OC4J_invokeMethod(StatefulSessionEJBObject.java:840)
         at MBeanServerEjbRemote_StatefulSessionBeanWrapper0.invoke(MBeanServerEjbRemote_StatefulSessionBeanWrapper0.java:50)
         at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.evermind.server.rmi.ServerRmiMessageHandler.doMethodCall(ServerRmiMessageHandler.java:560)
         at com.evermind.server.rmi.ServerRmiMessageHandler.handleMethodInvocation(ServerRmiMessageHandler.java:471)
         at com.evermind.server.rmi.ServerRmiMessageHandler.handleOrmiRequest(ServerRmiMessageHandler.java:262)
         at com.evermind.server.rmi.ServerRmiMessageHandler.dispatchRequest(ServerRmiMessageHandler.java:231)
         at com.evermind.server.rmi.RMIServerConnection.processReceivedCommand(RMIServerConnection.java:155)
         at com.evermind.server.rmi.RMIConnection.handleCommand(RMIConnection.java:151)
         at com.evermind.server.rmi.RMIConnection.listenForOrmiCommands(RMIConnection.java:126)
         at com.evermind.server.rmi.RMIConnection.run(RMIConnection.java:105)
         at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:298)
         at java.lang.Thread.run(Thread.java:534)
    ]]></SUPPL_DETAIL>
    </PAYLOAD>
    </MESSAGE>

  • Sending table data using RFC as a web service

    Hi guys,
    I want to send Z-table data from a CRM system using RFC as a web service, but I want to read records only for a particular date and time. How can I go about this scenario?
    Regards, Keith.

    Hi Keith,
    for that scenario you need two RFC adapters. Of course you can also use proxies, as discussed before. Forget about web services.
    You said above that you have a Z-table? Add date and time fields, and whenever a record is inserted, fill those fields with sy-datum / sy-uzeit.
    Your CRM select can then easily pick up only the current data.
    For performance reasons I would recommend an asynchronous scenario, for example:
    An ABAP program selects the data and calls a module like
    CALL FUNCTION 'myFunction'
      DESTINATION 'mySM59'
      IN BACKGROUND TASK.
    This function module is built somewhere, imported into the Repository, and works as the outbound interface. The inbound interface is a second function module, which is imported from R/3. The ABAP source code of that module puts the data into the SAP system.
    The CRM ABAP program is then called periodically by a job.
    But I don't know your requirements. Unfortunately you are one of those people who ask without giving details. I'm not a prophet...
    Regards,
    Udo

  • Loading flat file data using FDMEE with a 1-to-many mapping

    Hi All,
    I need to load data from a flat file into a Hyperion Planning application using FDMEE, with a one-to-many mapping.
    For example, the data file has 2 records:
    Acc Actual Version1 Scene1 1000
    Acc Actual Version1 Scene2 2000
    The target application has 5 dimensions, and the data needs to be loaded as:
    Acc Actual Version1 Entity1 Prod2 1000
    Acc Actual Version1 Entity2 Prod2 2000
    Please suggest
    Regards
    Anubhav

    From your example I don't see the one-to-many mapping requirement. You have one source data line that maps to a single target intersection. Where is the one-to-many mapping requirement in your example?

  • Persisting raw file data using XMLEncoder

    Guys
    I am trying to persist a 'Buffer' object which has a 'fileName' property. I don't want my object to be dependent on the machine containing the file 'fileName'. So I want to persist the data of the 'fileName' file itself when XMLEncoder is used to persist the 'Buffer' object, and to do that I extended DefaultPersistenceDelegate. However, I get an OutOfMemoryError inside the initialize method. It seems that even though I am setting the byTmp array to null after out.writeStatement, the array is not garbage collected; apparently the encoder is holding a reference to it. Also note that the file is huge and cannot be loaded completely into a single byte array.
    Can anyone think of a work around for this problem?
    Thanks
    Rahul
    import java.beans.DefaultPersistenceDelegate;
    import java.beans.Encoder;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;

    public class Buffer {
        private String fileName;

        public String getFileName() { return fileName; }
        public void setFileName(String fileName) { this.fileName = fileName; }

        // These methods will be invoked by the XMLDecoder when the object is read back.
        public void startWritingBytes(String fileName) {
            // open tmp file for writing
        }

        public void writeBytes(byte[] b) {
            // write chunk to tmp file
        }

        public void doneWritingBytes() {
            // close tmp file
            // uncompress tmp file and write to file 'fileName'
        }

        public static class BufferPersistenceDelegate extends DefaultPersistenceDelegate {

            protected void initialize(Class<?> type, Object oldInstance,
                                      Object newInstance, Encoder out) {
                // Note: the "fileName" property will be set here by the default behaviour.
                super.initialize(type, oldInstance, newInstance, out);

                Buffer buf = (Buffer) oldInstance;
                FileInputStream fin = null;
                try {
                    // First compress the file (step elided in this post); 'compressed'
                    // stands for the temporary file produced by that step.
                    File compressed = null;
                    // Persist the compressed data as a sequence of writeBytes() calls.
                    fin = new FileInputStream(compressed);

                    out.writeStatement(new java.beans.Statement(
                            oldInstance, "startWritingBytes", new Object[] { buf.getFileName() }));

                    byte[] buffer = new byte[64 * 1024];
                    int bytesRead;
                    // Get OutOfMemoryError here
                    while ((bytesRead = fin.read(buffer)) > 0) {
                        byte[] byTmp = new byte[bytesRead];
                        System.arraycopy(buffer, 0, byTmp, 0, bytesRead);
                        out.writeStatement(new java.beans.Statement(
                                oldInstance, "writeBytes", new Object[] { byTmp }));
                    }

                    out.writeStatement(new java.beans.Statement(
                            oldInstance, "doneWritingBytes", null));
                } catch (Throwable ex) {
                    // handle/log the failure
                } finally {
                    // close and then delete the compressed temp file
                    if (fin != null) {
                        try { fin.close(); } catch (IOException ignored) { }
                    }
                }
            }
        }
    }

    Sorry guys, I think I had the answer to the question in the question itself!!
    I was not freeing the statement object. The problem disappears when I set sm to null.
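
    For anyone reading along, the delegate above gets registered on the encoder roughly like this (a minimal usage sketch; the file names are placeholders):

    import java.beans.XMLEncoder;
    import java.io.BufferedOutputStream;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class BufferEncodeDemo {
        public static void main(String[] args) throws IOException {
            Buffer buf = new Buffer();
            buf.setFileName("payload.bin"); // placeholder file name

            XMLEncoder enc = new XMLEncoder(
                    new BufferedOutputStream(new FileOutputStream("buffer.xml")));
            // Tell the encoder to use the custom delegate for Buffer instances.
            enc.setPersistenceDelegate(Buffer.class, new Buffer.BufferPersistenceDelegate());
            enc.writeObject(buf);
            enc.close(); // close() flushes the queued statements out to the XML file
        }
    }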

  • InfoPath 2013 Read SharePoint 2013 File data using Rest API Access Denied Exception

    I am designing a set of Forms and they need to query Data from among themselves.
    The whole set up described below works in the Form Filler/Preview
    I'll call them Form A and Form B
    Form A has a repeating table that needs to be displayed in Form B
    The user selects an instance of Form A from a dropdown in Form B; using the selected ID, a REST connection is executed so that the Form A XML is available inside Form B. The connection is set up as follows:
    _api/web/lists/ListName/Items(SelectedId)/File/$value
    I publish the form as site content type, add it to a library, after triggering the REST connection I get an error. ULS gives me a 401 Access denied for NT Authority\IUSR (as it should since I don't have anonymous access enabled [nor has that solved the issue])
    That's my issue. All requests on the REST api are being executed as anonymous and not as a user that should have permission.
    Things I've tried:
    1. The connection uses a UDCX file, and the connection is set to use the Form Server proxy. The proxy has been enabled for Forms Services, the web application, and the user connection. I've tried it with a configured App ID and with an explicit account.
    2. I've tried enabling Anonymous access, but have had no success
    3. I've gotten the Query to work on Post Backs by adding the following to the web.config:
    <location path="_layouts/15/Postback.FormServer.aspx">
        <system.web>
          <identity impersonate="false" userName="bhs\sp_admin_dev" password="M1crosoft" />
        </system.web>
      </location>
    And while it solves the issue for Postback requests and I could add FormServer.aspx to the list I can't use this solution for a production environment, nor can I predict other issues that could be caused by the change.
    I haven't been able to find any references to this error so I wonder if I'm doing something wrong or if there's another way to do this.
    If I've been unclear on anything, let me know and I'll try to clear it up.

    Hi Choggo,
    Thank you for the information.
    Regarding this issue, it seems we may need to debug and trace your network to check whether the parameter used for the REST connection is correct.
    I checked with the InfoPath team members regarding this issue; they suggest that you try impersonation, so that the user that logs in is not anonymous but the user you have already assigned.
    The last suggestion from our SharePoint team members, given the limited tools we have on this forum, is to check the UDCX file itself and verify that the permissions to access that file are correct. For example, if the file does not have read/access permission, the system may fall back to the anonymous account, which would explain why the data that should be passed cannot be accessed.
    If these suggestions are not applicable to your environment, our SharePoint team members suggest that you open an incident ticket so that we can check and confirm in more depth whether this is an undocumented feature or not. The action plan would be a remote session where we can trace the data-passing process and confirm that the authentication is correct, so that IUSR no longer appears when it authenticates.
    http://support.microsoft.com/contactus/?wa=wsignin1.0
    Regards,
    Aries
    Microsoft Online Community Support
    Please remember to click "Mark as Answer" on the post that helps you, and to click "Unmark as Answer" if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
