Timestamps overwritten, Error -1074115594

Hey everyone,
I'm back with another enigmatic error. My 5152 is currently set to collect and store a large number of waveforms, then fetch them only when enough have been collected to fill a large 2D array in LabVIEW.
At the moment I am filling a 750x220 array (165,000 entries). After the first fetch, I receive Error -1074115594, which reads along the lines of "Timestamps have been overwritten, data cannot be read".
In talking with NI support, the cause stems from the digitizer: the number of timestamps (one is collected for every acquisition) is limited to 100,000. After 100k entries, additional timestamps overwrite the old ones. This is not a common error, though, and they had little additional information.
I'm not sure if this is a limitation of the 515x series or my 5152, but the hard 100k limit makes it seem that the timestamps are stored separately from the waveforms. I was hoping that someone could help shed light on this issue:
1.  Are the acquisition timestamps stored separately from the waveforms?
2. Where/how does this occur?
3.  Is there any work-around (property nodes, disabling timestamps, etc.)?
The stored waveforms are only 133 bytes each, so 133 bytes x 165,000 waveforms is roughly 21 MB.  My scope has 128 MB of onboard memory, so I should not be in any danger of overflowing my onboard acquisition memory.
What makes this frustrating is that I have successfully collected this volume of data before.  However, I have since started to "clean up" my code, and...we know how that goes.  Any ideas how I was able to store and fetch this much before?
I'm hoping someone can chime in and offer some more insight as to the root of the issue and whether there is anything I can do to get around this.
Many thanks,
jm

Hi JM,
The timestamp data is in fact buffered separately from the records, and 100k is the limit on the number of "unfetched" records that can reside on your digitizer, even when you have sufficient memory because the records are small.  This is documented in the "Waveform Specification" information in your NI 5152 Device Specifications.
The good news is that the error is very easy to work around.  In the case of 165k records, you could, for example, make two fetch calls, each requesting 82,500 records.  The properties of interest are "number of records to fetch" and "fetch record number": set the first to 82,500, and increment the second after each fetch (i.e. it starts at 0, then moves to 82,500).
This should avoid the timestamps overwritten error you are seeing.  If not, please share some information on the trigger rate you are testing with.
- Jennifer O.
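For anyone driving the fetch programmatically rather than wiring it in LabVIEW, the chunking Jennifer describes is just an offset loop; a minimal sketch in Java (fetchRecords is a hypothetical placeholder for whatever fetch call your NI-SCOPE API exposes, and the constants come from the numbers discussed above):

// Hedged sketch of fetching a large record count in chunks small enough to stay
// under the digitizer's unfetched-record limit. fetchRecords() is a hypothetical
// stand-in for the real driver call; only the offset arithmetic is the point here.
public class ChunkedFetch {
    static final int TOTAL_RECORDS = 165_000;  // records stored on the digitizer
    static final int MAX_PER_FETCH = 82_500;   // per-fetch chunk, well under the 100k limit

    public static void main(String[] args) {
        int fetched = 0;
        while (fetched < TOTAL_RECORDS) {
            int count = Math.min(MAX_PER_FETCH, TOTAL_RECORDS - fetched);
            // "fetch record number" = fetched, "number of records to fetch" = count
            fetchRecords(fetched, count);
            fetched += count;
        }
    }

    // Hypothetical placeholder: a real program would call the driver's fetch API here.
    static void fetchRecords(int firstRecord, int numRecords) {
        System.out.printf("fetching records %d..%d%n", firstRecord, firstRecord + numRecords - 1);
    }
}

The same loop works for any chunk size below the 100k unfetched-record limit.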

Similar Messages

  • Ejb datacontrol, query panel with timestamps / date field errors

    Hi,
     I made an EJB data control on a session bean in JDev 11g PS1 and used the named criteria of this entity in the data control to create an af:query panel. This works well.
     First, I cannot configure a date picker with time on this timestamp field (only date); it does not matter what I configure in the entity data control XML, it does not work.
     Displaying the timestamp field in an inputData (result table) and showing the time does not work either.
     When I use a BETWEEN query operator on this date or timestamp field in the query panel, I get this error.
    Caused by: java.lang.IllegalArgumentException: An exception occurred while creating a query in EntityManager:
    Exception Description: Syntax error parsing the query [SELECT COUNT(o) FROM RunMessages o WHERE (o.processDate BETWEEN '2009-12-11' AND '2009-12-12')], line 1, column 56: syntax error at [BETWEEN].
    Internal Exception: MismatchedTokenException(11!=82)
         at org.eclipse.persistence.internal.jpa.EntityManagerImpl.createQuery(EntityManagerImpl.java:1241)
    <BeanDataCollection><invokeMethod> Exception occurred invoking $Proxy179.queryByRange
    java.lang.reflect.InvocationTargetException
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at oracle.adf.model.adapter.bean.provider.BeanDataCollection.invokeMethod(BeanDataCollection.java:405)
         at oracle.adf.model.adapter.bean.jpa.JPQLBeanDataCollection.getRecordCount(JPQLBeanDataCollection.java:164)
         at oracle.adf.model.adapter.bean.provider.BeanDataCollection.init(BeanDataCollection.java:153)
         at oracle.adf.model.adapter.bean.jpa.JPQLBeanDataCollection.init(JPQLBeanDataCollection.java:110)
         at oracle.adf.model.adapter.bean.provider.BeanDataCollection.refresh(BeanDataCollection.java:380)
         at oracle.adf.model.adapter.bean.provider.BeanDataProvider.getDataProvider(BeanDataProvider.java:63)
         at oracle.adf.model.adapter.bean.DataFilterHandler.invokeAccessor(DataFilterHandler.java:137)
         at oracle.adf.model.adapter.bean.BeanFilterableDataControl.invokeAccessor(BeanFilterableDataControl.java:78)
         at oracle.adf.model.bean.DCBeanDataControl.invokeAccessor(DCBeanDataControl.java:447)
         at oracle.adf.model.bean.DCDataVO$DCAccessorCollectionAdapter.getDataProvider(DCDataVO.java:2627)
         at oracle.adf.model.bean.DCDataVO$DCAccessorCollectionAdapter.refreshIterator(DCDataVO.java:2519)
         at oracle.adf.model.bean.DCDataVO.executeQueryForCollection(DCDataVO.java:419)
         at oracle.jbo.server.ViewRowSetImpl.execute(ViewRowSetImpl.java:1130)
         at oracle.jbo.server.ViewRowSetImpl.executeQueryForMasters(ViewRowSetImpl.java:1299)
         at oracle.jbo.server.ViewRowSetImpl.executeQueryForMode(ViewRowSetImpl.java:1217)
         at oracle.jbo.server.ViewRowSetImpl.executeQuery(ViewRowSetImpl.java:1211)
         at oracle.jbo.server.ViewObjectImpl.executeQuery(ViewObjectImpl.java:6097)
         at oracle.adf.model.bean.DCBeanDataControl.executeIteratorBinding(DCBeanDataControl.java:943)
         at oracle.adf.model.binding.DCIteratorBinding.doExecuteQuery(DCIteratorBinding.java:2147)
         at oracle.jbo.uicli.binding.MyIteratorBinding.executeQuery(JUAccessorIteratorDef.java:717)
         at oracle.jbo.uicli.binding.JUSearchBindingCustomizer.applyAndExecuteViewCriteria(JUSearchBindingCustomizer.java:598)
         at oracle.adfinternal.view.faces.model.binding.FacesCtrlSearchBinding.processQuery(FacesCtrlSearchBinding.java:424)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at com.sun.el.parser.AstValue.invoke(AstValue.java:157)
         at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:283)
         at org.apache.myfaces.trinidad.component.UIXComponentBase.broadcastToMethodExpression(UIXComponentBase.java:1289)
         at oracle.adf.view.rich.component.UIXQuery.broadcast(UIXQuery.java:115)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.broadcastEvents(LifecycleImpl.java:812)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:292)
         at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:177)
         at javax.faces.webapp.FacesServlet.service(FacesServlet.java:265)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:191)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:97)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
         at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:247)
         at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:157)
         at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:326)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3592)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2202)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2108)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1432)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    Caused by: javax.ejb.EJBException: EJB Exception: ; nested exception is:
         java.lang.IllegalArgumentException: An exception occurred while creating a query in EntityManager:
    Exception Description: Syntax error parsing the query [SELECT COUNT(o) FROM RunMessages o WHERE (o.processDate BETWEEN '2009-12-11' AND '2009-12-12')], line 1, column 56: syntax error at [BETWEEN].
    Internal Exception: MismatchedTokenException(11!=82); nested exception is: java.lang.IllegalArgumentException: An exception occurred while creating a query in EntityManager:
    Exception Description: Syntax error parsing the query [SELECT COUNT(o) FROM RunMessages o WHERE (o.processDate BETWEEN '2009-12-11' AND '2009-12-12')], line 1, column 56: syntax error at [BETWEEN].
    Internal Exception: MismatchedTokenException(11!=82)
         at weblogic.ejb.container.internal.RemoteBusinessIntfProxy.unwrapRemoteException(RemoteBusinessIntfProxy.java:109)
         at weblogic.ejb.container.internal.RemoteBusinessIntfProxy.invoke(RemoteBusinessIntfProxy.java:91)
         at $Proxy179.queryByRange(Unknown Source)
         ... 65 more
    Caused by: java.lang.IllegalArgumentException: An exception occurred while creating a query in EntityManager:
    Exception Description: Syntax error parsing the query [SELECT COUNT(o) FROM RunMessages o WHERE (o.processDate BETWEEN '2009-12-11' AND '2009-12-12')], line 1, column 56: syntax error at [BETWEEN].
    Internal Exception: MismatchedTokenException(11!=82)
         at org.eclipse.persistence.internal.jpa.EntityManagerImpl.createQuery(EntityManagerImpl.java:1241)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at weblogic.deployment.BasePersistenceContextProxyImpl.invoke(BasePersistenceContextProxyImpl.java:93)
         at weblogic.deployment.TransactionalEntityManagerProxyImpl.invoke(TransactionalEntityManagerProxyImpl.java:91)
         at weblogic.deployment.BasePersistenceContextProxyImpl.invoke(BasePersistenceContextProxyImpl.java:80)
         at weblogic.deployment.TransactionalEntityManagerProxyImpl.invoke(TransactionalEntityManagerProxyImpl.java:26)
         at $Proxy175.createQuery(Unknown Source)
         at nl.tennet.mhs.console.model.services.MhsConsoleBean.queryByRange(MhsConsoleBean.java:32)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at com.bea.core.repackaged.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:310)
         at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
         at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
         at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
         at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
         at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
         at com.bea.core.repackaged.springframework.jee.spi.MethodInvocationVisitorImpl.visit(MethodInvocationVisitorImpl.java:37)
         at weblogic.ejb.container.injection.EnvironmentInterceptorCallbackImpl.callback(EnvironmentInterceptorCallbackImpl.java:55)
         at com.bea.core.repackaged.springframework.jee.spi.EnvironmentInterceptor.invoke(EnvironmentInterceptor.java:50)
         at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
         at com.bea.core.repackaged.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:89)
         at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
         at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
         at com.bea.core.repackaged.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
         at com.bea.core.repackaged.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
         at com.bea.core.repackaged.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
         at $Proxy181.queryByRange(Unknown Source)
         at nl.tennet.mhs.console.model.services.MhsConsole_ssug8i_MhsConsoleImpl.queryByRange(MhsConsole_ssug8i_MhsConsoleImpl.java:218)
         at nl.tennet.mhs.console.model.services.MhsConsole_ssug8i_MhsConsoleImpl_WLSkel.invoke(Unknown Source)
         at weblogic.rmi.internal.ServerRequest.sendReceive(ServerRequest.java:174)
         at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:345)
         at weblogic.rmi.cluster.ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259)
         at nl.tennet.mhs.console.model.services.MhsConsole_ssug8i_MhsConsoleImpl_1032_WLStub.queryByRange(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:597)
         at weblogic.ejb.container.internal.RemoteBusinessIntfProxy.invoke(RemoteBusinessIntfProxy.java:73)
         ... 66 more

    what happens if you do the following on all three different environments:
    SQL> select to_char(exp_date, 'dd-mon-yyyy hh24:mi:ss') from your_table where your_condition ;
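     Separately from checking how the dates are stored, the JPQL parse error above comes from the date values being inlined into the query string as literals. One common way around that kind of error, shown here only as a hedged sketch (the entity and field names are taken from the error message; the surrounding method is illustrative), is to bind the range as query parameters:

     // Hedged sketch: bind the date range as parameters instead of concatenating
     // date strings into the JPQL. Entity/field names follow the error message above.
     import javax.persistence.EntityManager;
     import javax.persistence.TemporalType;
     import javax.persistence.TypedQuery;
     import java.util.Date;

     public class RangeCountQuery {
         public Long countInRange(EntityManager em, Date from, Date to) {
             TypedQuery<Long> q = em.createQuery(
                 "SELECT COUNT(o) FROM RunMessages o WHERE o.processDate BETWEEN :fromDate AND :toDate",
                 Long.class);
             q.setParameter("fromDate", from, TemporalType.TIMESTAMP);
             q.setParameter("toDate", to, TemporalType.TIMESTAMP);
             return q.getSingleResult();
         }
     }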

  • Oracle/java timestamp data type error.. please help

     I'm having trouble with a SQL query of mine. I'm trying to get a timestamp from an Oracle database using JDBC, but I get the following error:
     java.sql.SQLException: ORA-00932: inconsistent datatypes: expected %s got %s
     Now, when I put the java.util.timestamp into the DB it appears to change.
     When I print out the value of the timestamp in my console it prints the following:
     07-06-26 17:28:09.414
     When I check the value from SQL*Plus it gives me this value:
     07-06-26 17:28:09,000000
     Does anyone have any idea why this happens? And, more importantly, how can I retrieve the timestamp again?
     I'm using BEA Workshop, which means I don't have any means of manipulating the data before presenting it to the front end.
     The database is Oracle 9.2.0.1

    user582245,
    Please provide the following:
    1. Entire error message and stack trace you are getting.
    2. Part of your code where the error occurs.
    3. JDK version you are using.
    4. Oracle data-type of the problematic column.
    Good Luck,
    Avi.

  • Timestamp/Date format error with Java 1.6

     I'm getting this error when trying to get objects from a ResultSet query on an Oracle Lite 10g table that has columns of the TIMESTAMP or DATE type. This works fine under Java 1.5. Java 1.6 seems to have broken TIMESTAMP/DATE compatibility with Oracle Lite. Are there any plans to make 10g compatible with Java 1.6? We would like to port our application from Java 1.5 to 1.6, but this is an obstacle. I suppose one work-around would be to use TO_CHAR on all the DATE fields and convert them back to Java Dates programmatically, but that would be a hassle.
    Update: I changed the column types of the table from TIMESTAMP to DATE. The same exception occurs when calling POLJDBCResultSet.getObject() on the DATE columns. Again, this works fine under Java 1.5, but not 1.6.
    java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
         at java.sql.Timestamp.valueOf(Timestamp.java:194)
         at oracle.lite.poljdbc.LiteEmbResultSet.jniGetDataTimestamp(Native Method)
         at oracle.lite.poljdbc.LiteEmbResultSet.getVal(Unknown Source)
         at oracle.lite.poljdbc.POLJDBCResultSet.getTimestamp(Unknown Source)
         at oracle.lite.poljdbc.POLJDBCResultSet.getObject(Unknown Source)
         at oracle.lite.poljdbc.POLJDBCResultSet.getObject(Unknown Source)

     I just found a pretty easy Java work-around for this that doesn't involve changing the table column types or the SQL:
    Check the column type from the ResultSetMetaData before calling ResultSet.getObject(). If the type is DATE, TIMESTAMP or TIME, call ResultSet.getString() which doesn't throw an exception. Then convert the string to a java.util.Date using java.text.SimpleDateFormat. That can then be used to instantiate a Timestamp.
    This seems to work.
    Message was edited by:
    user490596
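     A minimal sketch of the work-around described above, assuming the driver's string form matches a "yyyy-MM-dd HH:mm:ss" pattern (adjust the format to whatever getString() actually returns):

     import java.sql.ResultSet;
     import java.sql.ResultSetMetaData;
     import java.sql.SQLException;
     import java.sql.Timestamp;
     import java.sql.Types;
     import java.text.ParseException;
     import java.text.SimpleDateFormat;
     import java.util.Date;

     public class DateColumnWorkaround {
         // Read column i, routing DATE/TIME/TIMESTAMP columns through getString()
         // and parsing the text ourselves instead of letting the driver build a Timestamp.
         public static Object readColumn(ResultSet rs, int i) throws SQLException, ParseException {
             ResultSetMetaData md = rs.getMetaData();
             int type = md.getColumnType(i);
             if (type == Types.DATE || type == Types.TIMESTAMP || type == Types.TIME) {
                 String text = rs.getString(i);
                 if (text == null) {
                     return null;
                 }
                 // Assumed pattern; change it to match the driver's actual string output.
                 Date parsed = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(text);
                 return new Timestamp(parsed.getTime());
             }
             return rs.getObject(i);
         }
     }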

  • Continuous period, overwritten error

     I am working on a project counting every photon for about 10 seconds.
     I am using a PCI-6602 and a computer with a 2.8 GHz Core i5-2300 CPU and 16 GB of RAM.
     The software is Measurement Studio 2010 SP1 and NI-DAQmx 9.5.
     But I get the following error:
     Data was overwritten before it could be read by the system.
     If Data Transfer Mechanism is Interrupts, try using DMA. Otherwise, divide the input signal before taking the measurement.
     For an average 300 kHz input signal the error appears immediately, but for an average 1 kHz input signal the error still appears after 3-5 seconds.
     Here are my settings:
     myTask.CIChannels.CreatePeriodChannel(counterComboBox.Text, "",
         Convert.ToDouble(minValueTextBox.Text),
         Convert.ToDouble(maxValueTextBox.Text), edge,
         CIPeriodMeasurementMethod.LowFrequencyOneCounter, 0.001, 4, CIPeriodUnits.Seconds);
     myTask.Timing.ConfigureImplicit(SampleQuantityMode.ContinuousSamples, 100000);
     myTask.CIChannels.All.DataTransferMechanism = CIDataTransferMechanism.Dma;
     myTask.CIChannels.All.DuplicateCountPrevention = false;
     myTask.Stream.ReadWaitMode = ReadWaitMode.Poll;
     myTask.Stream.ConfigureInputBuffer(1000000);
     I have tried many solutions and changed many settings, but the result didn't change.
     I have heard that the onboard memory of the PCI-6602 is small, but even for a 1 kHz signal it still can't keep up.
     Please help me solve this problem.
     Thanks very much.

     1. The basic idea is that instead of measuring the period between each individual photon-detector pulse, you simply count the # of pulses occurring within a fixed interval.  The fixed interval would be a constant-rate sample clock, and you could use another counter to create it.  You may need to look into the "Duplicate Count Prevention" property because you will want to capture the many intervals in which 0 pulses occur.
     2. It's impossible for me to judge what resolution or format you need for your data.  In general, count-binning will give you a measurement of average photon activity per sampling interval.  If you get 10 counts in 1 msec, you'll only be able to see that as an *average* of 10 kHz (see the short sketch below); reality may be 10 pulses at > 100 kHz and a lot of idle time.
       Frequency or period measurement on individual pulses gives you much more information about photon activity, and you can also do some binning-like averaging after the fact.  But the occasional fast burst may cause a data acquisition error.  The high precision trades off against the chance of losing all the data to such an error.
     If it were me, I'd lobby for a more suitable data acquisition board like an X Series and measure individual pulses.  But it may be that your particular experiments & studies don't demand that precision, in which case binning is a method that should work with the board you've already got.
    -Kevin P
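     A trivial sketch of the count-binning arithmetic described above (the bin width and counts are made-up numbers; the average rate per bin is simply the count divided by the bin width, so 10 counts in a 1 ms bin reads as an average of 10 kHz no matter how the pulses were actually spaced):

     public class CountBinning {
         public static void main(String[] args) {
             double binWidthSeconds = 0.001;          // assumed 1 ms sampling interval
             int[] countsPerBin = {0, 10, 3, 0, 7};   // made-up counts from the counter task
             for (int i = 0; i < countsPerBin.length; i++) {
                 double averageHz = countsPerBin[i] / binWidthSeconds;
                 System.out.printf("bin %d: %d pulses -> average %.0f Hz%n", i, countsPerBin[i], averageHz);
             }
         }
     }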

  • Timestamp column loading error

     hi,
     I have a table as below:
    CREATE TABLE DBMG_SAL_HEAD (
    CM_DATE DATE,
    CM_TIME VARCHAR2(8),
     The data in the table is:
    CM_DATE     CM_TIME
    10/20/2010 13:09:06
     Now I need to insert the above data into the table below with the time one second earlier, e.g. 10/20/2010 13:09:05.
    CREATE TABLE PC (
    COL1 TIMESTAMP(9)
     How can I do it?
     I tried the query below, but when I insert into the table it's not showing the correct format. I'm using Oracle 11g.
    select TO_DATE( TO_CHAR(TO_DATE( CM_DATE||' '||CM_TIME,'dd-mm-yyyy hh24:mi:ss'),'dd-mm-yyyy hh24:mi:')||
              TO_CHAR(TO_CHAR(TO_DATE( CM_TIME,'hh24:mi:ss'),'ss')-1) ,'dd-mm-yyyy hh24:mi:ss') cr_date
    FROM DBMG_SAL_HEAD ;
     The output is correct, as I need:
    10/20/0010 1:09:05 PM
     Now I'm inserting:
    insert into pc
    select TO_DATE( TO_CHAR(TO_DATE( CM_DATE||' '||CM_TIME,'dd-mm-yyyy hh24:mi:ss'),'dd-mm-yyyy hh24:mi:')||
              TO_CHAR(TO_CHAR(TO_DATE( CM_TIME,'hh24:mi:ss'),'ss')-1) ,'dd-mm-yyyy hh24:mi:ss') cr_date
    FROM DBMG_SAL_HEAD ;
    commit;
    select * from pc;
     The output is:
     20-OCT-10 0
     Why is the data coming like this? I need it in the 10/20/0010 1:09:05 PM format. Please help me fix it!
    rgds,
    pc

    PC wrote:
     hi,
     I have a table as below:
    CREATE TABLE DBMG_SAL_HEAD (
    CM_DATE DATE,
    CM_TIME VARCHAR2(8),
    Bad design from the get-go. DATE data holds both date and time. No justification for having a separate column for time only.
     The data in the table is:
    CM_DATE     CM_TIME
    10/20/2010 13:09:06
     Now I need to insert the above data into the table below with the time one second earlier, e.g. 10/20/2010 13:09:05.
    CREATE TABLE PC (
    COL1 TIMESTAMP(9)
     How can I do it?
     I tried the query below, but when I insert into the table it's not showing the correct format. I'm using Oracle 11g.
    select TO_DATE( TO_CHAR(TO_DATE( CM_DATE||' '||CM_TIME,'dd-mm-yyyy hh24:mi:ss'),'dd-mm-yyyy hh24:mi:')||
              TO_CHAR(TO_CHAR(TO_DATE( CM_TIME,'hh24:mi:ss'),'ss')-1) ,'dd-mm-yyyy hh24:mi:ss') cr_date
    FROM DBMG_SAL_HEAD ;
     The output is correct, as I need:
    10/20/0010 1:09:05 PM
     Now I'm inserting:
    insert into pc
    select TO_DATE( TO_CHAR(TO_DATE( CM_DATE||' '||CM_TIME,'dd-mm-yyyy hh24:mi:ss'),'dd-mm-yyyy hh24:mi:')||
              TO_CHAR(TO_CHAR(TO_DATE( CM_TIME,'hh24:mi:ss'),'ss')-1) ,'dd-mm-yyyy hh24:mi:ss') cr_date
    FROM DBMG_SAL_HEAD ;
    commit;
    select * from pc;
     The output is:
     20-OCT-10 0
     Why is the data coming like this? I need it in the 10/20/0010 1:09:05 PM format. Please help me fix it!
     You fix it.
    You need to understand that DATE and TIMESTAMP are stored in oracle's internal format. They are presented in a query with the formatting specified by the controlling setting of your NLS parameters. That's what to_char and to_date are all about.
    ============================================================================
     If the column is defined as a DATE, then the data is NOT stored in the format you stated. Oracle always stores dates in its own internal format.
     What you are referring to is the character-string representation of the data. That presentation is controlled by the controlling setting of NLS_DATE_FORMAT. The reason I say the "controlling" setting is that NLS_DATE_FORMAT can be set in any of several different locations, each with its own scope of influence. That being the case, you should never depend on a setting outside of your own control. That means you will always do one of the following:
    1) ALTER SESSION SET NLS_DATE_FORMAT='whatever format mask you want';
    or
    2) Proper use of TO_CHAR and TO_DATE at the individual sql statement
    I prefer the second, so there is never any ambiguity or question when looking at an individual SQL statement.
    Just to drive home the point:
    SQL> create table mytable (mydate date);
    Table created.
    SQL> insert into mytable values (to_date('09-15-2010','mm-dd-yyyy'));
    1 row created.
    SQL> insert into mytable values (to_date('20091225','yyyymmdd'));
    1 row created.
    SQL> insert into mytable values (to_date('1970-06-25 15:45:37','yyyy-mm-dd hh24:mi:ss'));
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select * from mytable;
    MYDATE
    15-SEP-10
    25-DEC-09
    25-JUN-70
    3 rows selected.
    SQL> select to_char(mydate,'yyyymmdd') from mytable;
    TO_CHAR(
    20100915
    20091225
    19700625
    3 rows selected.
    SQL> select to_char(mydate,'mmMON-dd-yy hh:mi') from mytable;
    TO_CHAR(MYDATE,'MMM
    09  SEP-15-10 12:00
    12  DEC-25-09 12:00
    06  JUN-25-70 03:45
    3 rows selected.
    SQL> select to_char(mydate,'MON-dd-yyyy hh24:mi') from mytable;
    TO_CHAR(MYDATE,'M
    SEP-15-2010 00:00
    DEC-25-2009 00:00
    JUN-25-1970 15:45
    3 rows selected.
     SQL>
     Through any of the above, was the format of the stored data ever changed?
    rgds,
    pc

  • DSA Input buffer overwritten error

    Hi All,
     I am using five PXI-4472s and one FPGA 7833R. I need to perform continuous acquisition on all 40 DSA and 8 FPGA channels at a maximum of 51.2 kSamples/second and 102,400 samples/block.
     In my code I have a while loop which acquires DSA and FPGA data and writes them to file. At 51.2 kSamples/s and 102,400 samples/block, the overall while-loop period is between 2000 and 2100 milliseconds, meaning the file write and other calculations are completing within 100 ms. I have assigned a buffer size of 1 MSamples/channel for the DSA.
     I believe that even if the PC RAM is larger, DSA only allows a 1 MSample/channel buffer to be allocated.
     I have also assigned 1.7 MB of host memory for the FPGA DMA transfer.
     When I acquire data simultaneously from the FPGA and DSA in a single while loop, the acquisition runs fine for some 15 minutes, after which the DSA gives a DAQ channels overwritten error.
     But when I bypass the FPGA acquisition, no DAQ error is generated and the acquisition runs fine.
     I have upgraded LabVIEW to 8.0.1.
    Please share thoughts and experiences to get this solved.
    Thanks,
    Sudha

    The first thing I would try would be to update your RIO version to RIO 2.0.1.
     http://digital.ni.com/softlib.nsf/websearch/12E9CA0820A192F08625714F005A8B0C?opendocument&node=13206...
    Note this is an update to RIO 2.0 not a full RIO installation.  Also note that this is not RIO 2.0.2.  I do not think you need RIO 2.0.2.
     Reasoning:  Reading data from the FPGA DMA channel is a blocking method when using RIO 2.0.  That means it will consume 100% of the CPU while trying to get the data.  This is obviously fast, but it can starve other high-priority operations like reading other DAQ channels.  RIO 2.0.1 allows the Read to sleep while waiting for the data to arrive.
    This may not entirely solve your problem, but it is a start and the update should definitely help a little.
    Regards,
    Joseph D.
    National Instruments

  • Problem Event Name:     APPCRASH   Application Name:     iTunes.exe   Application Version:     11.1.4.62   Application Timestamp:     52ddbf7a   Fault Module Name:     StackHash_1025   Fault Module Version:     6.0.6002.18881   Timestamp

     Problem Event Name:    APPCRASH
       Application Name:    iTunes.exe
       Application Version:    11.1.4.62
       Application Timestamp:    52ddbf7a
       Fault Module Name:    StackHash_1025
       Fault Module Version:    6.0.6002.18881
       Fault Module Timestamp:    51da3e27
       Exception Code:    c0000374
       Exception Offset:    000b06fc
       OS Version:    6.0.6002.2.2.0.256.6
       Locale ID:    1040
       Additional Information 1:    1025
       Additional Information 2:    491df5c84464b99ef8f61e542b4c9ce4
       Additional Information 3:    9ade
       Additional Information 4:    7f7b00bf8397fb6909813f504d7de648

     Problem Event Name:    APPCRASH
       Application Name:    Adobe Premiere Pro.exe
       Application Version:    5.0.3.0
       Application Timestamp:    4ce382d1
       Fault Module Name:    StackHash_8659
       Fault Module Version:    6.1.7601.18247
       Fault Module Timestamp:    521eaf24
       Exception Code:    c0000374
       Exception Offset:    00000000000c4102
       OS Version:    6.1.7601.2.1.0.256.48
       Locale ID:    1040
       Additional Information 1:    8659
       Additional Information 2:    865914f2b0aba0a7f763eb3f160d9c9b
       Additional Information 3:    5def
       Additional Information 4:    5defc8630ba14d9bda1ae72188ae0e03@
     This started when I installed Pro-Dad's Heroglift plug-in.
     Until now, with Magic Bullet, New Blue, ProDad Vitascene and Boris installed, it hadn't given me major problems.
     Has anyone had the same sad experience? Thanks, and I hope someone can help me.

  • Error while scheduling a Hyperion Interactive Report

    Hi,
     I'm trying to run a scheduled Hyperion Interactive Report (version 9.3.1). The report is scheduled to run on the 1st of every month. The problem occurs when a user tries to run the scheduled report on demand.
    The error log states as follows:
    $ view server_messages_IRJobService.log
    "server_messages_IRJobService.log" [Read only] 38 lines, 2688 characters
     <event logger="com.brio.one.services.bqservice.SERVICE" method="ThrID(7) Logger(ZDbgPrint)" timestamp="1247269603142" level="ERROR" thread="[ORB=_it_orb_id_1,Pool=1]::id-6" sequence_no="74">
     <time>10 Jul 2009 16:46:43,142</time>
     <context subject="TIPAdmin" session_id="zRMz99t6-00000122670ef701-0000-8197-0a3f0708" originator_type="IRJobService" originator_name="InteractiveReportingService" host="tsbrit02">
     <info type="RESOURCE">IBQServiceImpl::runJob Job Identifier: TIPPFSMRPT Service Name: JF1_tsbrit02 Cycle Name: Cycle_0</info>
     <info type="RESOURCE_ID">000001225ef102a5-0000-8197-0a3f0708</info>
     </context>
     <message><![CDATA[TCApp::ExecuteJavaScript failed: ]]></message>
     </event>
     But when the report is uploaded and the user runs it directly, it runs without any problem.
     The report has 3 filters (drop-downs) at runtime.
    Could anyone please help me in this regard.
    Regards,
    Sadiq

     Check your Discoverer user role using:
     select granted_role from dba_role_privs where grantee=upper('<username>');
     from your database, i.e. the database to which the desktop is connecting.
     Check for the CONNECT, RESOURCE and multi-org roles.

  • Many db objects in error after upgrade from EBS 12.1.1 to EBS 12.1.3

    Hi,
     recently we upgraded our EBS from 12.1.1 to 12.1.3, and there are almost 777 invalid objects remaining after recompiling using adadmin.
     I tried to recompile the INVALID objects using utlrp.sql, but I am getting the error below.
    SQL> @utlrp.sql
    SELECT dbms_registry_sys.time_stamp('utlrp_bgn') as timestamp from dual
    ERROR at line 1:
    ORA-00904: "DBMS_REGISTRY_SYS"."TIME_STAMP": invalid identifier
    DOC> The following PL/SQL block invokes UTL_RECOMP to recompile invalid
    DOC> objects in the database. Recompilation time is proportional to the
    DOC> number of invalid objects in the database, so this command may take
    DOC> a long time to execute on a database with a large number of invalid
    DOC> objects.
    DOC>
    DOC> Use the following queries to track recompilation progress:
    DOC>
    DOC> 1. Query returning the number of invalid objects remaining. This
    DOC> number should decrease with time.
    DOC> SELECT COUNT(*) FROM obj$ WHERE status IN (4, 5, 6);
    DOC>
    DOC> 2. Query returning the number of objects compiled so far. This number
    DOC> should increase with time.
    DOC> SELECT COUNT(*) FROM UTL_RECOMP_COMPILED;
    DOC>
    DOC> This script automatically chooses serial or parallel recompilation
    DOC> based on the number of CPUs available (parameter cpu_count) multiplied
    DOC> by the number of threads per CPU (parameter parallel_threads_per_cpu).
    DOC> On RAC, this number is added across all RAC nodes.
    DOC>
    DOC> UTL_RECOMP uses DBMS_SCHEDULER to create jobs for parallel
    DOC> recompilation. Jobs are created without instance affinity so that they
    DOC> can migrate across RAC nodes. Use the following queries to verify
    DOC> whether UTL_RECOMP jobs are being created and run correctly:
    DOC>
    DOC> 1. Query showing jobs created by UTL_RECOMP
    DOC> SELECT job_name FROM dba_scheduler_jobs
    DOC> WHERE job_name like 'UTL_RECOMP_SLAVE_%';
    DOC>
    DOC> 2. Query showing UTL_RECOMP jobs that are running
    DOC> SELECT job_name FROM dba_scheduler_running_jobs
    DOC> WHERE job_name like 'UTL_RECOMP_SLAVE_%';
    DOC>#
    DECLARE
    ERROR at line 1:
    ORA-04067: not executed, package body "APPS.UTL_RECOMP" does not exist
    ORA-06508: PL/SQL: could not find program unit being called: "APPS.UTL_RECOMP"
    ORA-06512: at line 4
    SELECT dbms_registry_sys.time_stamp('utlrp_end') as timestamp from dual
    ERROR at line 1:
    ORA-00904: "DBMS_REGISTRY_SYS"."TIME_STAMP": invalid identifier
    PL/SQL procedure successfully completed.
    DOC> The following query reports the number of objects that have compiled
    DOC> with errors (objects that compile with errors have status set to 3 in
    DOC> obj$). If the number is higher than expected, please examine the error
    DOC> messages reported with each object (using SHOW ERRORS) to see if they
    DOC> point to system misconfiguration or resource constraints that must be
    DOC> fixed before attempting to recompile these objects.
    DOC>#
    select COUNT(*) "OBJECTS WITH ERRORS" from obj$ where status = 3
    ERROR at line 1:
    ORA-00942: table or view does not exist
    DOC> The following query reports the number of errors caught during
    DOC> recompilation. If this number is non-zero, please query the error
    DOC> messages in the table UTL_RECOMP_ERRORS to see if any of these errors
    DOC> are due to misconfiguration or resource constraints that must be
    DOC> fixed before objects can compile successfully.
    DOC>#
    select COUNT(*) "ERRORS DURING RECOMPILATION" from utl_recomp_errors
    ERROR at line 1:
    ORA-00942: table or view does not exist
    DECLARE
    ERROR at line 1:
    ORA-00942: table or view does not exist
    ORA-06512: at line 31
    BEGIN dbms_registry_sys.validate_components; END;
    ERROR at line 1:
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DBMS_REGISTRY_SYS.VALIDATE_COMPONENTS' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
     Please advise on how to proceed.
    Thanks in advance,
    Nish

    950358 wrote:
    Thanks Husain,
     I have run utlrp.sql but there are still 777 invalid objects in my system.
    Please advise.
    Thanks,
     Nish
     What is the error you get when you compile those objects?
    How many invalid objects you have under each schema?
    Please see old threads for the invalid objects MOS docs (after R12 upgrade and generic ones).
    https://forums.oracle.com/forums/search.jspa?threadID=&q=Invalid+AND+Objects+AND+12.1.3&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    https://forums.oracle.com/forums/search.jspa?threadID=&q=Invalid+AND+Objects+AND+R12&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Thanks,
    Hussein

  • Error in BI publisher report for OIM

    Hi all!
     Can you help me?
     I know that this error is quite common, but I've tried all the solutions and still can't get the result.
     I've installed BI Publisher and attached the OIM reports there. All the reports work properly except those which have date properties, for example Account Activity in Resource. All of them return an ORA-01843 error ("not a valid month").
     Let me explain the situation more clearly.
     There are two fields using dates in the report creation menu:
     Date Range From;
     Date Range To.
     They automatically get a value like "12/06/2011 05:05:05" from their default values ({$SYSDATE()$}).
     But if I use these values, I get an error.
     I've found two ways of solving the problem: 1) changing the date to something like "12/Jan/2011" manually (note that by default the month comes first, and to solve the problem I have to write it second and NOT in digits); 2) changing the date mask in the report properties to dd/MMM/yyyy.
     But that's not the best way to solve the problem. In that case I would have to change these properties everywhere, or the user would have to (and this is bad :) ). I'm sure these reports should work without such modifications, so the error arises on my side.
     What I've done:
     1) I tried changing nls_date_format, nls_language, nls_territory and timestamp_format in the database session to different formats.
     2) I've tried making all the changes described above.
     3) I've tried changing the UI language and report locales (if it's important, I use the Russian language and the Russian report locale). The xlf files are all OK.
     What shall I do to fix this error?
     My db is: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
     My BI Publisher is: 10.1.3.4 (maybe everything is bad because of the version, but it doesn't seem to be so, because every report except the date reports works fine).
     Thank you very much, and I look forward to hearing from you.

     Dewan.Rajiv wrote:
     I don't know what issue you are facing due to language.
     Have you tried changing the query of the report?
     Yes, I have.
     By the way, I've discovered that the error appears in this line:
     AND myvariablewithdate BETWEEN :p_date_DateFromRange AND :p_date_DateToRange
     If I comment out such lines in the query, everything works OK.
     But as you understand, that's not the point :(
     Even if I put information from the SQL database into the data forms (just copying the timestamp data), the error continues to appear.
     If I do something like this in SQL Developer, everything is OK:
     select upa_resource.upa_res_eff_from_date from upa_resource where upa_res_eff_from_date between (select sysdate - 30 from dual) and (select sysdate from dual)
     This statement returns the right values.
     So the problem is somewhere in this chain:
     - select sysdate from dual (it's OK)
     - put sysdate into the form (does it do any type of conversion?)
     - take the variable (the former sysdate) from the form.

  • ORA-00001: unique constraint error..

    Hi There,
     We were trying to do an insert when we started getting an ORA-00001: unique constraint error. To speed up our testing we decided to disable all the constraints on the table; however, we are still having the same issue.
     How can we resolve this, please?
    SQL> select constraint_name,constraint_type,status from dba_constraints where table_name='MEMBER_LATEST';
    CONSTRAINT_NAME                C STATUS
    MEMBER_LATEST_PK               P DISABLED
    SYS_C0017577                   C DISABLED
    SYS_C0017576                   C DISABLED
    SYS_C0017575                   C DISABLED
    SYS_C0017574                   C DISABLED
    SYS_C0017573                   C DISABLED
    SYS_C0017572                   C DISABLED
    SYS_C0017571                   C DISABLED
    SYS_C0017570                   C DISABLED
    MEMBER_LATEST_FK               R DISABLED
    10 rows selected.
    SQL>
    SQL>
    SQL>     INSERT INTO MEMBER_LATEST (DIS_ID, TIMESTAMP, LAST_NAME, FIRST_NAME, MIDDLE_NAME, DIS_COUNT)
      2    SELECT DIS_ID, 'TEST', LAST_NAME, FIRST_NAME, MIDDLE_NAME, 0
      3    FROM MV_DIS_MEM, MV_DIS_COUNT
      4    WHERE MV_DIS_MEM.P_CODE =  MV_DIS_COUNT.P_CODE
      5    ORDER BY 1,3,4;
        INSERT INTO MEMSCH.MEMBER_LATEST (DIS_ID, TIMESTAMP, LAST_NAME, FIRST_NAME,
    ERROR at line 1:
    ORA-00001: unique constraint (MEMSCH.MEMBER_LATEST_PK) violated
     SQL>
     Anything else we can do, please?
    Thanks

    rsar001 wrote:
     but isn't the unique index constraint part of the disabled constraints on the table as shown above?
     Not if the index used by the PK was created separately, prior to the PK:
    SQL> create table emp1 as select * from emp;
    Table created.
    SQL> alter table emp1
      2  add constraint emp1_pk
      3  primary key(empno);
    Table altered.
    SQL> insert into emp1 select * from emp;
    insert into emp1 select * from emp
    ERROR at line 1:
    ORA-00001: unique constraint (SCOTT.EMP1_PK) violated
    SQL> alter table emp1 disable primary key;
    Table altered.
    SQL> insert into emp1 select * from emp;
    14 rows created.
    SQL> rollback;
    Rollback complete.
    SQL> alter table emp1 drop primary key;
    Table altered.
    SQL> create unique index emp1_pk on emp1(empno);
    Index created.
    SQL> alter table emp1
      2  add constraint emp1_pk
      3  primary key(empno)
      4  using index emp1_pk;
    Table altered.
    SQL> insert into emp1 select * from emp;
    insert into emp1 select * from emp
    ERROR at line 1:
    ORA-00001: unique constraint (SCOTT.EMP1_PK) violated
    SQL> alter table emp1 disable primary key;
    Table altered.
    SQL> insert into emp1 select * from emp;
    insert into emp1 select * from emp
    ERROR at line 1:
    ORA-00001: unique constraint (SCOTT.EMP1_PK) violated
     SQL>
     But by dropping the index you are simply delaying the issue. Yes, you will be able to insert, but what then? You will not be able to recreate the PK - the same violation error will be raised.
    SY.

  • Error installing zabo 6.5 sp4 on Windows Vista

     When we try to install ZABO 6.5 SP4 on Windows Vista we get the following error:
     Problem signature:
     Problem Event Name:           APPCRASH
     Application Name:             iexplore.exe
     Application Version:          7.0.6000.16473
     Application Timestamp:        46296d48
     Fault Module Name:            StackHash_8a90
     Fault Module Version:         6.0.6000.16386
     Fault Module Timestamp:       4549bdc9
     Exception Code:               c0000374
     Exception Offset:             000af1c9
     OS Version:                   6.0.6000.2.0.0.256.6
     Locale ID:                    1040
     Additional Information 1:     8a90
     Additional Information 2:     1b8ee1960dafacfcb7a25dc1d907e2bf
    and reports don't run.

    hi
     the point is that even if you somehow get this to work, your customer will be working with a version which is no longer supported, in an environment which was NEVER supported. Your customer must be aware of the risks that result from such a decision.
     I would recommend either upgrading your BO version or downgrading the Windows clients. A solution may be a terminal server installation of the BO client.
     Sorry that I cannot provide you with a solution, but you have to be aware of the risks you are facing.
    regards
    stratos

  • The system cannot find the path specified.---- Error code:-2147467259 Error code name:failed

    Friends,
     I'm facing the issue below while accessing a Crystal Report. I heard it might be an access issue, but I am able to access another report within the same folder.
    com.crystaldecisions.sdk.occa.report.lib.ReportSDKServerException : The system cannot find the path specified.---- Error code:-2147467259 Error code name:failed
    at com.crystaldecisions.sdk.occa.report.application.PrintOutputController.export(Unknown Source)
    at com.crystaldecisions.sdk.occa.report.application.PrintOutputController.export(Unknown Source)
    Any suggestions would be very helpful.
    Thanks in Advance,
    Bharath

     Apply a trace on RAS and look for errors in the RAS logs at the same timestamp.
     These errors occur for various reasons, and tracing RAS would give us a better idea.
     It might happen that you observe these errors for reports which take longer than the RAS timeout to export.
     After getting the RAS logs, you can try the steps below if we see timeout errors in them.
     Copy the clientSDKOptions.xml file from <BO install path>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\lib to the machine where your app is running. Open clientSDKOptions.xml and change CORBARequestTimeOut from 600000 to a value large enough to allow the report to render, for instance 1200000. The default value is 600000 (10 minutes).
     Specify the location of the clientSDKOptions.xml file at run time. In your JSP or Java files, use the Java method setProperty from the System class. Set the system property indicated by the ras.config key to the directory, e.g. System.setProperty("ras.config", "c:\\temp"). This call specifies that the clientSDKOptions.xml file in c:\temp is to be used for locating RAS servers.
    Thanks,
    Prithvi
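     For reference, a minimal illustration of that ras.config call (the directory is only an example; note the escaped backslashes required in the Java string literal):

     public class RasConfigLocation {
         public static void main(String[] args) {
             // Point the SDK at the folder containing your edited clientSDKOptions.xml.
             // "c:\\temp" is just an example location.
             System.setProperty("ras.config", "c:\\temp");
         }
     }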
