Problems with Date bind variable

I'm trying to use a Date as a bind variable in a View Object, and I'm running into difficulties. I've got the bind variable type set to Date, and the control hints for Format Type and Format set to "Simple Date" and "yyyy-MM-dd" respectively. When I run the application module tester and enter the date in the correct format, the View Object returns the expected results. However, when I try to set a Query Bind Parameters property for a JHeadstart group in my application definition, in order to set the bind variable's value from a dateField input control, I get the following exception:
30-Jan 08:49:56 ERROR (ErrorReportingUtils) -java.lang.IllegalArgumentException
at java.sql.Date.valueOf(Date.java:104)
at oracle.jbo.domain.Date.toDate(Date.java:348)
at oracle.jbo.domain.Date.<init>(Date.java:279)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at oracle.jbo.domain.TypeConvMapEntry.convert(TypeConvMapEntry.java:73)
at oracle.jbo.domain.TypeFactory.get(TypeFactory.java:739)
at oracle.jbo.domain.TypeFactory.getInstance(TypeFactory.java:90)
at oracle.jbo.common.VariableImpl.convertToJava(VariableImpl.java:546)
at oracle.jbo.common.VariableValueManagerImpl.doSetVariableValue(VariableValueManagerImpl.java:182)
at oracle.jbo.common.VariableValueManagerImpl.setVariableValue(VariableValueManagerImpl.java:223)
at oracle.jbo.common.VariableValueManagerImpl.setVariableValue(VariableValueManagerImpl.java:229)
at oracle.jheadstart.model.adfbc.v2.JhsApplicationModuleImpl.applyBindParams(JhsApplicationModuleImpl.java:173)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:585)
at oracle.adf.model.binding.DCInvokeMethod.invokeMethod(DCInvokeMethod.java:507)
at oracle.adf.model.binding.DCDataControl.invokeMethod(DCDataControl.java:1795)
at oracle.adf.model.bc4j.DCJboDataControl.invokeMethod(DCJboDataControl.java:1989)
at oracle.adf.model.binding.DCInvokeMethod.callMethod(DCInvokeMethod.java:219)
at oracle.jbo.uicli.binding.JUCtrlActionBinding.doIt(JUCtrlActionBinding.java:1289)
at oracle.adf.model.binding.DCDataControl.invokeOperation(DCDataControl.java:1802)
at oracle.jbo.uicli.binding.JUCtrlActionBinding.invoke(JUCtrlActionBinding.java:627)
at oracle.adf.model.binding.DCInvokeActionDef$DCInvokeAction.refresh(DCInvokeActionDef.java:140)
at oracle.adf.model.binding.DCBindingContainer.internalRefreshControl(DCBindingContainer.java:2521)
at oracle.adf.model.binding.DCBindingContainer.refresh(DCBindingContainer.java:2260)
at oracle.adf.controller.v2.lifecycle.PageLifecycleImpl.prepareRender(PageLifecycleImpl.java:534)
at oracle.adf.controller.faces.lifecycle.FacesPageLifecycle.prepareRender(FacesPageLifecycle.java:98)
at oracle.jheadstart.controller.jsf.lifecycle.JhsPageLifecycle.prepareRender(JhsPageLifecycle.java:1155)
at oracle.adf.controller.v2.lifecycle.Lifecycle$1.execute(Lifecycle.java:297)
at oracle.adf.controller.v2.lifecycle.Lifecycle.executePhase(Lifecycle.java:116)
at oracle.adf.controller.faces.lifecycle.ADFPhaseListener.mav$executePhase(ADFPhaseListener.java:29)
at oracle.adf.controller.faces.lifecycle.ADFPhaseListener$1.before(ADFPhaseListener.java:426)
at oracle.adf.controller.faces.lifecycle.ADFPhaseListener.beforePhase(ADFPhaseListener.java:77)
at com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:228)
at com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:137)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:214)
at com.evermind.server.http.ResourceFilterChain.doFilter(ResourceFilterChain.java:65)
at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:162)
at com.evermind.server.http.EvermindFilterChain.doFilter(EvermindFilterChain.java:15)
at oracle.adfinternal.view.faces.webapp.AdfFacesFilterImpl._invokeDoFilter(AdfFacesFilterImpl.java:228)
at oracle.adfinternal.view.faces.webapp.AdfFacesFilterImpl._doFilterImpl(AdfFacesFilterImpl.java:197)
at oracle.adfinternal.view.faces.webapp.AdfFacesFilterImpl.doFilter(AdfFacesFilterImpl.java:123)
at oracle.adf.view.faces.webapp.AdfFacesFilter.doFilter(AdfFacesFilter.java:103)
at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:621)
at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:370)
at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:871)
at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:453)
at com.evermind.server.http.HttpRequestHandler.serveOneRequest(HttpRequestHandler.java:221)
at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:122)
at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:111)
at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
at java.lang.Thread.run(Thread.java:595)
However, it seems that the value of the bind parameter did in fact change, according to the debug message issued by the JhsApplicationModuleImpl.applyBindParams(...) method:
30-Jan 08:46:23 DEBUG (JhsApplicationModuleImpl) -ViewObject ViewObj: value of bind param move_date has changed: old value=null ,new value=Wed Jan 07 00:00:00 CST 2009
One thing I have noticed that may be of interest: if I step into the JhsApplicationModuleImpl.applyBindParams(...) method with the debugger, the default date in my bind variable shows up as an oracle.jbo.domain.Date object when it comes out of the following line:
Object oldValue = vo.ensureVariableManager().getVariableValue(key);
as opposed to the new value coming from the dateField control's input, which shows up as a java.util.Date in the following line:
Object value = args.get(key);
In the chain of calls leading up to the following frames in the stack trace:
               at java.sql.Date.valueOf(Date.java:104)
at oracle.jbo.domain.Date.toDate(Date.java:348)
at oracle.jbo.domain.Date.<init>(Date.java:279)
it looks like the value of the java.util.Date is converted to a string and then passed to these methods to be turned into a java.sql.Date object. However, the java.sql.Date.valueOf(...) method expects the date in "yyyy-mm-dd" format, per its header comments:
* Converts a string in JDBC date escape format to
* a <code>Date</code> value.
* @param s a <code>String</code> object representing a date in
* in the format "yyyy-mm-dd"
* @return a <code>java.sql.Date</code> object representing the
* given date
* @throws IllegalArgumentException if the date given is not in the
* JDBC date escape format (yyyy-mm-dd)
Because the input value from the dateField control arrives as a java.util.Date, its toString() produces the wrong date format, as can be seen in the debug output above showing the bind variable's new value.
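To illustrate the mismatch, here is a small standalone snippet (not taken from the application code, just a demonstration of the valueOf() behaviour):
public class DateValueOfDemo {
    public static void main(String[] args) {
        java.util.Date d = new java.util.Date();
        System.out.println(d);                                    // e.g. Wed Jan 07 00:00:00 CST 2009
        System.out.println(java.sql.Date.valueOf("2009-01-07"));  // OK: JDBC escape format yyyy-mm-dd
        java.sql.Date.valueOf(d.toString());                      // throws IllegalArgumentException
    }
}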
Has anyone done this before who can let me know what I am missing and how to fix it? I've searched all over the web, and all the articles I've seen relating to bind variables only cover numeric or string types, which aren't giving me any trouble. Even the Developer's Guide article "5.9 Using Named Bind Variables" (http://database.in2p3.fr/doc/oracle/Oracle_Application_Server_10_Release_3/web.1013/b25947/bcquerying009.htm) doesn't mention using a Date type.

I had the same problem about a year ago (using JHeadstart 10.1.3.2.52):
15-feb 14:10:31 DEBUG (JhsApplicationModuleImpl) -Search item matches query bind param SubjectenKernPeildatum, value set to 2000-01-01
15-feb 14:10:33 DEBUG (JhsApplicationModuleImpl) -Executing applyBindParams for BasServiceKern.AdresRollenSubjecten
15-feb 14:10:33 DEBUG (JhsApplicationModuleImpl) -ViewObject AdresRollenSubjecten: value of bind param peildatum has changed: old value=null ,new value=Sat Jan 01 00:00:00 CET 2000
15-feb 14:10:33 ERROR (ErrorReportingUtils) -java.lang.IllegalArgumentException
at java.sql.Date.valueOf(Date.java:104)
at oracle.jbo.domain.Date.toDate(Date.java:348)
It appears a java.util.Date gets put on the criteria Map in the searchBean, and this can't be used as input for an oracle.jbo.domain.Date bind variable.
My ugly solution was to override public void applyBindParams(String voUsage,HashMap args)
in MyProjectApplicationModuleImp (which extends JhsApplicationModuleImpl of course),
check if value is a java.util.Date, and then
value = new java.sql.Date(((java.util.Date)value).getTime());
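A minimal sketch of that override (the method signature and the conversion line are taken from the description above; the class skeleton, imports, and the loop over the args map are assumptions and may need adjusting for your JHeadstart version):
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import oracle.jheadstart.model.adfbc.v2.JhsApplicationModuleImpl;

public class MyProjectApplicationModuleImpl extends JhsApplicationModuleImpl {

    public void applyBindParams(String voUsage, HashMap args) {
        // Replace any java.util.Date values with java.sql.Date so the superclass
        // no longer relies on toString() to build an oracle.jbo.domain.Date.
        for (Iterator it = args.entrySet().iterator(); it.hasNext();) {
            Map.Entry entry = (Map.Entry) it.next();
            Object value = entry.getValue();
            if (value instanceof java.util.Date && !(value instanceof java.sql.Date)) {
                entry.setValue(new java.sql.Date(((java.util.Date) value).getTime()));
            }
        }
        super.applyBindParams(voUsage, args);
    }
}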
Greetings, HJ

Similar Messages

  • User Report data bind variable problems

    Hello,
    I am trying to build a simple report in SQLDeveloper, using a DATE bind variable. But it fails with the error message:
    Inconsistent datatype, expected DATE got NUMBER
    SELECT * FROM FLOW_COMP_REC_SUMMARY_RU
    where create_date = trunc(:TARGET_DATE)
    and type = 'D'
    ORDER BY MESSAGE_FLOW_ID
    Running this query in SQLDeveloper worksheet executes properly after a dialog box pops up and I enter: current_date - 1
    SELECT * FROM FLOW_COMP_REC_SUMMARY_RU
    where create_date = trunc(&TARGET_DATE)
    and type = 'D'
    ORDER BY MESSAGE_FLOW_ID
    I've searched but no clear solutions present themselves. If you have one it would be most appreciated.
    Thanks

    You'll have to convert the input to a date yourself:
    SELECT * FROM FLOW_COMP_REC_SUMMARY_RU
    where create_date = TO_DATE(:TARGET_DATE, 'DD/MM/YYYY')
    and type = 'D'
    ORDER BY MESSAGE_FLOW_ID
    Have fun,
    K.

  • How do I refresh a table with a bind variable using a return listener?

    JDev 11.1.2.1.0.
    I am trying to refresh a table with a bind variable after a record is added.
    The main page has a button which, on click, calls a task flow as an inline document. This popup task flow allows the user to insert a record. It has its own transaction and does not share data controls.
    Upon task flow return, the calling button's return dialog listener is invoked which should allow the user to see the newly created item in the table. The returnListener code:
        // retrieve the bind variable and clear it of any values used to filter the table results
        BindingContainer bindings = ADFUtils.getBindings();
        AttributeBinding attr = (AttributeBinding)bindings.getControlBinding("pBpKey");
        attr.setInputValue("");
        // execute the table so it returns all rows
        OperationBinding operationBinding = bindings.getOperationBinding("ExecuteWithParams");
        operationBinding.execute();
        // set the table's iterator to the newly created row
        DCIteratorBinding iter = (DCIteratorBinding) bindings.get("AllCustomersIterator");
        Object customerId = AdfFacesContext.getCurrentInstance().getPageFlowScope().get("newCustomerId");
        iter.setCurrentRowWithKeyValue((String)customerId);
        // refresh the page
        AdfFacesContext.getCurrentInstance().addPartialTarget(this.getFilterText());
        AdfFacesContext.getCurrentInstance().addPartialTarget(this.getCustomerTable());
    But the table does not refresh ... The bind variable's inputText component is empty. The table flickers as if it updates, but no new values are displayed, just the ones that were previously filtered or shown.
    I can do the EXACT SAME code in a button's actionListener that I click manually and the table will refresh fine. I'm really confused and have spent almost all day on this problem.
    Will

    Both options invoke the create new record task flow. The first method runs the "reset" code shown above through the calling button's returnListener once the task flow is complete. The second method is simply a button which, after the new record is added and the task flow returns, runs the "reset" code by my clicking it manually.
    I'm thinking that the returnListener code runs before some kind of automatic PPR happens on the table. I think this because the table contents flicker to show all customers (like I intend) but then go back to displaying the restricted contents a split second later.
    Yes, the table is in the page that invokes the taskflow.
    Here are some pictures:
    http://williverstravels.com/JDev/Forums/Threads/2337410/Step1.jpg
    http://williverstravels.com/JDev/Forums/Threads/2337410/Step2.jpg
    http://williverstravels.com/JDev/Forums/Threads/2337410/Step3.jpg
    Step1 - invoke new record task flow
    Step2 - enter data and click Finish
    Step3 - bind parameter / table filter cleared. Table flickers with all values. Table reverts to previously filtered values.

  • Having a problem with dates when I send my Numbers doc to Excel. The dates are all out, and the recipients have to cut and paste individual entries onto their spreadsheet. Any idea how I can prevent this?

    Having a problem with dates when I send my Numbers doc to Excel. The dates are all out, and the recipients have to cut and paste individual entries onto their spreadsheet. Any idea how I can prevent this?
    I'm using Lion on an MBP, and Numbers is the latest version.

    Could you give more details about what is wrong with your dates?
    M…oSoft products aren't allowed on my machines, but I use LibreOffice, which is a clone of Office.
    When I export from Numbers to Excel and open the result with LibreOffice, the dates are treated correctly.
    To be precise, dates after 01/01/1904 are treated correctly. Dates before 01/01/1904 are exported as strings but, as this is flagged during the export process, it's not surprising.
    Yvan KOENIG (VALLAURIS, France) mardi 3 janvier 2012
    iMac 21”5, i7, 2.8 GHz, 12 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.2

  • Problem with date format when ask prompt web-intelligence

    BO XI R2 with SP5, installed on Windows 2003 with Russian support.
    Inside BO, all labels and buttons use Russian. But when I invoke a web report and the prompt appears, there is a problem with the date format.
    It looks like a Korean date format, 'jj.nn.aaa H:mm:ss'. I have checked the system date settings in Windows and everything looks right.
    What do I have to do?
    Where can I change the date format for BO?


  • Problem with Date format

    Got one more problem, Marilyn and Radhakrishnan...
    Regarding the solution you provided me earlier in the thread "Problem with date format"...
    What is happening is: I am able to change the 2400 to 0000, but when it is changed from 2400 on Jan 1st to 0000, the hour changes but not the date; the date still remains Jan 1st instead of Jan 2nd.
    E.g.: Jan 1st 2400 -- changed to -- Jan 1st 0000
    instead of Jan 2nd 0000
    Could you please help me with this issue?
    Thanks,
    GK

    GK, try this...
    decode(instr(packagename.functionname(param1, param2), '2400'),
           0,
           to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'),
                                   'Month dd, yyyy "at" hh24mi'),
                           'mm/dd/yyyy hh24mi'),
                   'mm/dd/yyyy hh24mi'),
           to_date(to_char(to_date(rtrim(packagename.functionname(param1, param2), '(PT)'),
                                   'Month dd, yyyy "at" "2400"') + 1,
                           'mm/dd/yyyy "0000"'),
                   'mm/dd/yyyy "0000"'))
    -Marilyn

  • Problem with date calculation

    I am a rookie in SAP and I have a small problem with dates. Do we have any function module to find out the first day of the month if we give it the current system date? Please help me out.

    Hi,
    As Ganesan said, you can do it that way.
    Here is the sample code.
    data v type sy-datum.
    data d type DTRESR-WEEKDAY.
    " build the first day of the current month in v
    v+6(2) = '01'.
    v+4(2) = sy-datum+4(2).
    v+0(4) = sy-datum+0(4).
    " get the weekday for that date
    CALL FUNCTION 'DATE_TO_DAY'
      EXPORTING
        date    = v
      IMPORTING
        weekday = d.
    write d.

  • Problem with date format from Oracle DB

    Hi,
    I am facing a problem with date fields from Oracle DB sources. The date format of the field in DB table is 'Date base type is DATE and DDIC type is DATS'.
    I mapped the date fields to date characteristics in BI. Now the data that comes to the PSA is in a weird format; it shows up like -0.PR.09-A.
    I have tried changing the field settings in the DataSource to internal and external, and I have also tried mapping these date fields to text fields, without luck. All deliver the same format.
    I have also tried using conversion routines like CONVERSION_EXIT_IDATE_INPUT to change the format. That also gives me the same result.
    If anybody of you have any suggestions or if anybody have you experienced such probelms, Please share your experience with me.
    Thanks in advance.
    Regards
    Varada

    Thanks for all your replies. The only solutions I can see involve creating a view in the database; I want a solution that can be done in BI. I would appreciate it if any of you have ideas on this.
    The issue again in detail:
    I am facing an issue with date fields from Oracle data. The data that is sent from Oracle is in the format -0.AR.04-M. I am able to convert this date in BI with a conversion routine into the format 04-MAR-0.
    The problem is, I am getting data of length 10 (output format) in the format -0.AR.04-M, where the month is not numeric. Since it is text, it takes one character more.
    I have tried converting it in different ways and increased the length in BI; the result is the same. I am wondering if we can change the date format in the database.
    I am puzzled by this date format. I have checked date fields from other Oracle DB connections in BI; they get data in the format 20.081.031, which can be converted in BI. Only the system I am working with is causing a problem.
    Regards
    Varada

  • Problem with date format dd/mm/yyyy. But I need to convert to yyyy-mm-dd.

    Dear friends,
    I have a problem with the date format. I am receiving dates in the format dd/mm/yyyy, but I can upload to MySQL only in the format yyyy-mm-dd.
    How should I handle this situation? I've written the code lines below, but I have some problems with them. Please help me solve this.
    String pattern = "yyyy-mm-dd";
    SimpleDateFormat format = new SimpleDateFormat(pattern);
    try {
    Date date = format.parse("2006-02-12");
    System.out.println(date);
    } catch (ParseException e) {
    e.printStackTrace();
    System.out.println(format.format(new Date()));
    This output gives me Tue Apr 03 00:00:00 IST 2007.
    But I need the date format in yyyy-mm-dd.
    regards,
    maza
    thanks in advance.

    Thanks Dear BalusC,
    I tried with this,
    rs.getString("DATA_SCAD1")// where the source from .xls files
    String pattern = "yyyy-MM-dd";
    SimpleDateFormat format = new SimpleDateFormat(pattern);
    try {
    Date date = format.parse("DATA_SCAD1");
    System.out.println(date);
    } catch (ParseException e) {
    e.printStackTrace();
    System.out.println(format.format(new Date()));
    This output gives me Tue Apr 03 00:00:00 IST 2007.
    But I want to display the date in yyyy-mm-dd format.
    regards,
    maza
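    For reference, a minimal sketch of the conversion being asked for (the sample input value is made up): parse with the format you receive (dd/MM/yyyy, uppercase MM for months) and format with the pattern MySQL expects (yyyy-MM-dd).
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class DateConversionDemo {
        public static void main(String[] args) throws Exception {
            SimpleDateFormat in = new SimpleDateFormat("dd/MM/yyyy");   // format received
            SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd");  // format MySQL expects
            Date date = in.parse("12/02/2006");       // sample value: 12 Feb 2006
            System.out.println(out.format(date));     // prints 2006-02-12
        }
    }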

  • Tp ended with error code 0247 - addtobuffer has problems with data- and/or

    Hello Experts,
    If you give some idea, it will be greatly appreciated.
    This transport issue started after a power outage caused a hard shutdown of the SAP system.
    Then we brought the system back up. Before that, we did not have this transport issue.
    Our TMS landscape is:
    DEV - QA - PRD
    SED - SEQ - SEP
    DEV has the TMS domain controller.
    FYI:
    At the OS level, when we run the scp command as the root user, it works fine for any TR.
    In STMS, while adding the TR in SEQ (the QA system), we get an error like this.
    Error:
    Transport control program tp ended with error code 0247
         Message no. XT200
    Diagnosis
         An error occurred when executing a tp command.
           Command:        ADDTOBUFFER SEDK906339 SEQ client010 pf=/us
           Return code:    0247
           Error text:     addtobuffer has problems with data- and/or
           Request:        SEDK906339
    System Response
         The function terminates.
    Procedure
         Correct the error and execute the command again if necessary.
    This is tp version 372.04.71 (release 700, unicode enabled)
    Addtobuffer failed for SEDK906339.
      Neither datafile nor cofile exist (cofile may also be corrupted).
    standard output from tp and from tools called by tp:
    tp returncode summary:
    TOOLS: Highest return code of single steps was: 0
    ERRORS: Highest tp internal error was: 0247

    When we do scp using SM69 with
    SEDADM@DEVSYS:/usr/sap/trans/cofiles/K906339.SED SEQADM@QASYS:/usr/sap/trans/cofiles/.
    it throws an error like the one below:
    Host key verification failed.
    External program terminated with exit code 1
    Thanks
    Praba

  • I have one problem with Data Guard. My archive log files are not applied.

    I have a problem with Data Guard: my archive log files are not applied, even though I have received all the archive log files on my physical standby database.
    I created a physical standby database on Oracle 10gR2 (Windows XP Professional). The primary database is on another computer.
    In Enterprise Manager on the primary database it looks OK; I get the message “Data Guard status Normal”.
    But, as I wrote above, the archive log files are not applied.
    After I created the physical standby database, I also did the following:
    1. I connected to the Physical Standby database instance.
    CONNECT SYS/SYS@luda AS SYSDBA
    2. I started the Oracle instance at the Physical Standby database without mounting the database.
    STARTUP NOMOUNT PFILE=C:\oracle\product\10.2.0\db_1\database\initluda.ora
    3. I mounted the Physical Standby database:
    ALTER DATABASE MOUNT STANDBY DATABASE
    4. I started redo apply on Physical Standby database
    alter database recover managed standby database disconnect from session
    5. I switched the log files on Physical Standby database
    alter system switch logfile
    6. I verified the redo data was received and archived on Physical Standby database
    select sequence#, first_time, next_time from v$archived_log order by sequence#
    SEQUENCE# FIRST_TIME NEXT_TIME
    3 2006-06-27 2006-06-27
    4 2006-06-27 2006-06-27
    5 2006-06-27 2006-06-27
    6 2006-06-27 2006-06-27
    7 2006-06-27 2006-06-27
    8 2006-06-27 2006-06-27
    7. I verified the archived redo log files were applied on Physical Standby database
    select sequence#,applied from v$archived_log;
    SEQUENCE# APP
    4 NO
    3 NO
    5 NO
    6 NO
    7 NO
    8 NO
    8. on Physical Standby database
    select * from v$archive_gap;
    No rows
    9. on Physical Standby database
    SELECT MESSAGE FROM V$DATAGUARD_STATUS;
    MESSAGE
    ARC0: Archival started
    ARC1: Archival started
    ARC2: Archival started
    ARC3: Archival started
    ARC4: Archival started
    ARC5: Archival started
    ARC6: Archival started
    ARC7: Archival started
    ARC8: Archival started
    ARC9: Archival started
    ARCa: Archival started
    ARCb: Archival started
    ARCc: Archival started
    ARCd: Archival started
    ARCe: Archival started
    ARCf: Archival started
    ARCg: Archival started
    ARCh: Archival started
    ARCi: Archival started
    ARCj: Archival started
    ARCk: Archival started
    ARCl: Archival started
    ARCm: Archival started
    ARCn: Archival started
    ARCo: Archival started
    ARCp: Archival started
    ARCq: Archival started
    ARCr: Archival started
    ARCs: Archival started
    ARCt: Archival started
    ARC0: Becoming the 'no FAL' ARCH
    ARC0: Becoming the 'no SRL' ARCH
    ARC1: Becoming the heartbeat ARCH
    Attempt to start background Managed Standby Recovery process
    MRP0: Background Managed Standby Recovery process started
    Managed Standby Recovery not using Real Time Apply
    MRP0: Background Media Recovery terminated with error 1110
    MRP0: Background Media Recovery process shutdown
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[1]: Assigned to RFS process 2148
    RFS[1]: Identified database type as 'physical standby'
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[2]: Assigned to RFS process 2384
    RFS[2]: Identified database type as 'physical standby'
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[3]: Assigned to RFS process 3188
    RFS[3]: Identified database type as 'physical standby'
    Primary database is in MAXIMUM PERFORMANCE mode
    Primary database is in MAXIMUM PERFORMANCE mode
    RFS[3]: No standby redo logfiles created
    Redo Shipping Client Connected as PUBLIC
    -- Connected User is Valid
    RFS[4]: Assigned to RFS process 3168
    RFS[4]: Identified database type as 'physical standby'
    RFS[4]: No standby redo logfiles created
    Primary database is in MAXIMUM PERFORMANCE mode
    RFS[3]: No standby redo logfiles created
    10. on Physical Standby database
    SELECT PROCESS, STATUS, THREAD#, SEQUENCE#, BLOCK#, BLOCKS FROM V$MANAGED_STANDBY;
    PROCESS STATUS THREAD# SEQUENCE# BLOCK# BLOCKS
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    ARCH CONNECTED 0 0 0 0
    RFS IDLE 0 0 0 0
    RFS IDLE 0 0 0 0
    RFS IDLE 1 9 13664 2
    RFS IDLE 0 0 0 0
    11) on Primary database:
    select message from v$dataguard_status;
    MESSAGE
    ARC0: Archival started
    ARC1: Archival started
    ARC2: Archival started
    ARC3: Archival started
    ARC4: Archival started
    ARC5: Archival started
    ARC6: Archival started
    ARC7: Archival started
    ARC8: Archival started
    ARC9: Archival started
    ARCa: Archival started
    ARCb: Archival started
    ARCc: Archival started
    ARCd: Archival started
    ARCe: Archival started
    ARCf: Archival started
    ARCg: Archival started
    ARCh: Archival started
    ARCi: Archival started
    ARCj: Archival started
    ARCk: Archival started
    ARCl: Archival started
    ARCm: Archival started
    ARCn: Archival started
    ARCo: Archival started
    ARCp: Archival started
    ARCq: Archival started
    ARCr: Archival started
    ARCs: Archival started
    ARCt: Archival started
    ARCm: Becoming the 'no FAL' ARCH
    ARCm: Becoming the 'no SRL' ARCH
    ARCd: Becoming the heartbeat ARCH
    Error 1034 received logging on to the standby
    Error 1034 received logging on to the standby
    LGWR: Error 1034 creating archivelog file 'luda'
    LNS: Failed to archive log 3 thread 1 sequence 7 (1034)
    FAL[server, ARCh]: Error 1034 creating remote archivelog file 'luda'
    12) on primary db
    select name,sequence#,applied from v$archived_log;
    NAME SEQUENCE# APP
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00003_0594204176.001 3 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00004_0594204176.001 4 NO
    Luda 4 NO
    Luda 3 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00005_0594204176.001 5 NO
    Luda 5 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00006_0594204176.001 6 NO
    Luda 6 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00007_0594204176.001 7 NO
    Luda 7 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\IRINA\ARC00008_0594204176.001 8 NO
    Luda 8 NO
    13) on standby db
    select name,sequence#,applied from v$archived_log;
    NAME SEQUENCE# APP
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00004_0594204176.001 4 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00003_0594204176.001 3 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00005_0594204176.001 5 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00006_0594204176.001 6 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00007_0594204176.001 7 NO
    C:\ORACLE\PRODUCT\10.2.0\ORADATA\LUDA\ARC00008_0594204176.001 8 NO
    14) my init.ora files
    On standby db
    irina.__db_cache_size=79691776
    irina.__java_pool_size=4194304
    irina.__large_pool_size=4194304
    irina.__shared_pool_size=75497472
    irina.__streams_pool_size=0
    *.audit_file_dest='C:\oracle\product\10.2.0\admin\luda\adump'
    *.background_dump_dest='C:\oracle\product\10.2.0\admin\luda\bdump'
    *.compatible='10.2.0.1.0'
    *.control_files='C:\oracle\product\10.2.0\oradata\luda\luda.ctl'
    *.core_dump_dest='C:\oracle\product\10.2.0\admin\luda\cdump'
    *.db_block_size=8192
    *.db_domain=''
    *.db_file_multiblock_read_count=16
    *.db_file_name_convert='luda','irina'
    *.db_name='irina'
    *.db_unique_name='luda'
    *.db_recovery_file_dest='C:\oracle\product\10.2.0\flash_recovery_area'
    *.db_recovery_file_dest_size=2147483648
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
    *.fal_client='luda'
    *.fal_server='irina'
    *.job_queue_processes=10
    *.log_archive_config='DG_CONFIG=(irina,luda)'
    *.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/luda/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=luda'
    *.log_archive_dest_2='SERVICE=irina LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=irina'
    *.log_archive_dest_state_1='ENABLE'
    *.log_archive_dest_state_2='ENABLE'
    *.log_archive_max_processes=30
    *.log_file_name_convert='C:/oracle/product/10.2.0/oradata/irina/','C:/oracle/product/10.2.0/oradata/luda/'
    *.open_cursors=300
    *.pga_aggregate_target=16777216
    *.processes=150
    *.remote_login_passwordfile='EXCLUSIVE'
    *.sga_target=167772160
    *.standby_file_management='AUTO'
    *.undo_management='AUTO'
    *.undo_tablespace='UNDOTBS1'
    *.user_dump_dest='C:\oracle\product\10.2.0\admin\luda\udump'
    On primary db
    irina.__db_cache_size=79691776
    irina.__java_pool_size=4194304
    irina.__large_pool_size=4194304
    irina.__shared_pool_size=75497472
    irina.__streams_pool_size=0
    *.audit_file_dest='C:\oracle\product\10.2.0/admin/irina/adump'
    *.background_dump_dest='C:\oracle\product\10.2.0/admin/irina/bdump'
    *.compatible='10.2.0.1.0'
    *.control_files='C:\oracle\product\10.2.0\oradata\irina\control01.ctl','C:\oracle\product\10.2.0\oradata\irina\control02.ctl','C:\oracle\product\10.2.0\oradata\irina\control03.ctl'
    *.core_dump_dest='C:\oracle\product\10.2.0/admin/irina/cdump'
    *.db_block_size=8192
    *.db_domain=''
    *.db_file_multiblock_read_count=16
    *.db_file_name_convert='luda','irina'
    *.db_name='irina'
    *.db_recovery_file_dest='C:\oracle\product\10.2.0/flash_recovery_area'
    *.db_recovery_file_dest_size=2147483648
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=irinaXDB)'
    *.fal_client='irina'
    *.fal_server='luda'
    *.job_queue_processes=10
    *.log_archive_config='DG_CONFIG=(irina,luda)'
    *.log_archive_dest_1='LOCATION=C:/oracle/product/10.2.0/oradata/irina/ VALID_FOR=(ALL_LOGFILES, ALL_ROLES) DB_UNIQUE_NAME=irina'
    *.log_archive_dest_2='SERVICE=luda LGWR ASYNC VALID_FOR=(ONLINE_LOGFILES, PRIMARY_ROLE) DB_UNIQUE_NAME=luda'
    *.log_archive_dest_state_1='ENABLE'
    *.log_archive_dest_state_2='ENABLE'
    *.log_archive_max_processes=30
    *.log_file_name_convert='C:/oracle/product/10.2.0/oradata/luda/','C:/oracle/product/10.2.0/oradata/irina/'
    *.open_cursors=300
    *.pga_aggregate_target=16777216
    *.processes=150
    *.remote_login_passwordfile='EXCLUSIVE'
    *.sga_target=167772160
    *.standby_file_management='AUTO'
    *.undo_management='AUTO'
    *.undo_tablespace='UNDOTBS1'
    *.user_dump_dest='C:\oracle\product\10.2.0/admin/irina/udump'
    Please help me!!!!

    Hi,
    After several tries, my redo logs are applied now. I think in my case it had to do with the tnsnames.ora. At the moment I have both databases in both tnsnames.ora files using the SID and not the SERVICE_NAME.
    Now I want to use DGMGRL. Adding a configuration and a standby database works fine, but when I try to enable the configuration, DGMGRL gives no feedback and it looks like it is hanging. The log, however, says that it succeeded.
    In another session 'show configuration' results in the following, confirming that the enable succeeded.
    DGMGRL> show configuration
    Configuration
    Name: avhtest
    Enabled: YES
    Protection Mode: MaxPerformance
    Fast-Start Failover: DISABLED
    Databases:
    avhtest - Primary database
    avhtestls53 - Physical standby database
    Current status for "avhtest":
    Warning: ORA-16610: command 'ENABLE CONFIGURATION' in progress
    Is there anybody who has experienced the same problem and/or knows the solution to this?
    With kind regards,
    Martin Schaap

  • A problem with data acquisition in LV 7.1 (transition from Traditional NI-DAQ to NI-DAQmx)

    Hi to everyone,
    I have a problem with data acquisition in LV 7.1.
    I made the transition from Traditional NI-DAQ to NI-DAQmx in my LabVIEW application.
    The problem is that when I acquire data with Traditional NI-DAQ (just reading, without writing anywhere), there is no scan backlog. But when I acquire data in an application whose acquisition is based on DAQmx, the scan backlog indicator shows numbers from 20 to 50 for about 6 minutes, and then that number increases quite quickly until I get an error (unable to acquire data; the data was overwritten).
    Acquisition settings are the same in both cases. When I acquire with DAQmx I use global channels. Could the reason for this behaviour lie in how global channels process the data? It seems strange that it flows quite smoothly for about 6 minutes and then gets stuck.
    Best regards,
    Ero

    If you have an old DAQ unit it may not be compatible with DAQmx. Which DAQ unit do you have? I think NI has a list showing which DAQ driver you can use with your card.

  • Has anyone else had a problem with data? After updating my phone to the new update, I have all this data usage and keep getting notifications that my data usage is getting high. I never had this problem before.

    Has anyone else had a problem with data? After updating my phone to the new update, I have all this data usage and keep getting notifications that my data usage is getting high. I never had this problem before.

    Fair enough. No need for any unnecessary posts either. Your issue had already been addressed ad nauseam in this forum, had you bothered to search it (as forum etiquette would dictate) before posting.
    In any case, I hope that your problem was solved.

  • I have a problem with data transfer between Windows Server 2012RT and Windows 7 (no more than 14 kbps), while between Windows Server 2012RT and Windows 8.1 the speed is OK.

    I have a problem with data transfer between Windows Server 2012RT and Windows 7 (no more than 14 kbps), while between Windows Server 2012RT and Windows 8.1 the speed is OK.

    Hi,
    Regarding the issue here, please take a look at the below links to see if they could help:
    Slow data transfer speed in Windows 7 or in Windows Server 2008 R2
    And a blog here:
    Windows Server 2012 slow network/SMB/CIFS problem
    Hope this may help
    Best regards
    Michael

  • URGENT ! JDEV 10.1.2 Problem with data control generated from session bean

    I have a problem with a data control generated from a session bean that returns a collection of data transfer objects.
    The DTOs seem to be correct. The session bean loads the data correctly and the objects are full of data. Using the console to display the DTO content works fine.
    When I generate a data control from this session bean and associate the DTO included in the collection, only the first object level and one-to-one DTO objects are correctly set in the data control. Objects that represent a collection inside the DTO (a one-to-many foreign key) are set as a collection with an iterator, but the structure of the object is not set. I don't know how to associate this second level of collection with the DTO bean class to obtain the attribute definitions.
    I created a test case with the HR schema, like the hrApp demo application in the tutorial, with the departments and employees tables. I got the same problem.
    Is it a bug?
    Is there a workaround to force the data control to understand the collection data structure?
    Help is welcome! This is urgent!

    We fixed the problem by assigning the child DTO bean class to the node representing the iterator in the XML file corresponding to the master DTO.
