Assign Hotkeys with Oracle Forms 6i Triggers

I have a problem in Forms 6i.
How do I assign hotkeys with Oracle Forms triggers?
How do I assign the F12 key to a form trigger, and what is the process?
Can you explain briefly?

Hi User,
Kindly follow the link below, where I have summarized the usage of Oracle Terminal:
[Usage of Oracle Terminal|http://gskaushik.blogspot.com/2009/05/oracle-terminal-reference-forms-6i.html]
Kindly revert for any clarifications.
regards
Kaushik
Kindly mark the reply as Helpful if it helps, or as Correct if the issue is solved.
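In outline, the process is two steps: Oracle Terminal (editing the fmrusw.res resource file used by the Windows Forms runtime) maps the physical F12 key to one of the Forms function keys, and a form-level KEY- trigger supplies the behavior. A minimal sketch of such a trigger body, assuming F12 has been mapped to function key 9 so that the KEY-F9 trigger fires (the message text and navigation target are illustrative, not from the original post):
BEGIN
  MESSAGE('F12 pressed - navigating to the ORDERS block');
  GO_BLOCK('ORDERS');  -- Forms built-in: move the cursor to a named block
END;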

Similar Messages

  • Editable views with instead of triggers in Oracle 10G

    Hi,
    We are attempting to migrate our database application from Oracle 9i to Oracle 10G. We have found that we are getting the error [ORA-03113 end-of-file on communication channel] when we attempt to insert multiple rows into database tables using editable views equipped with INSTEAD OF triggers. When inserting multiple rows, we are using a single insert statement of the form insert into...select from. Strangely enough, when the select query within the insert statement returns a single row, the insert statement is successful. We also noticed that if we issue a commit right before executing the insert statement, the insert statement is successful regardless of the number of rows being inserted. But this is not a viable workaround as the current transaction contents needs to be maintained. We did not have these issues in Oracle 9i. Does Oracle 10G have any known issues with editable views equipped with INSTEAD OF triggers? Are there any settings in the INIT.ORA file that can resolve these issues? Thanks for your help.
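For reference, a minimal sketch of the kind of setup described above, with hypothetical names: an updatable view with an INSTEAD OF INSERT trigger, plus a multi-row insert ... select of the shape that raised the error.
CREATE TABLE emp_base (empno NUMBER PRIMARY KEY, ename VARCHAR2(30));
CREATE VIEW emp_view AS SELECT empno, ename FROM emp_base;
CREATE OR REPLACE TRIGGER emp_view_ins
INSTEAD OF INSERT ON emp_view
FOR EACH ROW
BEGIN
  -- Route the row into the base table; real triggers often touch several tables.
  INSERT INTO emp_base (empno, ename) VALUES (:NEW.empno, :NEW.ename);
END;
/
-- A multi-row statement of the form described in the question:
INSERT INTO emp_view (empno, ename)
  SELECT empno + 1000, ename FROM emp_base;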

    Confirmed. If you use 2D input, or change the Z dimensions so they are different, you will get the results you expect in 10g. Otherwise - a line.
    I'm not sure why, and can't see a bug report on that in support, but obviously it was recognized and fixed as you indicate.
    Try it with 2D input, and it will work:
select sdo_aggr_mbr(
SDO_GEOMETRY(2003, NULL, NULL, SDO_ELEM_INFO_ARRAY(1, 1003, 3),
SDO_ORDINATE_ARRAY(-10, -10, 10, 10)))
from dual;
    Or with different Z values:
select sdo_aggr_mbr(
SDO_GEOMETRY(3003, NULL, NULL, SDO_ELEM_INFO_ARRAY(1, 1003, 3),
SDO_ORDINATE_ARRAY(-10, -10, 1, 10, 10, 11)))
from dual;

  • Problem with Roles and Triggers

    I'm having a strange problem with Roles and Triggers in Oracle. It's a little difficult to describe, so bear with me...
    I'm trying to create a trigger that inserts records into a table belonging to a different user/owner. Of course, the owner of this trigger needs rights to insert records into this other table. I find that if I add these rights directly to the owner of the trigger, everything works okay and the trigger compiles successfully.
    However, if I first create a Role and grant the "insert" rights to it, and then assign this role to the owner of the trigger, the trigger does not compile successfully.
    To illustrate this, here's an example script. I'm using Oracle 10g Release 2...
    -- Clean up...
    DROP TABLE TestUser.TrigTable;
    DROP TABLE TestUser2.TestTable;
    DROP ROLE TestRole;
    DROP TRIGGER TestUser.TestTrigger;
    DROP USER TestUser CASCADE;
    DROP USER TestUser2 CASCADE;
    -- Create Users...
    CREATE USER TestUser IDENTIFIED BY password DEFAULT TABLESPACE "USERS" TEMPORARY TABLESPACE "TEMP" QUOTA UNLIMITED ON "USERS";
    CREATE USER TestUser2 IDENTIFIED BY password DEFAULT TABLESPACE "USERS" TEMPORARY TABLESPACE "TEMP" QUOTA UNLIMITED ON "USERS";
    CREATE TABLE TestUser.TrigTable (TestColumn VARCHAR2(40));
    CREATE TABLE TestUser2.TestTable (TestColumn VARCHAR2(40));
    -- Grant Insert rights on TestTable to TestRole...
    CREATE ROLE TestRole NOT IDENTIFIED;
    GRANT INSERT ON TestUser2.TestTable TO TestRole;
    -- Add TestRole to TestUser. TestUser should now have rights to INSERT on TestTable
    GRANT TestRole TO TestUser;
    ALTER USER TestUser DEFAULT ROLE ALL;
-- Now, create the trigger. This fails to compile...
CREATE TRIGGER TestUser.TestTrigger AFTER INSERT ON TestUser.TrigTable
BEGIN
INSERT INTO TestUser2.TestTable (TestColumn) VALUES ('Test');
END;
/
    When I do a "SHOW ERRORS;" after this, I get:
    SQL> show errors;
    Errors for TRIGGER TESTUSER.TESTTRIGGER:
    LINE/COL ERROR
    2/3 PL/SQL: SQL Statement ignored
    2/25 PL/SQL: ORA-00942: table or view does not exist
    SQL>
    As I said above, if I just add the Insert rights directly to TestUser, the trigger compiles perfectly. Does anyone know why this is happening?
    Thanks!
    Adrian

Hi Raghu,
If the insert rights exist only on TestRole, and TestRole is assigned to TestUser, I can run the INSERT statement you suggest with no problems if I just execute it from SQL*Plus (logged in as TestUser).
The question is: why does the same INSERT fail when it's inside the trigger?
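The usual explanation: Oracle disables roles inside definer's-rights stored PL/SQL (triggers, procedures, packages), so privileges obtained only through a role are invisible when the trigger is compiled - hence the ORA-00942 even though the same INSERT works interactively. A direct object grant, using the names from the script above, resolves it:
GRANT INSERT ON TestUser2.TestTable TO TestUser;
ALTER TRIGGER TestUser.TestTrigger COMPILE;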

  • Mapping LDAP Role in Building Your First Process with Oracle BPM 11g

I'm working on "Building Your First Process with Oracle BPM 11g". I'm at the end of the step where I assign a user for the requester. The problem is in the identity lookup: "Realm" is empty for Remote_WLServer.
The servers are up and running. The demo user community has been loaded - I can see the list of users and groups in the administration server under myrealm. We haven't done much since the SOA Suite 11g installation; I'm probably the first one who uses this. I wonder if we have a missing setup? Can you tell me what's missing? Appreciate your help in advance.

I get this error message when I click the gear icon:
"Server exception is : Connection refused from server"
Here is the result of testing the Remote_WLServer connection. Does this cause the issue?
    Testing JSR-160 Runtime ... failed.
    Cannot establish connection.
    Testing JSR-160 DomainRuntime ... skipped.
    Testing JSR-88 ... skipped.
    Testing JSR-88-LOCAL ... skipped.
    Testing JNDI ... skipped.
    Testing JSR-160 Edit ... skipped.
    Testing HTTP ... success.
    Testing Server MBeans Model ... skipped.
    Testing HTTP Authentication ... success.
    2 of 9 tests successful.
I have installed JDeveloper 9i, 10g, and 11g on my laptop. SOA is installed on Linux.

  • IMP-00017: failed with ORACLE error 942 (SYS.DBMS_REPUTIL.SYNC_UP_REP)

    Hi,
I am trying to export a table from an Oracle 9i DB and import it into an Oracle 11g DB.
exp USERID=user1/pass1@db9i TABLES=test FILE=test.dmp ROWS="y" statistics="none" GRANTS="n" INDEXES="n" TRIGGERS="n" CONSTRAINTS="n"
While trying to import I get an error:
IMP-00017: following statement failed with ORACLE error 942:
"BEGIN SYS.DBMS_REPUTIL.SYNC_UP_REP('CYGENT_ADMIN','AGENT_ROOT_MAP'); END;"
When I do
imp USERID=user2/pass2@db11g TABLES=test FILE=test.dmp FROMUSER=user1 TOUSER=user2 SHOW=Y
the last line I see is
"BEGIN SYS.DBMS_REPUTIL.SYNC_UP_REP('CYGENT_ADMIN','AGENT_ROOT_MAP'); END;"
Question: what do I need to do so that the last line (SYNC_UP_REP) is not included in the export dump?
I have tried searching for this but without much luck.
Appreciate the help.

    Explain your usage of Advanced Replication capabilities. Where, what, and why.

  • Urgent:IMP-00017: following statement failed with ORACLE error 3113

OS: HP-UX 11i
Oracle 9.2.0.3.0
When importing the tablespace metadata, the command is:
imp \'sys/sysadmin AS sysdba\' TRANSPORT_TABLESPACE=y DATAFILES='/oradata/data2/B.dbf' TABLESPACES=A FILE=A.dmp IGNORE=Y LOG=./log/import.log
The error occurs as follows:
    Export file created by EXPORT:V09.02.00 via conventional path
    About to import transportable tablespace(s) metadata...
    import done in US7ASCII character set and AL16UTF16 NCHAR character set
    import server uses WE8ISO8859P1 character set (possible charset conversion)
    . importing SYS's objects into SYS
    IMP-00017: following statement failed with ORACLE error 3113:
    "BEGIN sys.dbms_plugts.beginImport ('9.2.0.3.0',31,'2000',NULL,'NULL',3626"
    "7,35051,1); END;"
    IMP-00003: ORACLE error 3113 encountered
    ORA-03113: end-of-file on communication channel
    IMP-00000: Import terminated unsuccessfully
The export was done in the same environment; the successful output is below:
exp \'sys/sysadmin AS sysdba\' TRANSPORT_TABLESPACE=y TABLESPACES=A FILE=A.dmp LOG=./log/export.log
    log file:
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    Note: table data (rows) will not be exported
    About to export transportable tablespace metadata...
    For tablespace A ...
    . exporting cluster definitions
    . exporting table definitions
    . . exporting table A
    . exporting referential integrity constraints
    . exporting triggers
    . end transportable tablespace metadata export
    Export terminated successfully without warnings.
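A note on the logs above: the import reports "possible charset conversion" (the export was done in US7ASCII while the import server uses WE8ISO8859P1). It is worth confirming both databases' character sets before retrying; a standard check is:
SELECT parameter, value
FROM nls_database_parameters
WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET');
Setting NLS_LANG to match the database character set before running exp avoids the conversion; the follow-up below shows the export redone in WE8ISO8859P1.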

I did that, but I get the same error message.
    export message:
    Connected to: Oracle9i Enterprise Edition Release 9.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.3.0 - Production
    Export done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
    Note: table data (rows) will not be exported
    About to export transportable tablespace metadata...
    For tablespace A ...
    . exporting cluster definitions
    . exporting table definitions
    . . exporting table A
    . exporting referential integrity constraints
    . exporting triggers
    . end transportable tablespace metadata export
    Export terminated successfully without warnings.
    Import Message:
    Export file created by EXPORT:V09.02.00 via conventional path
    About to import transportable tablespace(s) metadata...
    import done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
    . importing SYS's objects into SYS
    IMP-00017: following statement failed with ORACLE error 3113:
    "BEGIN sys.dbms_plugts.beginImport ('9.2.0.3.0',31,'2000',NULL,'NULL',3627"
    "7,36451,1); END;"
    IMP-00003: ORACLE error 3113 encountered
    ORA-03113: end-of-file on communication channel
    IMP-00000: Import terminated unsuccessfully

  • E-Business Suite Integration with Oracle Business Intelligence

    Hi,
To integrate E-Business Suite with Oracle Business Intelligence I have done all the configurations as specified in Doc ID 552735.1 on Metalink. I have tested the configurations by logging in from EBS and navigating to OBIEE Presentation Services. I'm able to navigate with no errors, but not able to view Subject Areas in the Answers link. I'm testing this with the 'Administrator' user. When I log in directly to Presentation Services with the Administrator user, I'm able to see the Subject Areas.
Does anyone have an idea what the problem is?
    Thanks in Advance,

    Hi,
Think I spotted the problem - probably specific to OBIA 7.9.5.
There is no "Authorization" initialization block: in previous incarnations (e.g. OBIA 7.9.4 and earlier, I guess) there was a FndGetResp initialization block that assigned a value to the OBIEE "GROUP" system variable.
So I did the following, and now I'm getting Subject Areas and Answers, but Dashboards are still not working for me - maybe another issue.
1. Create system variable GROUP, temporarily with initialization block "Authentication"
2. Create initialization block "EBS Responsibility to Group" with this query (note I've modified it for the new session variable name: RESP_ID becomes OLTP_EBS_RESP_ID in 7.9.5):
SELECT RESPONSIBILITY_KEY
FROM FND_RESPONSIBILITY
WHERE RESPONSIBILITY_ID = 'valueof(NQ_SESSION.OLTP_EBS_RESP_ID)'
Set the execution precedence to include EBS Security Context and EBS Single Sign-On Integration.
NB: I had to create a dummy variable GROUP_TMP for the "OK" button to come up; it gets replaced with the real GROUP in the next steps.
3. Go back and edit system variable GROUP and change its initialization block to EBS Responsibility to Group
4. Go back and edit initialization block EBS Responsibility to Group and remove variable GROUP_TMP
5. Create an OBIEE group with name = responsibility key, e.g. for my "SBA Administrator" responsibility the key is "SBA_ADMIN_KEY", so I created a group named "SBA_ADMIN_KEY"
6. Make your new group (e.g. SBA_ADMIN_KEY) a member of the Administrators group
7. Copy across the rpd and restart sa / saw
Still awaiting a response from Oracle Support ... so no guarantees this is correct, and no guarantees that it won't break something else that Oracle suggests later!
    Regards,
    Gareth

  • Transaction timed out with oracle 9i

    Hi,
I am working with WebLogic 6.1.
Sometimes (very seldom) I have a problem which I don't understand.
The transaction stalls, and after some time I get a TimedOutException.
This time it happened in the findByPrimaryKey of a CMP entity bean.
I have this problem only with Oracle 9i; with Oracle 8i everything works fine.
I'm using the Oracle thin driver (I put oracle12.zip at the beginning of the WebLogic classpath).
Is this a problem with Oracle?
Does WLS 6.1 support Oracle 9i?
Thanks for any hints
Szymon
stack trace:
javax.ejb.FinderException: Problem in findByPrimaryKey while preparing or executing statement: 'weblogic.jdbc.rmi.SerialPreparedStatement@197155':
java.sql.SQLException: The transaction is no longer active (status = Marked rollback. [Reason=weblogic.transaction.internal.TimedOutException: Transaction timed out after 561 seconds
Xid=21550:aea95ccd7f28edb9(4655671),Status=Active,numRepliesOwedMe=0,numRepliesOwedOthers=0,seconds since begin=561,seconds left=30,activeThread=Thread[ExecuteThread: '14' for queue: 'default',5,Thread Group for Queue: 'default'],ServerResourceInfo[weblogic.jdbc.jts.Connection]=(state=started,assigned=none),SCInfo[mydomain+myserver]=(state=active),properties=({weblogic.jdbc=t3://172.16.0.28:7001}),OwnerTransactionManager=ServerTM[ServerCoordinatorDescriptor=(CoordinatorURL=myserver+172.16.0.28:7001+mydomain+, Resources={})],CoordinatorURL=myserver+172.16.0.28:7001+mydomain+)]). No further JDBC access is allowed within this transaction.
java.sql.SQLException: The transaction is no longer active (status = Marked rollback. [Reason=weblogic.transaction.internal.TimedOutException: Transaction timed out after 561 seconds
Xid=21550:aea95ccd7f28edb9(4655671),Status=Active,numRepliesOwedMe=0,numRepliesOwedOthers=0,seconds since begin=561,seconds left=30,activeThread=Thread[ExecuteThread: '14' for queue: 'default',5,Thread Group for Queue: 'default'],ServerResourceInfo[weblogic.jdbc.jts.Connection]=(state=started,assigned=none),SCInfo[mydomain+myserver]=(state=active),properties=({weblogic.jdbc=t3://172.16.0.28:7001}),OwnerTransactionManager=ServerTM[ServerCoordinatorDescriptor=(CoordinatorURL=myserver+172.16.0.28:7001+mydomain+, Resources={})],CoordinatorURL=myserver+172.16.0.28:7001+mydomain+)]). No further JDBC access is allowed within this transaction.
at weblogic.jdbc.jts.Connection.checkIfRolledBack(Connection.java:498)
at weblogic.jdbc.jts.Statement.setInt(Statement.java:606)
at weblogic.jdbc.rmi.internal.PreparedStatementImpl.setInt(PreparedStatementImpl.java:104)
at weblogic.jdbc.rmi.SerialPreparedStatement.setInt(SerialPreparedStatement.java:137)
at com.verdisoft.datasource.ejb.contact.Person_vjvtzf__WebLogic_CMP_RDBMS.ejbFindByPrimaryKey(Person_vjvtzf__WebLogic_CMP_RDBMS.java:1531)
at java.lang.reflect.Method.invoke(Native Method)
at weblogic.ejb20.cmp.rdbms.RDBMSPersistenceManager.findByPrimaryKey(RDBMSPersistenceManager.java:171)
at weblogic.ejb20.manager.BaseEntityManager.findByPrimaryKey(BaseEntityManager.java:435)
at weblogic.ejb20.manager.BaseEntityManager.localFindByPrimaryKey(BaseEntityManager.java:389)
at weblogic.ejb20.internal.EntityEJBLocalHome.findByPrimaryKey(EntityEJBLocalHome.java:266)
at com.verdisoft.datasource.ejb.contact.PersonBean_vjvtzf_LocalHomeImpl.findByPrimaryKey(PersonBean_vjvtzf_LocalHomeImpl.java:144)
at com.verdisoft.datasource.ejb.contact.PersonBean.ejbHomeGetByKey(PersonBean.java:504)
at com.verdisoft.datasource.ejb.contact.PersonBean_vjvtzf_LocalHomeImpl.getByKey(PersonBean_vjvtzf_LocalHomeImpl.java:297)
at com.verdisoft.datasource.ejb.EJBAdapterBean.getBeanAdapter(EJBAdapterBean.java:872)
at com.verdisoft.datasource.ejb.EJBAdapterBean.updateItemInternal(EJBAdapterBean.java:834)
at com.verdisoft.datasource.ejb.EJBAdapterBean.updateItem(EJBAdapterBean.java:808)
at com.verdisoft.datasource.ejb.EJBAdapterBean_s83q9a_EOImpl.updateItem(EJBAdapterBean_s83q9a_EOImpl.java:146)
at com.verdisoft.datasource.DataAdapterManagerBean.updateItem(DataAdapterManagerBean.java:144)
at com.verdisoft.datasource.DataAdapterManagerBean_plajw8_EOImpl.updateItem(DataAdapterManagerBean_plajw8_EOImpl.java:614)
at com.verdisoft.datasource.DataAdapterManagerBean_plajw8_EOImpl_WLSkel.invoke(Unknown Source)
at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:282)
at weblogic.rmi.cluster.ReplicaAwareServerRef.invoke(ReplicaAwareServerRef.java:97)
at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:231)
at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:21)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:144)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:129)
at com.verdisoft.datasource.ejb.contact.Person_vjvtzf__WebLogic_CMP_RDBMS.ejbFindByPrimaryKey(Person_vjvtzf__WebLogic_CMP_RDBMS.java:1715)
at java.lang.reflect.Method.invoke(Native Method)
at weblogic.ejb20.cmp.rdbms.RDBMSPersistenceManager.findByPrimaryKey(RDBMSPersistenceManager.java:171)
at weblogic.ejb20.manager.BaseEntityManager.findByPrimaryKey(BaseEntityManager.java:435)
at weblogic.ejb20.manager.BaseEntityManager.localFindByPrimaryKey(BaseEntityManager.java:389)
at weblogic.ejb20.internal.EntityEJBLocalHome.findByPrimaryKey(EntityEJBLocalHome.java:266)
at com.verdisoft.datasource.ejb.contact.PersonBean_vjvtzf_LocalHomeImpl.findByPrimaryKey(PersonBean_vjvtzf_LocalHomeImpl.java:144)
at com.verdisoft.datasource.ejb.contact.PersonBean.ejbHomeGetByKey(PersonBean.java:504)
at com.verdisoft.datasource.ejb.contact.PersonBean_vjvtzf_LocalHomeImpl.getByKey(PersonBean_vjvtzf_LocalHomeImpl.java:297)
at com.verdisoft.datasource.ejb.EJBAdapterBean.getBeanAdapter(EJBAdapterBean.java:872)
at com.verdisoft.datasource.ejb.EJBAdapterBean.updateItemInternal(EJBAdapterBean.java:834)
at com.verdisoft.datasource.ejb.EJBAdapterBean.updateItem(EJBAdapterBean.java:808)
at com.verdisoft.datasource.ejb.EJBAdapterBean_s83q9a_EOImpl.updateItem(EJBAdapterBean_s83q9a_EOImpl.java:146)
at com.verdisoft.datasource.DataAdapterManagerBean.updateItem(DataAdapterManagerBean.java:144)
at com.verdisoft.datasource.DataAdapterManagerBean_plajw8_EOImpl.updateItem(DataAdapterManagerBean_plajw8_EOImpl.java:614)
at com.verdisoft.datasource.DataAdapterManagerBean_plajw8_EOImpl_WLSkel.invoke(Unknown Source)
at weblogic.rmi.internal.BasicServerRef.invoke(BasicServerRef.java:282)
at weblogic.rmi.cluster.ReplicaAwareServerRef.invoke(ReplicaAwareServerRef.java:97)
at weblogic.rmi.internal.BasicServerRef.handleRequest(BasicServerRef.java:231)
at weblogic.rmi.internal.BasicExecuteRequest.execute(BasicExecuteRequest.java:21)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:144)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:129)
com.verdisoft.datasource.exception.DataNotFoundException: Object not found: datasource://private_addressbook/ejbcontact/john1025100151518/12770
at com.verdisoft.datasource.ejb.EJBAdapterBean.getBeanAdapter(EJBAdapterBean.java:876)
at com.verdisoft.datasource.ejb.EJBAdapterBean.updateItemInternal(EJBAdapterBean.java:834)
at com.verdisoft.datasource.ejb.EJBAdapterBean.updateItem(EJBAdapterBean.java:808)
at com.verdisoft.datasource.ejb.EJBAdapterBean_s83q9a_EOImpl.updateItem(EJBAdapterBean_s83q9a_EOImpl.java:146)
at com.verdisoft.datasource.DataAdapterManagerBean.updateItem(DataAdapterManagerBean.java:144)
at com.verdisoft.datasource.DataAdapterManagerBean_plajw8_EOImpl.updateItem(DataAdapterManagerBean_plajw8_EOImpl.java:614)
at com.verdisoft.datasource.DataAdapterManagerBean_plajw8_EOImpl_WLSkel.invoke(Unknown Source)
Irene Ho wrote:
Dear all,
When the java application program tries to insert the data (around 300 records), sometimes the error occurs as "EJB Exception: weblogic.transaction.internal.TimedOutException: Transaction timed out after 95 seconds".
When the error occurred, only one java application was running. Furthermore, I set the JTA --> Timeout Seconds to 30 in the WebLogic console. The configuration of the server is WebLogic 7.0 with SP2 and Oracle 8.1.7.
Anyone know what happened and how to resolve the problem? Is the error due to the entity bean, WebLogic, or Oracle?
Thanks a lot.
Irene

Or configuration. You can set your timeout to a different value. It is likely to be an Oracle problem. We don't do anything unnecessary to delay your application code. You may be able to check the jdbc log or jta log to see when the tx starts and how long it takes to progress. If you note a delay, you could take a server thread dump to see what WebLogic is doing. Typically it will be waiting for Oracle to respond from a jdbc call.
Joe
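For reference, the timeout Joe refers to is the domain-level JTA setting. In the config.xml of that WebLogic generation it is an attribute along these lines (a sketch only, with an illustrative value, not a verified fragment of any shipped file):
<Domain Name="mydomain">
  <JTA Name="mydomain" TimeoutSeconds="300"/>
</Domain>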

  • Regarding OBIEE Integration With Oracle EBS

    Hi All,
I have installed BIAPPS (7.9.6.1) (components installed are two 11g DBs (one OLTP and one OLAP) on RHEL 4, Oracle Application Server 10g, Informatica PowerCenter, DAC and OBIEE). After that I need to integrate OBIEE with Oracle EBS. I have installed EBS 11.5.10.2, upgraded its database to 10g R2, and applied the ATG 6 patch (prerequisites of the integration).
Steps which I have done for the integration:
I installed the Financials module in BIAPPS and made changes to the pre-built rpd, i.e. OracleBIAnalyticsApps.rpd, using the Administration tool. I created a data source for EBS and set it in the connection pool, and similarly in the data warehouse connection pool. When I test from the Administration tool (Init Block > EBS Security Context > Edit Data Source) and click on "Test", I get the error: NQ_SESSION.ICX_SESSION_COOKIE has no value definition.
I copied that rpd to the Linux box where my OBIEE server resides and changed the rpd name in NQSConfig.INI. I added ORACLE_HOME, TNS_ADMIN, etc. in the user.sh file, and added the same DSN names that I created on Windows to the odbc.ini file. But NQServer.log says:
[nQSError: 43059] Init block 'LAST_SYND_DS_YTD_QTD': Dynamic refresh of repository scope variables has failed.
[nQSError: 17001] Oracle Error code: 12154, message: ORA-12154: TNS:could not resolve the connect identifier specified
at OCI call OCIServerAttach.
[nQSError: 17014] Could not connect to Oracle database.
    Please let me know any solution for this.
    Thanks,
    Manikandan

hi Manikandan,
Please check the tns entries for the connection pool that you assigned for LAST_SYND_DS_YTD_QTD.
thanks,
saichand
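ORA-12154 means the connect identifier used by that connection pool has no matching entry in the tnsnames.ora the BI Server reads (the one under the TNS_ADMIN set in user.sh). A minimal sketch of such an entry, with hypothetical host and service names:
OLTP_EBS =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = ebs-db.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = ebsdb))
  )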

  • Upgrade of Database with Oracle Change Data Capture

    Hello,
    We are upgrading and moving our database to a different server.
    The move is from 10G R1 database on Solaris to 11G R2 on Linux.
    Our database uses Oracle Change Data Capture.
    What is the best way to perform this migration? Unlike in the approach below, ideally, it would be without any manual steps to drop and recreate CDC subscriptions, change tables, etc.
    Thanks.
    Considerations for Exporting and Importing Change Data Capture Objects
    http://docs.oracle.com/cd/B13789_01/server.101/b10736/cdc.htm#i1027532
Starting in Oracle Database 10g, Oracle Data Pump is the supported export and import utility for Change Data Capture. Change Data Capture change sources, change sets, change tables, and subscriptions are exported and imported by the Oracle Data Pump expdp and impdp commands with the following restrictions:
    After a Data Pump full database import operation completes for a database containing AutoLog Change Data Capture objects, the following steps must be performed to restore these objects:
    1. The publisher must manually drop the change tables with the SQL DROP TABLE command. This is needed because the tables are imported without the accompanying Change Data Capture metadata.
    2. The publisher must re-create the AutoLog change sources, change sets, and change tables using the appropriate DBMS_CDC_PUBLISH procedures.
    3. Subscribers must re-create their subscriptions to the AutoLog change sets.

Hello,
I opened an SR with Oracle Support; they are suggesting to perform a full database export/import:
"Change Data Capture change sources, change sets, change tables, and subscriptions are exported and imported by the Oracle Data Pump expdp and impdp commands with the following restrictions.
Change Data Capture objects are exported and imported only as part of full database export and import operations (those in which the expdp and impdp commands specify the FULL=y parameter). Schema-level import and export operations include some underlying objects (for example, the table underlying a change table), but not the Change Data Capture metadata needed for change data capture to occur."
CDC has different implementation methods.
You may use the query below to determine which is in use:
    select SOURCE_NAME, SOURCE_DESCRIPTION, CREATED, SOURCE_TYPE, SOURCE_DATABASE, SOURCE_ENABLED from change_sources;
– Synchronous CDC: with this implementation method you capture changes synchronously on the source database into change tables. This method uses internal database triggers to enable CDC. Capturing the change is part of the original transaction that introduces the change, thus impacting the performance of the transaction.
– Asynchronous AutoLog CDC: this implementation method requires a staging database separate from the source database. Asynchronous AutoLog CDC uses the database's redo transport services to transport redo log information from the source database to the staging database. Changes are captured at the staging database. The impact on the source system is minimal, but there is some latency between the original transaction and the change being captured.
As suggested in the document:
Change Data Capture objects are exported and imported only as part of full database export and import operations (those in which the expdp and impdp commands specify the FULL=y parameter). Schema-level import and export operations include some underlying objects (for example, the table underlying a change table), but not the Change Data Capture metadata needed for change data capture to occur.
■ AutoLog change sources, change sets, and change tables are not supported.
Starting in Oracle Database 10g, Oracle Data Pump is the supported export and import utility for Change Data Capture.
    Re-Creating AutoLog Change Data Capture Objects After an Import Operation
    http://docs.oracle.com/cd/B19306_01/server.102/b14223/cdc.htm#i1027532
    After a Data Pump full database import operation completes for a database containing AutoLog Change Data Capture objects, the following steps must be performed to restore these objects:
    a. The publisher must manually drop the database objects underlying AutoLog Change Data Capture objects.
    b. The publisher must re-create the AutoLog change sources, change sets, and change tables using the appropriate DBMS_CDC_PUBLISH procedures.
    c. Subscribers must re-create their subscriptions to the AutoLog change sets.
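A minimal sketch of the full export/import pair the support note calls for (credentials, directory object, and file names are illustrative, not from the thread):
On the 10g source, after creating a directory object for the dump files:
expdp system/manager FULL=y DIRECTORY=dp_dir DUMPFILE=full_db.dmp LOGFILE=full_exp.log
On the 11g R2 target:
impdp system/manager FULL=y DIRECTORY=dp_dir DUMPFILE=full_db.dmp LOGFILE=full_imp.log
The AutoLog objects then still need the manual steps a-c above (drop the underlying objects, re-create them via DBMS_CDC_PUBLISH, and re-subscribe).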

  • Requirement with Oracle report builder

Hi All,
I have a requirement like this... I am very new to the Oracle Report Builder tool and Oracle APPS.
I need to develop a sample report in this format... please help me.
Which tables do I need to use?
I know how to create user parameters, field columns, etc. I have 50% knowledge of Oracle Report Builder.

You can make a start with the Oracle Reports tutorials.
Here are a few links...
1. Tutorial 1
2. Basic Oracle Reports 10g Tutorials
3. About Oracle Reports 10g: Triggers in Report Data Source and Report Style
Hope this helps
hamid

  • JBO-26061 with Oracle 8i Lite.

Hi:
I am using JDeveloper 3.1 with Oracle 8i Lite.
I created the business objects.
I tested the business objects using the application tester; everything works fine.
I created a Java form, added connection information, record sets, and all the relationships. Everything worked fine.
After compiling both projects I tried to run the form, and I am getting DAC-405 and JBO-26061 errors. I do have "deploy user id password" checked in the connection manager.
First question: where do I change the Oracle 8i connect string to use a 3-part connection syntax? (Is this the problem?)

As you have experimented with triggers and procedures, experiment with PL/SQL also.

• LDAP synchronization with Oracle DB. Related questions.

Hello everybody,
The problem/objective: I have an Oracle DB with information for clients (for example). I want to store that information in an LDAP server and have it synchronized with the Oracle DB.
I have already read A LOT about this but had no luck. I have tried ApacheDS and Microsoft AD for the LDAP servers. I was able to install both of them, and I was able to create a trigger in my Oracle DB in order to trigger actions and add/delete/update records on the LDAP server. But I need the trigger to work both ways, so I need a trigger in ApacheDS or AD. And here is the problem: this is not supported.
While reading about this, I found information about Oracle Internet Directory. So, here are my first questions:
1- Is OID an LDAP server? I believe it is, but is it like ApacheDS or AD? (In other words, is it a service I can connect to and implement CRUD operations against?)
2- Is OID only supported/deployed with Oracle 11g?
3- Can I synchronize my Oracle DB with OID? I mean, if a change is made in the Oracle DB then this change is implemented on the OID, and backwards (like triggers, in both ways).
4- If OID is like AD, then is this the best LDAP server to use?
And talking about Microsoft Active Directory (AD), how can I achieve this? I read about third-party tools to do this (Quest Quick Connect, Microsoft Forefront Identity Manager), but I want to find another solution like triggers. (If it is possible; if not, then it is not a solution :) ).
Probably the questions are not very clear (the problem is clear), but I have a mess in my head now and I want some advice about this.
Thanks in advance.
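For the working DB-to-LDAP direction mentioned above, the building block inside the Oracle database is the DBMS_LDAP package. A minimal sketch of a trigger that pushes a new row to an LDAP server; the host, credentials, directory layout, table, and columns are all hypothetical:
CREATE OR REPLACE TRIGGER clients_to_ldap
AFTER INSERT ON clients  -- hypothetical source table
FOR EACH ROW
DECLARE
  ld    DBMS_LDAP.session;
  rc    PLS_INTEGER;
  attrs DBMS_LDAP.mod_array;
  vals  DBMS_LDAP.string_collection;
BEGIN
  DBMS_LDAP.use_exception := TRUE;  -- raise errors instead of returning codes
  ld := DBMS_LDAP.init('ldap.example.com', 389);
  rc := DBMS_LDAP.simple_bind_s(ld, 'cn=admin,dc=example,dc=com', 'secret');
  attrs := DBMS_LDAP.create_mod_array(3);
  vals(1) := 'inetOrgPerson';
  DBMS_LDAP.populate_mod_array(attrs, DBMS_LDAP.mod_add, 'objectclass', vals);
  vals(1) := :NEW.client_name;      -- hypothetical column
  DBMS_LDAP.populate_mod_array(attrs, DBMS_LDAP.mod_add, 'cn', vals);
  vals(1) := :NEW.client_name;
  DBMS_LDAP.populate_mod_array(attrs, DBMS_LDAP.mod_add, 'sn', vals);
  rc := DBMS_LDAP.add_s(ld, 'cn=' || :NEW.client_name || ',ou=clients,dc=example,dc=com', attrs);
  DBMS_LDAP.free_mod_array(attrs);
  rc := DBMS_LDAP.unbind_s(ld);
END;
/
As the poster notes, the reverse direction needs something on the directory side (a sync connector or change-log polling), since plain LDAP servers do not offer triggers.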

Hi,
Firstly I would suggest that you upgrade your database from Oracle Release 11.2.0.1.0 to Oracle Release 11.2.0.2, the recommended Oracle 11g database version for SAP solutions. Many of your problems will get resolved by it.
Question 1:
So my first question would be: is there any other suggestion besides adjusting the parameter mentioned above in order to ensure that no work processes go into a hung state due to RFCs occupying them, as this issue always happens only at the end of the month, when there are massive numbers of users accessing it.
For immediate resolution the approach you have followed is correct, viz. limiting the number of dialog processes for RFC. Secondly, you need to analyze why RFC processing takes so much time. You need to check which programs are being executed by those RFCs.
Generate an EarlyWatch report for a more detailed view.
Question 2:
My second question is: what went wrong with the libttsh11.so file? How could it be 0 size in PRD when there were no signs of changes to the PRD system? Is this a proven Oracle bug or something else, since I have never encountered anything like this before.
The libttsh11.so library cannot be found in the related directory.
Cause
The file system is mounted using the CIO option, but per Note 257338.1 (Direct I/O (DIO) and Concurrent I/O (CIO) on AIX 5L), an ORACLE_HOME on a filesystem mounted with the "cio" option is not supported.
Such a configuration will cause installation, relinking, and other unexpected problems.
Solution
Disable the CIO option on the filesystem.
References
NOTE:257338.1 - Direct I/O (DIO) and Concurrent I/O (CIO) on AIX 5L
Hope this helps.
Regards,
Deepak Kori

  • Using OWB mappings with Oracle CDC/Streams and LCRs

    Hi,
Has anyone worked with Oracle Streams and OWB? We're looking to leverage Streams to update our data warehouse, applying changes from the transactional/source DB. At some point we seem to remember hearing that OWB could leverage Streams, perhaps even using the Logical Change Records (LCRs) from Streams as input to mappings?
    Any thoughts much appreciated.
    Thanks,
    Jim Carter

    Hi Jim,
    We've built a fairly complex solution based on streams. We wanted to break up the various components into separate entities so that any network failure or individual component failure wouldn't cause issues for the other components. So, here goes:
1) The OLTP source database is streaming LCRs to our data warehouse, where we keep an operational copy of production, updated daily from those streams. This allows various operational reports to be run/rerun in a given day with the end-of-yesterday picture without impacting performance on the source system.
2) Our apply process on the data mart side actually updates TWO copies of the data. It does a default apply to our operational copy of production, and each of those tables has triggers that put a second copy of the data into daily partitioned tables. So, yesterday's partition has only the data that was actually changed yesterday. After the default apply, we walk the Oracle dependency tree to fill in all of the supporting information so that yesterday's partition includes all the data needed to run our ETL queries for that day.
Example: Suppose yesterday an address for a customer was updated. Streams only knows about the change to the address record, so the automated process would only put that address record into the daily partition. The dependency walk fills in the associated customer, date of birth, etc. data into that partition so that the partition holds all of the related data to that address record for updates, without having to query against the complete tables. By the same token, a change to some other customer info will backfill the address record for this customer too.
Now, our ETL queries run against views created over these partitioned tables so that they are only looking at the data for that day (the view s_address joins from our control tables to the partitioned address table so that we are only seeing one day's address records). This means that the ETL is running against the minimal subset of data required to update dimensions and create facts. It also means that, for example, if there is a problem with the ETL we can suspend running ETL while we fix the problem, and the streaming process will just go on filling partitions until we are ready to re-launch ETL and catch up - one day at a time. We also back up the data mart after each load so that, if we discover an error in ETL logic and need to rebuild, we can restore the data mart to a given day and then reprocess the daily partitions in order very simply.
We have added control fields in those partitioned tables that show which record was inserted, updated, or deleted in production, and which was added by the dependency walk so, if necessary, our ETL can determine which data elements were the ones that changed. As we do daily updates to the data mart as our finest grain, this process may update a given record in a given partition multiple times, so that the status of this record at the end of the day in that daily partition shows the final version of that record for the day. So, for example, if you add an address record and then update it on the same day, the partition for that day will show the final updated version of the record, and the control field will show this to be a new inserted record for the day.
    This satisfies our business requirements. Yours may be different.
    We have a set of control tables which manage what partition is being loaded from streams, and which have been loaded via ETL to the datamart. The only limitation is that, of course, the ETL load can only go as far as the last partition completely loaded and closed from streams. And we manage the sizing of this staging system by pruning partitions.
    Now, this process IS complex, and requires a fair chunk of storage, but it provides us with the local daily static copy of the OLTP system for running operational reports against without impacting production, and a guaranteed minimal subset of the OLTP system for speedy ETL runs.
As for referencing LCRs themselves, we did not go that route due to the dependency issues (one single LCR will almost never include all of the dependent data from which to update a dimension record or build a fact record, so we would have had to constantly link each one with the full data set to get all of that other info).
    Anyway - just thought our approach might give you some ideas as you work out your own approach.
    Cheers,
    Mike
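As a concrete illustration of the "second copy into daily partitioned tables" pattern described above (every name is hypothetical; INTERVAL partitioning needs 11g, so on 10g the daily range partitions would be pre-created):
CREATE TABLE address_daily (
  change_day  DATE        NOT NULL,  -- partition key: day the change arrived
  change_type VARCHAR2(1) NOT NULL,  -- I/U/D from the apply, 'W' for dependency-walk backfill
  address_id  NUMBER      NOT NULL,
  customer_id NUMBER,
  street      VARCHAR2(200)
)
PARTITION BY RANGE (change_day)
INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
(PARTITION p_initial VALUES LESS THAN (DATE '2010-01-01'));

CREATE OR REPLACE TRIGGER address_copy_trg
AFTER INSERT OR UPDATE OR DELETE ON address_oper  -- hypothetical operational copy kept by the apply
FOR EACH ROW
BEGIN
  INSERT INTO address_daily (change_day, change_type, address_id, customer_id, street)
  VALUES (TRUNC(SYSDATE),
          CASE WHEN INSERTING THEN 'I' WHEN UPDATING THEN 'U' ELSE 'D' END,
          NVL(:NEW.address_id, :OLD.address_id),
          NVL(:NEW.customer_id, :OLD.customer_id),
          NVL(:NEW.street, :OLD.street));
END;
/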

  • Integration of OBIEE 10.1.3.4.1 with Oracle Powerbuilder

Hi,
Currently we are integrating OBIEE with Oracle PowerBuilder to display the reports developed in OBIEE. For displaying Answers in PowerBuilder, the integration technique is not working.
Please help me.
    Edited by: user1073817 on Oct 20, 2011 2:06 PM

Yes, it has an impact. You create groups in the repository and in Answers and assign the object-level permissions.
You populate the Group variable during authentication via the LDAP server. Once you log in with name X you see the authorized groups under My Account.
For dashboard A - for group Executive - user X - you have given full access.
Now you have changed the group name to AD_Executive. When you log in, the variable values would be:
User - X
Group - AD_Executive
Dashboard A - no permissions.
If you have a scenario of changing the group names, then get the groups from the database using an init block after authorization.
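A row-wise initialization block of the kind described might look like this (the table and column names are hypothetical; ':USER' refers to the signed-in user in OBIEE init block SQL):
SELECT 'GROUP', group_name
FROM obiee_user_groups
WHERE login = ':USER'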
