Db link tables in processes and row fetch

I have an application I developed in schema A that references tables in schema B. I have been told that these tables need to be moved to another database, for which I have created a database link. I know the database link is working because it tests OK and I can pull user names from a user table into an item.
Also, on the first page I have a report that is sourced over the DB link:
SELECT TRK_CALLS.ID, TRK_CALLS.USER_, TRK_CALLS.ASSIGNED_TO,
       TRK_CALLS.PROBLEM, TRK_CALLS.SOLUTION, TRK_CALLS.STATUS
  FROM TRK_CALLS@DBLINK_IM3
 WHERE ( STATUS = 'Open' AND :P2_REPORT_SEARCH IS NULL )
    OR regexp_like( USER_ || '#' || ASSIGNED_TO || '#' || PROBLEM || '#' ||
                    SOLUTION || '#' || STATUS, :P2_REPORT_SEARCH, 'i' )
This is working correctly.
I have edit buttons for the rows in the report that redirect to page 6. There I get an error: unable to fetch row.
In the row source I placed
trk_calls@DBLINK_IM3
I only have one selection as table owner, WEBB, so this may be part of the problem, but I am not sure how to fix it. I tried to create a new row fetch under the owner GIS, which is the right owner, but it still displays WEBB once created. Is this even part of the solution?
Is there a better way to do this? PL/SQL expression?
I have many other processes that will insert, apply changes, etc. to the tables, so if I can get this to work I may be on my way to making the DB link work.
Please help me in any way possible.
Thanks,
Kirk
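
One common way to let APEX's declarative report and row-fetch processes work against a remote table is to wrap it in a local view, roughly like this (a sketch only; DBLINK_IM3 is the link from the question, everything about grants and naming is an assumption):

-- Sketch: a local view over the remote table gives APEX an ordinary
-- local object to query, fetch from, and update.
CREATE OR REPLACE VIEW trk_calls_view AS
  SELECT id, user_, assigned_to, problem, solution, status
    FROM trk_calls@dblink_im3;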

Kirk,
you get this error because the RETURNING clause cannot be used here, most likely because you are going through a DB link.
You can circumvent this by determining the primary key value in PL/SQL before doing the insert and then including its value in the insert:
Thanks a lot for the info on creating the view. It is working up to a point.
I am now getting an 'ORA-22816: unsupported feature with RETURNING clause' error. I created the view and referenced it in all of my pages. Simply put, wherever I had TRK_CALLS in a source or SQL expression I replaced it with TRK_CALLS_VIEW.
The application didn't have this error before, and that is the only thing I changed. What could be the problem?
This is the expression that the error is referring to:
declare
  l_call_id number;
begin
  /* determine the new call ID */
  l_call_id := get_call_id; /* some function returning a new call id */
  insert into webb.TRK_CALLS_VIEW
    (ID, USER_, ASSIGNED_TO, PROBLEM, SOLUTION, STATUS)
  values
    (l_call_id, :P3_USER_, :P3_ASSIGNED_TO, :P3_PROBLEM, :P3_SOLUTION, :P3_STATUS);
end;
good luck, DickDral

Similar Messages

  • Link Table between CRM and ERP

    Hi Experts,
    Does anyone know the link table between CRM and ERP for Contact Person (Business Partner) in CRM and Contact Person in ERP?
    In my customer's system, contacts are automatically duplicated in CRM.
    When you make a change in ERP, this change is automatically made in CRM.
    So I suppose there is a table between the two systems to find the corresponding number...
    Thank you for your help and hope my English is good enough to understand :o)
    Best regards,
    Luis.

    Hi Micky,
    Thanks for your answer.
    Unfortunately, the numbers are different...
    I move this question to CRM forum ;o)
    Kr,
    Luis.
    This has been solved in this thread:
    Link Table between CRM and ERP

  • Linked tables, stored procedures, and locking

    I'm working on an interface between two Oracle systems. I don't know if they're on the same server or not, but they are definitely two different database instances. The plan for this interface is that when a record is created on one of the systems, it will call a stored procedure on the other system to create the record there as well. I believe the relevant information will be passed via parameters to the stored procedure (not by querying the data in the first system). The ID number created on the second system gets passed back to the first system via an output parameter.
    A concern was raised about whether something like this might cause "rev-locking". Similar issues were raised on a different interface, but that interface used a linked table between the two systems. The original design for that interface had a stored procedure initiated from the second system, and it was to update data via the linked table in the first system. But this caused some locking issues. So a different interface was written, that only used the linked table as read-only.
    So the question is, do stored procedure calls between linked databases have the same issues as updates directly to linked tables? And a better question is, is there a document or white paper out there somewhere that describes the locking issues between linked databases and presents the "best practices" for this type of coding?
    Any help is appreciated!
    Christine Wolak
    [email protected]

    So the question is, do stored procedure calls between linked databases have the same issues as updates directly to linked tables?
    I'm not aware of any issues with updates across databases. Can you post a more detailed description of the exact issues you encountered when updating tables across databases using database links?
    when you update a row in a table, from the same database or another one, a lock on that row will be placed for the duration of the transaction. Others will be able to read that row but not update it till the end of the local (or the remote) transaction.
    What issue(s) did you encounter?
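    To make the design in the question concrete, here is a hedged sketch of calling a procedure on the other system over a database link and getting the new ID back through an OUT parameter (create_ticket and remote_db are hypothetical names, not from the original post):
    declare
      l_remote_id number;
    begin
      -- the remote procedure creates the record and hands back its ID;
      -- any rows it touches stay locked until this distributed
      -- transaction commits or rolls back
      create_ticket@remote_db(p_subject => 'Printer down',
                              p_new_id  => l_remote_id);
      commit;
    end;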

  • Oracle 11gr2 ODBC - error updating linked table (Ora 01722 and 01461)

    Good day folks,
    My shop has just moved to 11gR2 client and server. We were previously using 11gR1 with no issues (and before that, 10, 9, 8, etc). After moving from 11r1 to 11r2, we began getting errors from some of our MS Access ODBC applications with linked Oracle tables. The error would occur when executing an UPDATE statement that had a table join in it. Here is a simple example:
    UPDATE TableX SET TableX.Fieldx = "valuex" WHERE TableX.Fieldx = TableZZZ.Fieldx AND TableZZZ.fieldzzz is not null
    Currently, after moving to 11r2 client, an update query like the one above will error out in one of the following ways:
    - odbc -- update on a linked table failed - Ora 01722 invalid number
    - ORA-01461: can bind a LONG value only for insert into a LONG column
    - Or it will say that the records were not updated because they are locked.
    In some cases, I have noticed some records being updated that were not supposed to be updated.. records that the where clause was meant to exclude. That is very unsettling.
    I understand that perhaps an update statement shouldn't be joining tables and perhaps it should be done over a couple of calls, but the reality is that this code is out there in abundance, and if there is a solution that doesn't amount to changing all of it or reverting to 11gR1, I would love to find it.
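    For what it's worth, the same statement can usually be rewritten without the Access-style join as a correlated update (a sketch using only the table and field names from the example above):
    UPDATE TableX x
       SET x.Fieldx = 'valuex'
     WHERE EXISTS (SELECT 1
                     FROM TableZZZ z
                    WHERE z.Fieldx   = x.Fieldx
                      AND z.fieldzzz IS NOT NULL);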
    Since the query runs fine using SQL*Plus and also runs fine if I run it against a local table in Access rather than a linked Oracle table, I figured the issue was possibly with the Oracle 11gR2 ODBC driver. So I swapped the Oracle ODBC driver (replacing sqora32.dll version 11.2.0.1 with version 11.1.0.7), and the problem went away.
    I believe this verifies that the issue resides with Oracle ODBC version 11.2.0.1. Can anyone help? I'm assuming it's not particularly wise to simply swap sqora32.dll files on all my clients' machines, so I am searching for an actual solution here instead.
    I also performed ODBC tracing to see what Access is handing to the Oracle ODBC driver. I then used database and SQL*Net tracing to see what the ODBC driver was handing off to SQL*Net and the database.
    The results are in the following post:
    Thanks guys!!

    SQLNET TRACE
    If you want an Admin level trace, I can have one right away.
    (856) [13-JUN-2010 22:11:00:657] nsopen: opening transport...
    (856) [13-JUN-2010 22:11:00:657] nttcni: Tcp conn timeout = 60000 (ms)
    (856) [13-JUN-2010 22:11:00:657] nttcni: trying to connect to socket 1364.
    (856) [13-JUN-2010 22:11:00:688] nttcni: connected on ipaddr 142.139.221.62
    (856) [13-JUN-2010 22:11:00:688] nttcon: set TCP_NODELAY on 1364
    (856) [13-JUN-2010 22:11:00:688] nsopen: transport is open
    (856) [13-JUN-2010 22:11:00:688] nsnainit: inf->nsinfflg[0]: 0x61 inf->nsinfflg[1]: 0x61
    (856) [13-JUN-2010 22:11:00:688] nsopen: global context check-in (to slot 0) complete
    (856) [13-JUN-2010 22:11:00:688] nscon: doing connect handshake...
    (856) [13-JUN-2010 22:11:00:688] nscon: sending NSPTCN packet
    (856) [13-JUN-2010 22:11:00:688] nscon: sending 233 bytes connect data
    (856) [13-JUN-2010 22:11:00:688] nsdo: 233 bytes to NS buffer
    (856) [13-JUN-2010 22:11:00:719] nscon: got NSPTRS packet
    (856) [13-JUN-2010 22:11:00:719] nscon: sending NSPTCN packet
    (856) [13-JUN-2010 22:11:00:719] nscon: sending 233 bytes connect data
    (856) [13-JUN-2010 22:11:00:719] nsdo: 233 bytes to NS buffer
    (856) [13-JUN-2010 22:11:00:735] nscon: got NSPTAC packet
    (856) [13-JUN-2010 22:11:00:735] nscon: connect handshake is complete
    (856) [13-JUN-2010 22:11:00:735] nscon: nsctxinf[0]=0x61, [1]=0x21
    (856) [13-JUN-2010 22:11:00:735] nsnainconn: inf->nsinfflg[0]: 0x61 inf->nsinfflg[1]: 0x21
    (856) [13-JUN-2010 22:11:00:735] nsnasend: bytes to send: 158
    (856) [13-JUN-2010 22:11:00:735] nsdo: 158 bytes to NS buffer
    (856) [13-JUN-2010 22:11:00:735] nsnareceive: buffer address: 0x132c34 bytes wanted: 2048
    (856) [13-JUN-2010 22:11:00:735] nsnareceive: calling NS to receive 2048 bytes into address 0x132c34
    (856) [13-JUN-2010 22:11:00:766] nsdo: 153 bytes from NS buffer
    (856) [13-JUN-2010 22:11:00:766] nsnareceive: received 153 bytes
    (856) [13-JUN-2010 22:11:00:766] nsnareceive: no more data to receive - returning
    (856) [13-JUN-2010 22:11:00:766] nsnareceive: total bytes received: 153
    (856) [13-JUN-2010 22:11:01:063] nsnasend: bytes to send: 77
    (856) [13-JUN-2010 22:11:01:063] nsdo: 77 bytes to NS buffer
    (856) [13-JUN-2010 22:11:01:063] nsnareceive: buffer address: 0x132c34 bytes wanted: 2048
    (856) [13-JUN-2010 22:11:01:063] nsnareceive: calling NS to receive 2048 bytes into address 0x132c34
    (856) [13-JUN-2010 22:11:01:079] nsdo: 64 bytes from NS buffer
    (856) [13-JUN-2010 22:11:01:079] nsnareceive: received 64 bytes
    (856) [13-JUN-2010 22:11:01:079] nsnareceive: no more data to receive - returning
    (856) [13-JUN-2010 22:11:01:079] nsnareceive: total bytes received: 64
    (856) [13-JUN-2010 22:11:01:079] naun5authent: Authentication type is 0
    (856) [13-JUN-2010 22:11:01:079] nsnasend: bytes to send: 1862
    (856) [13-JUN-2010 22:11:01:079] nsdo: 1862 bytes to NS buffer
    (856) [13-JUN-2010 22:11:01:079] nsnareceive: buffer address: 0x132c34 bytes wanted: 2048
    (856) [13-JUN-2010 22:11:01:079] nsnareceive: calling NS to receive 2048 bytes into address 0x132c34
    (856) [13-JUN-2010 22:11:01:141] nsdo: 165 bytes from NS buffer
    (856) [13-JUN-2010 22:11:01:141] nsnareceive: received 165 bytes
    (856) [13-JUN-2010 22:11:01:141] nsnareceive: no more data to receive - returning
    (856) [13-JUN-2010 22:11:01:141] nsnareceive: total bytes received: 165
    (856) [13-JUN-2010 22:11:01:141] nsnasend: bytes to send: 33
    (856) [13-JUN-2010 22:11:01:141] nsdo: 33 bytes to NS buffer
    These lines are present using both versions of sqora32.dll
    (856) [13-JUN-2010 22:11:01:141] nszgwop: SQLNET.WALLET_OVERRIDE not found, using default.
    (856) [13-JUN-2010 22:11:01:157] nscontrol: Vect I/O support: 0
    (856) [13-JUN-2010 22:11:01:391] nioqrc: Recieve: returning error: 3111
    (856) [13-JUN-2010 22:11:01:391] nsdo: sending NSPTMK packet
    (856) [13-JUN-2010 22:11:01:391] nserror: nsres: id=0, op=77, ns=12630, ns2=0; nt[0]=0, nt[1]=0, nt[2]=0; ora[0]=0, ora[1]=0, ora[2]=0
    These lines only happen when using the R2 version of sqora32.dll
    (856) [13-JUN-2010 22:11:01:719] nioqrc: Recieve: returning error: 3111
    (856) [13-JUN-2010 22:11:01:719] nsdo: sending NSPTMK packet
    (856) [13-JUN-2010 22:11:01:860] nserror: nsres: id=0, op=0, ns=12630, ns2=0; nt[0]=0, nt[1]=0, nt[2]=0; ora[0]=0, ora[1]=0, ora[2]=0
    (856) [13-JUN-2010 22:21:03:782] nstimarmed: no timer allocated

  • What is the difference among link table toa01, toa02 and toa03?

    Can experts explain their functionalities?  Thanks!

    There is no difference among the 3 tables. The three tables are provided by default to allow flexibility for storing link entries across different database areas.
    Additional link tables can be created as long as they use TOAV0 as a template.

  • Linking tables between Pages and Numbers

    I would like to link a table in Numbers to Pages so that the table can auto update in Pages when it is changed in Numbers. Anyone know if this is possible?
    thanks!

    This is not possible at the present time.
    Regards,

  • Modify all partner links of a process

    hello,
    What I am thinking about is modifying all partner links used in a process.
    As described in the cookbook, you can change the endpoint reference of a partner link through an assign statement.
    What I have done so far is look up the partner links through the API. I now have the names of the partner links, etc., of my process workflow stored in a variable. Iterating over the links works fine. Now I want to get, and afterwards modify, the endpoint reference of every partner link in the process. To fetch the information about the endpoint reference of a partner link I tried a statement like this:
    <assign name="assign-1">
    <copy>
    <from partnerLink="bpws:getVariableData('link')" endpointReference="..."/>
    <to variable="partnerReference"/>
    </copy>
    </assign>
    the link variable holds the name of a partner link.
    Of course that didn't work :)
    The BPEL4WS 1.1 spec says the from-spec has to look like this:
    <from partnerLink="ncname" ......
    That would mean there is no way to make that expression dynamic.
    Any ideas on that? I mean, can I dynamically fill the from and to expressions?
    Greets Jens

    There is a whitepaper out there that describes dynamic functions for adapters; I believe the PDF is named bpeltechadp.pdf. Search for that on Metalink.

  • SET mode processing versus ROW for Dimension targets

    Hi all,
    In OWB 10.2.0.2 I am using a relational implementation of a Dimension with 9 levels (and 2 hierarchies).
    I selected target load order and have run the mapping with SET processing and ROW based (target only).
    The problem is that the results are not the same! Both run without errors but SET processing does not correctly associate the levels together in the dimension. When I run ROW based, the levels do connect properly. They are supposed to be equivalent - there is nothing in the code that is "row based only" in terms of transformations and Control center reports no errors in either case (though we all know that Control center sucks at error reporting). Logs also show no errors.
    Has anyone else seen this and more importantly is there a workaround?
    The Dimension has 17 million rows and takes 15 hours in SET processing.
    Row processing takes 3 days of processing to complete.
    I've logged an SR but have more faith in the responses here than in Support.
    This has been a 3 SR day for me with OWB - an average day I suppose. I have applied the patch and was hoping for more stability and consistency. Guess not.
    Any help would be appreciated. Also, what escalation procedures do you use when your SRs end up in documented and undocumented BUG numbers with no solutions?

    Mike,
    we have several dimensions that work fine. One is a custom time dimension with day / month / quarter / year levels. The other has 6 levels, split between two hierarchies that are each 4 levels deep.
    Haven't had any problems at all with them. Can't speak to running them in row based mode - because we've never had any issues, I haven't really even tried it.
    One problem that I have run into that may apply in your case - is it possible that you have different parent/child links for a given dimension value in the data? For instance, is it possible that you have one row of "source" data that says JANUARY rolls up to QTR1, but a second row in the same data that has JANUARY rolling up to QTR2? Something like this might produce exactly what you're seeing - you'd get hosed up results in the SET mode, but in ROW mode it would apply those changes one by one and potentially not have the problems (depending on how the data was sorted)
    We saw this when one of our maps tried to map more than one description to a single dimension value (we had an "UNKNOWN" dimension value, but we were accidentally writing more than one description to it). In that case, our loads would usually fail with an error message saying something like "Couldn't get a stable set of source data" or some such.
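    A quick way to check for that situation is to look for children that roll up to more than one parent (a sketch; src_dim, month_code and qtr_code are hypothetical names standing in for your source data):
    SELECT month_code,
           COUNT(DISTINCT qtr_code) AS parent_count
      FROM src_dim
     GROUP BY month_code
    HAVING COUNT(DISTINCT qtr_code) > 1;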
    Hope this helps,
    Scott

  • Many to Many PL/SQL - how to nest a linking table?

    I have two objects that are linked with a many-to-many relationship. These two objects have to be nested inside a third object. How should I do this?
    I created the type and the table for A, I did the same for B, and I finally created the linking table for A and B. How shall I nest this table inside the object C?
    Thank you very much for your help, as I am new to PL/SQL.

    I need to create one object called Container. From a modelling perspective, this object contains all the other objects.
    The other objects are items and composed items. The Container contains the items (a one-to-many relationship, so I created a nested table), but it also contains composed items, which are items linked to parts, and that is a many-to-many relationship. How do I include the many-to-many in the Container object? I included question marks where I did not know what to put. Thank you
    CREATE OR REPLACE TYPE Item_type AS OBJECT (
      Id   NUMBER(6),
      Type CHAR(7));
    CREATE OR REPLACE TYPE Part_type AS OBJECT (
      Id   NUMBER(6),
      Type CHAR(7));
    CREATE TABLE Item OF Item_type
      (Id NOT NULL,
       PRIMARY KEY (Id));
    CREATE TABLE Part OF Part_type
      (Id NOT NULL,
       PRIMARY KEY (Id));
    CREATE TABLE ComposedItems
      (ItemId REF Item_type,
       PartId REF Part_type);
    CREATE OR REPLACE TYPE ComposedItems_nested AS TABLE OF ????????;
    CREATE OR REPLACE TYPE Item_nested AS TABLE OF Item_type;
    CREATE OR REPLACE TYPE Container_type AS OBJECT (
      Id   NUMBER(6),
      Type CHAR(7),
      Item Item_nested,
      ????????
    );
    /
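    One possible way to finish this (a sketch, not necessarily what the original poster settled on; ComposedItem_type and ComposedItems_nt are hypothetical names) is to pair the two REFs in an object type and nest a table of that type inside the container:
    CREATE OR REPLACE TYPE ComposedItem_type AS OBJECT (
      ItemRef REF Item_type,
      PartRef REF Part_type);
    /
    CREATE OR REPLACE TYPE ComposedItems_nt AS TABLE OF ComposedItem_type;
    /
    CREATE OR REPLACE TYPE Container_type AS OBJECT (
      Id            NUMBER(6),
      Type          CHAR(7),
      Items         Item_nested,
      ComposedItems ComposedItems_nt);
    /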

  • Tables EKKO EKPO and KONP

    Hi,
    I want to link tables EKKO, EKPO and KONP. I checked these tables, but I could not find the link between EKKO and KONP. Please help me in this regard. We maintain the tax codes in the PO, and I want to print the different taxes in the PO output, such as BED, EDUCATION CESS, CST and VAT.
    NK

    Hello,
    You can link the tables by
    EKKO-KNUMV = KONP-KNUMH.
    And
    EKKO-EBELN = EKPO-EBELN.
    Hope it helps.
    Regards,
    Mansi.
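    Expressed as a SQL join, the suggested links amount to something like the following sketch, which uses only the fields named above (actual pricing-condition access in SAP may involve additional steps):
    SELECT e.ebeln, p.ebelp, k.kschl, k.kbetr
      FROM ekko e
      JOIN ekpo p ON p.ebeln = e.ebeln
      JOIN konp k ON k.knumh = e.knumv;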

  • How to trigger the automated row fetch process and open modal window by javascript api?

    Hi,
    I would like to click a row/column of an IR report to open a modal window on the current page.  <---------------- this is OK; I can use "javascript:openModal('windowID')" to do it.
    There is a form in this modal window, and I would like to pass column data to this form.    <--------------------- this is OK as well; I can use "$s('P7_ID','column_value');" to do it.
    But I don't know how to trigger the "automated row fetch" process of this form to retrieve the other fields' values.
    I tried to use following 2 ways. But failed.
    First method:
    add one ajax process of "automated row fetch" in "page processing" block, named "get_fetch_data"
    when clicking the IR column, call "openModal", then call "apex.server.process( "get_fetch_data", {}, { success: function( pData ) { } } );" to try to refresh the form. That failed.
    Second method:
    add one process of  "automated row fetch" in "page rendering" block, named "get_fetch_data"
    when clicking the IR column, call the JavaScript API "apex.submit" to submit the current page, then call "openModal",
    such as:  javascript:apex.submit({request:'MODIFY',set:{'P7_ID': #ID#}}); openModal('trade');
    But that failed as well: the modal page shows first, then the page refreshes, and the modal window does not open again.
    I am not sure if my thinking is right. Could you please provide any suggestion?
    Thanks in advance,
    Ping

    Hi Ping,
    You can try to set the session state of your modal page's primary key before opening the modal page. Use one dynamic action (on click of the IR row) with two true actions: the first to set the session state of the modal page's PK, the second to open the modal page.
    Or you can add the modal page url as link in your report by extending your query:
    select ...
    ,         apex_util.prepare_url( 'f?p='||:APP_ID||':7:'||:APP_SESSION||'::'||:DEBUG||'::P7_ID:'||COLUMN_VALUE ) as link
    from ...
    This will give you the url of the modal page, with set primary key.
    Regards,
    Vincent Deelen
    http://vincentdeelen.blogspot.com
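    As a sketch of the first method described in the question, the Ajax callback can do a manual fetch instead of relying on the declarative Automated Row Fetch (demo_calls and its columns are hypothetical, and apex_json assumes APEX 5 or later):
    declare
      l_row demo_calls%rowtype;
    begin
      select * into l_row
        from demo_calls
       where id = :P7_ID;           -- P7_ID must be submitted with the call (pageItems option)
      apex_json.open_object;
      apex_json.write('problem',  l_row.problem);
      apex_json.write('solution', l_row.solution);
      apex_json.close_object;
    end;
    The success callback of apex.server.process can then apply the returned values to the form items with $s().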

  • Table and Primary Key Case Sensitive in Automated Row Fetch

    Why are the table and primary key fields case sensitive in the Automated Row Fetch process? I'm debugging an error in this process and I'm not sure what the case should be.

    Russ - It's a defect of sorts. Use upper case in the process definition. I don't think these processes will work with tables or columns whose names are not known to the database as upper-case values, i.e., defined using double-quoted non-upper case.
    Scott
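    To illustrate the point about case: an unquoted table name is stored upper-case and works with these processes, while a double-quoted lower-case name is exactly the kind the reply warns about (a sketch with made-up names):
    CREATE TABLE demo_ok (id NUMBER PRIMARY KEY, name VARCHAR2(50));   -- stored as DEMO_OK
    CREATE TABLE "demo_bad" ("id" NUMBER PRIMARY KEY);                 -- stored case-sensitively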

  • Bug Report: Automatic Row Fetch/Automatic Row Processing and invalid column

    Hi,
    if an invalid column name is specified for a page item of type "Database Column", no error is reported by the Automatic Row Fetch/Automatic Row Processing process. It's just ignored and nothing is returned!!! That makes it kind of hard to identify such an error.
    The problem can happen for example when a column is renamed or if there is a typo in the column name during creation.
    The problem seems to be related to the following query which doesn't use an outer join.
    SELECT C.COLUMN_NAME, C.DATA_TYPE, I.NAME, I.FORMAT_MASK, I.DISPLAY_AS
      FROM SYS.ALL_TAB_COLUMNS C, WWV_FLOW_STEP_ITEMS I
     WHERE C.OWNER = :B4
       AND C.TABLE_NAME = :B3
       AND I.SOURCE = C.COLUMN_NAME
       AND I.SOURCE_TYPE = 'DB_COLUMN'
       AND I.FLOW_ID = :B2
       AND I.FLOW_STEP_ID = :B1
       AND ( C.DATA_TYPE IN ('DATE','NUMBER','CLOB','LONG','FLOAT')
             OR C.DATA_TYPE LIKE 'NUMBER%'
             OR C.DATA_TYPE = 'CHAR'
             OR C.DATA_TYPE LIKE '%VARCHAR%'
             OR C.DATA_TYPE LIKE 'TIMESTAMP%' )
     ORDER BY C.COLUMN_ID
    BTW, if an invalid table/view name is specified for an Automatic Row Fetch/Automatic Row Processing process, the runtime error message isn't very helpful either:
    ORA-06550: line 1, column 17: PL/SQL: ORA-00936: missing expression ORA-06550: line 1, column 9: PL/SQL: SQL Statement ignored
    Maybe that could be changed too.
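    A version of that lookup written with an outer join (just a sketch of the idea) would return page items whose SOURCE no longer matches any column, so the mismatch could be reported instead of silently ignored:
    SELECT I.NAME, I.SOURCE, C.COLUMN_NAME
      FROM WWV_FLOW_STEP_ITEMS I
      LEFT JOIN SYS.ALL_TAB_COLUMNS C
             ON C.OWNER       = :B4
            AND C.TABLE_NAME  = :B3
            AND C.COLUMN_NAME = I.SOURCE
     WHERE I.FLOW_ID      = :B2
       AND I.FLOW_STEP_ID = :B1
       AND I.SOURCE_TYPE  = 'DB_COLUMN';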
    Thanks
    Patrick
    My APEX Blog: http://www.inside-oracle-apex.com
    The APEX Builder Plugin: http://builderplugin.oracleapex.info/
    The ApexLib Framework: http://apexlib.sourceforge.net/

    It says filed (with an L) not fixed (with an X). That means they have noted down the bug, but it may or may not have been fixed yet.

  • Page having PL/SQL process and Automatic Row Process for 2 different tables

    Hi,
    I have a page containing 2 regions A & B.
    Region-A content would be updated to table T1(PK : Ticket#).
    Region-B content would be inserted into table T2(PK: Attachment# ; FK: Ticket#).
    Region-B is used for uploading a file content into T2.
    Since I cannot use 2 DML processes on the page for 2 different tables with a common column, I have a PL/SQL process to update the record in T1 and an Automatic Row Process (DML) to insert into T2.
    Now the issue: in Region-B, when I select a file using the 'Browse' button and click the Upload button to fire the Automatic Row Process, the success message is displayed but the file is not uploaded into the table. But when I moved the entire Region-B and the Automatic Row Process to a different page, clicking Upload works fine and inserts the record into the table along with the file content.
    An item P10_TICKET_NUMBER, with source type Database Column and source value TICKET_NUMBER, is used in Region-A.
    I have gone through the forums and found some of the threads below
    Re: 2 Automated Row Processes for one page?
    Re: Error when trying to create 2 Forms on same page on 2 tables with ID as
    where people faced similar issues. I followed the solution provided in those threads (one PL/SQL process and one automatic process), but the issue still persists.
    Can anyone throw some light on this please.
    Thanks,
    Raj.

    Hi Teku,
    Have a look at this thread, where you can find a solution for your problem:
    INSERTING Records into Second table based on First table Primary Key
    hope this helps.
    Bye,
    Srikavi
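    For reference, one way to keep a single declarative DML process on such a page is to do the second table's insert yourself in PL/SQL. A sketch only (the item, sequence and column names are hypothetical, and the uploaded file is assumed to land in apex_application_files):
    declare
      l_file apex_application_files%rowtype;
    begin
      select * into l_file
        from apex_application_files
       where name = :P10_FILE;       -- the file browse item
      insert into t2 (attachment_no, ticket_no, filename, mime_type, blob_content)
      values (t2_seq.nextval, :P10_TICKET_NUMBER,
              l_file.filename, l_file.mime_type, l_file.blob_content);
      delete from apex_application_files
       where name = :P10_FILE;       -- clean up the upload table
    end;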

  • Custom row-fetch and how to get column values from specific row of report

    Hi -- I have a case where a table's primary key has more than 3 columns. My report on the table has links that send the user to a single-row DML form, but of course the automatic fetch won't work because 1) I can't set more than 3 item values in the link and 2) the auto fetch only handles 2 PK columns.
    1)
    I have written a custom fetch (not sure it's the most elegant, see the second question) that works for 3 or fewer PK columns (it references the 1-3 item values set in the link), but when there are more than 3, I don't know how to get the remaining PK column values for the specific row that was selected in the report. How can I access that row's report column values? I'll be doing it from the form page, not the report page. (I think... unless you have another suggestion.)
    2)
    My custom fetch... I just worked something out on my own, having no idea how this is typically done. For each dependent item (database column) in the form, I have a source of type PL/SQL function that queries the table for the column in question, using the primary key values. It works beautifully, though it is just a touch slow on my prototype table, which has 21 columns. Is there a way to manually construct the fetch statement once for the whole form, and have APEX be smart about which items get which return values, so that I don't have to write PL/SQL for every item? Because my query data sources are sometimes in remote databases, I have to write manual fetch and DML anyway; I would just like to streamline the process.
    Thanks,
    Carol

    Hi Andy -- Well, I'd love it if this worked, but I'm unsure how to implement it.
    It seems I can't put this process in the results page (the page w/ the link, that has multiple report rows), because the link for the row will completely bypass any after-submit processes, won't it? I've tried this in other conditions; I thought the link went directly to the linked-to page.
    And, from the test of your suggestion that I've tried, it's not working in the form that allows a single row edit. I tried putting this manually-created fetch into a before header process, and it seems to do nothing (even with a hard-coded PK value, just to test it out). In addition, I'm not sure how, from this page, the process could identify the correct PK values from the report page, unless it can know something about the row that was selected by clicking on the link. It could work if all the PK columns in my edit form could be set by the report link, but sometimes I have up to 5 pk columns.
    Maybe part of the problem is something to do with the source type I have for each of the form items. With my first manual fetch process, they were all pl/sql functions. Not sure what would be appropriate if I can somehow do this with a single (page level?) process.
    Maybe I'm making this too hard?
    Thanks,
    Carol
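    For reference, the kind of single before-header fetch being discussed might look like this (a sketch; the item, table and column names are hypothetical, and the items would typically have their source used only when the session state is null):
    begin
      select col_a, col_b, col_c
        into :P6_COL_A, :P6_COL_B, :P6_COL_C
        from remote_table@my_link
       where pk1 = :P6_PK1
         and pk2 = :P6_PK2
         and pk3 = :P6_PK3
         and pk4 = :P6_PK4;          -- all PK items set before rendering
    exception
      when no_data_found then
        null;                        -- leave the items empty for a new row
    end;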
