REMOVE_ELEMENT in a table with tree by key column
Hi all,
Designed a table with tree by key column (a normal table with a tree).
In one of the rows there is a drop-down field provided for selection.
We have provided a delete button to the customer.
In this delete button I handled removing the lead-selected row using the method REMOVE_ELEMENT of the context node.
So if the context node has 15 elements, the particular hierarchy selected can have a minimum of 4 elements.
In the UI display the total node (the whole hierarchy node) is getting deleted, but in debugging mode the node still has 14 elements.
Kindly suggest how to handle it so that the whole hierarchy is also deleted from the node.
Thank you,
Usha
Hello Usha,
For this you need to write the logic yourself. If you are deleting a context element with ID, say, 'ROW1', then you also need to take care of deleting all the context elements whose parent key is 'ROW1'.
BR, Saravanan
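The advice above can be sketched in plain Python (the `key`/`parent_key` element structure here is a stand-in for the Web Dynpro context API, not the real thing):

```python
def remove_with_children(elements, key):
    """Remove the element with the given key plus all of its descendants,
    where each element is a dict carrying its own 'key' and the
    'parent_key' of its parent row (None for root rows)."""
    doomed = {key}
    changed = True
    while changed:  # keep sweeping until no new descendants are found
        changed = False
        for el in elements:
            if el["parent_key"] in doomed and el["key"] not in doomed:
                doomed.add(el["key"])
                changed = True
    return [el for el in elements if el["key"] not in doomed]

rows = [
    {"key": "ROW1", "parent_key": None},
    {"key": "ROW2", "parent_key": "ROW1"},
    {"key": "ROW3", "parent_key": "ROW2"},
    {"key": "ROW4", "parent_key": None},
]
print(remove_with_children(rows, "ROW1"))  # only ROW4 survives
```

In the real context node you would collect the doomed keys the same way and call REMOVE_ELEMENT once per element, so the UI and the node stay in sync.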
Similar Messages
-
Creating a form based on a table with a primary key of more than two columns
Hi all,
I'm trying to develop an application with HTML DB and I'm facing a problem. I have a table with a primary key of 3 columns and I'm not able to load the data into the HTML DB form. The web flow (generated with the wizard) is a browse page with a link to an edit page. In the edit page I deleted the standard process and added a custom one.
This process is PL/SQL anon block and fires onload after header.
The code is the following:
Select ENABLED,PARAMETERS,CHECK_LEVEL into :P16_ENABLED,:P16_PARAMETERS,:P16_CHECK_LEVEL FROM CHECK_SETS_CHECKS
WHERE CHECK_ID=:P16_CHECK_ID AND CHECK_SET_ID=:P16_CHECK_SET_ID AND
EXECUTION_ORDER=:P16_EXECUTION_ORDER
i.e. I select the other values using the table primary key.
When the form renders I don't see the 3 values selected from the query but only the PK ones. I know that the query is executed because if I change a column name to a non-existent one I get an error.
What am I missing?
Thanks,
Giovanni
Giovanni,
Check the source type of the items. They should no longer be source type 'Database Column'.
Scott -
Problem with update of BLOB field in a table with compound primary key
Hi,
I've been developing an application in Application Express 3.1.2.00.02 that includes processing of BLOB data in one of the tables (ZPRAVA). Unfortunately, I've come across strange behaviour when I try to update the value of a BLOB field for an existing record via a DML form process. Insert of a new record including the BLOB value is OK (the binary file uploads upon submitting the form without any problems). I haven't changed the DML process in any way. The form update process used to work perfectly before I included the BLOB field. Since then, I keep on getting this error when trying to update the BLOB field:
ORA-20505: Error in DML: p_rowid=3, p_alt_rowid=ID, p_rowid2=CZ000001, p_alt_rowid2=PR_ID. ORA-01008: not all variables bound
Unable to process row of table ZPRAVA.
OK
Some time ago, I created another application where I used a similar form that operated on a BLOB field without problems. The only, but maybe very important, difference between the two cases is that the first, successful one is based on a table with a standard one-column primary key, whereas the second (problematic) one uses a table with a compound (composite) two-column PK (two varchar2 fields: ID, PR_ID).
In both cases, I followed this tutorial: http://www.oracle.com/technology/obe/apex/apex31nf/apex31blob.htm
Can anybody confirm my suspicion that Automatic Row Processing (DML) can be used for updating BLOB fields only in tables with single-column primary keys?
Thanks in advance.
Zdenek
Is there a chance that the fix will be included in the next patch?
No, this fix will be in the next full version, 3.2.
Scott -
Insert values of form on a table with compound primary keys in APEX
Good evening,
I have a table with a compound primary key. On the ER model I had a many-to-many relation, and when I converted it to the relational model I got a table with the primary keys of the two entities.
My problem now is that I am trying to build an APEX application with a Form/Report and I want to insert a tuple both into the "entity" table and into the table with the two primary keys.
I created a trigger with
CREATE OR REPLACE TRIGGER ins_key_elem
INSTEAD OF INSERT
ON v_key_elem
REFERENCING NEW AS NEW
FOR EACH ROW
BEGIN
INSERT INTO elem_type (elem_key, type_key)
VALUES (:NEW.elem_key, :NEW.type_key);
INSERT INTO elem(elem_key, name)
VALUES (:NEW.elem_key, :NEW.name);
END;
Is that OK? If it is, how can I use that trigger to do what I need, i.e. insert the tuple into the two tables when I use the Create button of the form?
Edited by: 934530 on May 15, 2012 2:36 PM
Nevermind... figured it out. Was able to set the value on the Report Attributes page.
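For anyone finding this later: the trigger itself is a standard INSTEAD OF pattern, and the usual approach is to base the form's DML on the view so the Create button's INSERT fires it. The same pattern can be tried end-to-end in SQLite (which also supports INSTEAD OF triggers on views) from Python's stdlib; the schema details below are assumptions, not the poster's actual tables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE elem      (elem_key TEXT, name TEXT,
                        PRIMARY KEY (elem_key));
CREATE TABLE elem_type (elem_key TEXT, type_key TEXT,
                        PRIMARY KEY (elem_key, type_key));

CREATE VIEW v_key_elem AS
  SELECT e.elem_key, e.name, t.type_key
  FROM elem e JOIN elem_type t ON t.elem_key = e.elem_key;

-- One INSERT against the view populates both base tables.
CREATE TRIGGER ins_key_elem INSTEAD OF INSERT ON v_key_elem
BEGIN
  INSERT INTO elem      (elem_key, name)     VALUES (NEW.elem_key, NEW.name);
  INSERT INTO elem_type (elem_key, type_key) VALUES (NEW.elem_key, NEW.type_key);
END;
""")

con.execute("INSERT INTO v_key_elem (elem_key, name, type_key) "
            "VALUES ('E1', 'widget', 'T1')")
print(con.execute("SELECT * FROM elem").fetchall())       # [('E1', 'widget')]
print(con.execute("SELECT * FROM elem_type").fetchall())  # [('E1', 'T1')]
```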
-
Updating a table with no Primary key
What I am trying to find out is whether you can uniquely update a single record in a table that has no primary key. I see no way to reference the record I want, other than specifying all the field values in a WHERE clause. This is a problem, though, because if I have another record with the same field values, that will get updated too. (I know it's bad database design, but some people do it and I need to work around it!) There are also situations where you would have identical records as far as a WHERE clause is concerned; for example, you can't use a BLOB in a WHERE clause, so that might be where your records differ...
In SQLServer 2000, Enterprise manager throws an error if you have a table with no primary key and duplicate records that you try to update through the GUI. EM uses stored procedures to execute updates, and it rolls back ones which result in more than one update result.
I'm not too familiar with cursors in SQL, only that they are quite slow and not implemented on all DB products. But I think they might be able to solve the problem (though I don't logically see how).
Can anybody explain this to me?
another record with the same field values
If you have two records and all the field values are the same, then the records are logically the same anyway, so it shouldn't matter if you update both (one would question why there are two records in the first place). In other words, there is no way, either computationally or manually, to tell them apart.
I suspect that that is rather rare. Instead, what is more likely is that you are only using some of the fields. So just keep adding fields until it is unique. -
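One escape hatch not mentioned above: Oracle exposes a physical row address as ROWID, which lets you single out one of several identical rows. SQLite has the same idea in its implicit rowid, so the pattern can be sketched with Python's stdlib sqlite3 (the table and values are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (a TEXT, b TEXT)")  # no primary key
con.executemany("INSERT INTO t VALUES (?, ?)",
                [("x", "y"), ("x", "y")])       # two identical rows

# Pick the rowid of exactly one of the duplicates...
rid = con.execute("SELECT rowid FROM t WHERE a = 'x' LIMIT 1").fetchone()[0]
# ...and update only that physical row.
con.execute("UPDATE t SET b = 'z' WHERE rowid = ?", (rid,))

print(sorted(con.execute("SELECT b FROM t").fetchall()))  # [('y',), ('z',)]
```

Note the usual caveat: ROWIDs are addresses, not stable identifiers, so this is a one-off repair technique rather than something to build application logic on.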
Table with tree view on master column
Hi,
I'm trying to generate a table with a tree view on the master column, and I get the table values from an RFC.
The problem is that I need to control some properties like "expanded" or "isLeaf" and I can't add the value attributes to the model node to control the rows.
Any idea on how to overcome this (what I think is a) problem?
Thanks for the help.
Pedro Barbosa
Hi Subramanian,
Maybe I didn't explain myself right. I also followed that tutorial with success, but in this case I have on my custom controller:
-MyNode (model node)
- Output
- MyTable
- attribute 1 (model attribute)
In my context/view controller the only way I can map that node is by using a model node referencing "MyTable", but this way I can't add any of my attributes at the same level as the model attributes (is that the idea, right?).
I also tried to recreate the structure in the view with value nodes and value attributes of the same type as the model attributes and then bind them, but I always get "Incompatible context element type".
Am I missing something?
Pedro Barbosa -
2 tables with same primary key
Hi,
Having 2 tables with the same primary key is possible, but when is it wise to go for such a design?
Is that a bad design?
eg:
Personal_details_employee
Official_details_employee
these 2 tables have same Primary key emp_id.
plz. reply me at [email protected] also
thanx
vikram
Hi Vikram,
In this case you can have them in one table itself; why do you want to have two tables?
- Suresh.A -
Mapping tables with no primary key defined
Hi!
I was just wondering if it's possible to map a table with no primary key? The reason why is that I have a logging table that I want to access, which is built very simple and the only useful filtering criteria that I'll be using there is a date column.
Regards,
Bjorn Boe
TopLink requires that an object have a primary key to determine identity, perform caching, and perform database updates and deletes. It is best to define a primary key in all your objects; a sequence number can be used if the object has no natural primary key. The primary key does not have to be defined on the database; as long as a set of fields in the table is unique, you can define them to be the primary key in your TopLink descriptor.
If your table truly has no primary key, nor unique set of fields, it is still possible to map the table with some restrictions. If you disable caching, and mark the object as read-only, and never update or delete the object, you can map it for read-only purposes.
You will need to do the following:
- Set any field as the primary key to avoid the validation warning.
- Set the cache type to NoIdentityMap.
- Set the descriptor to be read-only. -
Row chaining in table with more than 255 columns
Hi,
I have a table with 1000 columns.
I saw the following citation: "Any table with more than 255 columns will have chained rows (we break really wide tables up)."
If I insert a row populated with only the first 3 columns (the others are null), does row chaining occur?
I tried to insert a row as described above and no row chaining occurred.
As I understand it, row chaining occurs in a table with 1000 columns only when the populated data exceeds the block size OR when more than 255 columns are populated. Am I right?
Thanks
dyahav
user10952094 wrote:
Hi,
I have a table with 1000 columns.
I saw the following citation: "Any table with more than 255 columns will have chained rows (we break really wide tables up)."
If I insert a row populated with only the first 3 columns (the others are null), does row chaining occur?
I tried to insert a row as described above and no row chaining occurred.
As I understand it, row chaining occurs in a table with 1000 columns only when the populated data exceeds the block size OR when more than 255 columns are populated. Am I right?
Thanks
dyahav
Yesterday, I stated this on the forum: "Tables with more than 255 columns will always have chained rows." My statement needs clarification. It was based on the following:
http://download.oracle.com/docs/cd/B28359_01/server.111/b28318/schema.htm#i4383
"Oracle Database can only store 255 columns in a row piece. Thus, if you insert a row into a table that has 1000 columns, then the database creates 4 row pieces, typically chained over multiple blocks."
And this paraphrase from "Practical Oracle 8i":
V$SYSSTAT will show increasing values for CONTINUED ROW FETCH as table rows are read for tables containing more than 255 columns.
Related information may also be found here:
http://download.oracle.com/docs/cd/B10501_01/server.920/a96524/c11schem.htm
"When a table has more than 255 columns, rows that have data after the 255th column are likely to be chained within the same block. This is called intra-block chaining. A chained row's pieces are chained together using the rowids of the pieces. With intra-block chaining, users receive all the data in the same block. If the row fits in the block, users do not see an effect in I/O performance, because no extra I/O operation is required to retrieve the rest of the row."
http://download.oracle.com/docs/html/B14340_01/data.htm
"For a table with several columns, the key question to consider is the (average) row length, not the number of columns. Having more than 255 columns in a table built with a smaller block size typically results in intrablock chaining.
Oracle stores multiple row pieces in the same block, but the overhead to maintain the column information is minimal as long as all row pieces fit in a single data block. If the rows don't fit in a single data block, you may consider using a larger database block size (or use multiple block sizes in the same database). "
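The "4 row pieces" figure in the quoted documentation is simply the 255-columns-per-row-piece limit applied to 1000 columns; a quick sanity check in plain Python:

```python
import math

COLUMNS_PER_ROW_PIECE = 255  # Oracle's per-row-piece column limit

def row_pieces(column_count: int) -> int:
    """Number of row pieces Oracle needs for a row whose trailing
    populated column is at this position."""
    return math.ceil(column_count / COLUMNS_PER_ROW_PIECE)

print(row_pieces(1000))  # 4, matching the quoted documentation
print(row_pieces(255))   # 1: no extra row pieces at or below the limit
print(row_pieces(256))   # 2: one column past the limit forces a second piece
```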
Why not a test case?
Create a test table named T4 with 1000 columns.
With the table created, insert 1,000 rows into the table, populating the first 257 columns each with a random 3 byte string which should result in an average row length of about 771 bytes.
SPOOL C:\TESTME.TXT
SELECT
SN.NAME,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
INSERT INTO T4 (
COL1,
COL2,
COL3,
...
COL255,
COL256,
COL257)
SELECT
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3),
...
DBMS_RANDOM.STRING('A',3)
FROM
DUAL
CONNECT BY
LEVEL<=1000;
SELECT
SN.NAME,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
SET AUTOTRACE TRACEONLY STATISTICS
SELECT *
FROM
T4;
SET AUTOTRACE OFF
SELECT
SN.NAME,
SN.STATISTIC#,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
SPOOL OFF
What are the results of the above?
Before the insert:
NAME VALUE
table fetch continue 166
After the insert:
NAME VALUE
table fetch continue 166
After the select:
NAME STATISTIC# VALUE
table fetch continue 252 332
Another test, this time with an average row length of about 12 bytes:
DELETE FROM T4;
COMMIT;
SPOOL C:\TESTME2.TXT
SELECT
SN.NAME,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
INSERT INTO T4 (
COL1,
COL256,
COL257,
COL999)
SELECT
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3)
FROM
DUAL
CONNECT BY
LEVEL<=100000;
SELECT
SN.NAME,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
SET AUTOTRACE TRACEONLY STATISTICS
SELECT *
FROM
T4;
SET AUTOTRACE OFF
SELECT
SN.NAME,
SN.STATISTIC#,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
SPOOL OFF
With 100,000 rows each containing about 12 bytes, what should the 'table fetch continued row' statistic show?
Before the insert:
NAME VALUE
table fetch continue 332
After the insert:
NAME VALUE
table fetch continue 332
After the select:
NAME STATISTIC# VALUE
table fetch continue 252 33695
The final test only inserts data into the first 4 columns:
DELETE FROM T4;
COMMIT;
SPOOL C:\TESTME3.TXT
SELECT
SN.NAME,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
INSERT INTO T4 (
COL1,
COL2,
COL3,
COL4)
SELECT
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3),
DBMS_RANDOM.STRING('A',3)
FROM
DUAL
CONNECT BY
LEVEL<=100000;
SELECT
SN.NAME,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
SET AUTOTRACE TRACEONLY STATISTICS
SELECT *
FROM
T4;
SET AUTOTRACE OFF
SELECT
SN.NAME,
SN.STATISTIC#,
MS.VALUE
FROM
V$MYSTAT MS,
V$STATNAME SN
WHERE
SN.NAME = 'table fetch continued row'
AND SN.STATISTIC#=MS.STATISTIC#;
SPOOL OFF
What should the 'table fetch continued row' show?
Before the insert:
NAME VALUE
table fetch continue 33695
After the insert:
NAME VALUE
table fetch continue 33695
After the select:
NAME STATISTIC# VALUE
table fetch continue 252 33695
My statement "Tables with more than 255 columns will always have chained rows." needs to be clarified:
"Tables with more than 255 columns will always have chained rows (row pieces) if a column beyond column 255 is used, but the 'table fetch continued row' statistic may only increase in value if the remaining row pieces are found in a different block."
Charles Hooper
IT Manager/Oracle DBA
K&M Machine-Fabricating, Inc.
Edited by: Charles Hooper on Aug 5, 2009 9:52 AM
Paraphrase misspelled the view name "V$SYSSTAT", corrected a couple minor typos, and changed "will" to "may" in the closing paragraph as this appears to be the behavior based on the test case. -
How to create a table with varied number of columns?
I am trying to create a balance table. The columns should include the years between the start year and end year that the user will input at run time. The rows will be the customers with an outstanding balance in those years.
If the user input years 2000 and 2002, the table should have columns 2000, 2001, 2002. But if the user input 2000 and 2001, the table will only have columns 2000 and 2001.
Can I do it? How? Thanks a lot.
Why did you create a new thread for this?
How to create a table with varied number of columns? -
How to create a table with more than 20 columns
I created an A4 landscape document to make a calendar for the next year.
To add the date information I need to insert a table with 32 columns (name of month + 31 days). Is it possible to create a table with an overall width of 29.7 cm (consisting of 1 column of 4.9 cm and 31 columns of 0.8 cm)?
Remark: I have already reduced all indents (German: Einzug) and other spacing, and the font size used in the table.
Hello Till,
unfortunately Pages cannot create a table with more than 20 columns. This is not a question of size or space.
But you can create two tables, set the rightmost border of the first table to "no line" and align the second table (with the other 12 columns) at the right side of the first table. Now you have "one" table with 32 columns.
I have done this with my driving book (Fahrtenbuch, to write it correctly) and it works fine.
Frank. -
Hi everyone, again landed up with a problem.
After trying a lot to do it myself, finally decided to post here..
I have created a form in form builder 6i, in which on clicking a button the data gets exported to excel sheet.
It is working fine with a single table. The problem now is that I am unable to do the same with 2 tables, because both tables have the same number of columns and the same column names.
Below are the 2 tables, which share identical column names:
Table-1 (MONTHLY_PART_1) and Table-2 (MONTHLY_PART_2), each with columns:
SL_NO, COMP, DUE_DATE, U-1, U-2, U-4, U-20, U-25
Since both the tables have same column names, I'm getting the following error :
Error 402 at line 103, column 4
alias required in SELECT list of cursor to avoid duplicate column names.
So how can I export to Excel data that comes from 2 tables with the same number of columns and the same column names?
Should I paste the code? Should I post this query in the 'SQL and PL/SQL' forum?
Help me with this please.
Thank You.
You'll have to alias your columns, not prefix them with the table names:
$[CHE_TEST@asterix1_impl] r
1 declare
2 cursor cData is
3 with data as (
4 select 1 id, 'test1' val1, 'a' val2 from dual
5 union all
6 select 1 id, '1test' val1, 'b' val2 from dual
7 union all
8 select 2 id, 'test2' val1, 'a' val2 from dual
9 union all
10 select 2 id, '2test' val1, 'b' val2 from dual
11 )
12 select a.id, b.id, a.val1, b.val1, a.val2, b.val2
13 from data a, data b
14 where a.id = b.id
15 and a.val2 = 'a'
16 and b.val2 = 'b';
17 begin
18 for rData in cData loop
19 null;
20 end loop;
21* end;
for rData in cData loop
ERROR at line 18:
ORA-06550: line 18, column 3:
PLS-00402: alias required in SELECT list of cursor to avoid duplicate column names
ORA-06550: line 18, column 3:
PL/SQL: Statement ignored
$[CHE_TEST@asterix1_impl] r
1 declare
2 cursor cData is
3 with data as (
4 select 1 id, 'test1' val1, 'a' val2 from dual
5 union all
6 select 1 id, '1test' val1, 'b' val2 from dual
7 union all
8 select 2 id, 'test2' val1, 'a' val2 from dual
9 union all
10 select 2 id, '2test' val1, 'b' val2 from dual
11 )
12 select a.id a_id, b.id b_id, a.val1 a_val1, b.val1 b_val1, a.val2 a_val2, b.val2 b_val2
13 from data a, data b
14 where a.id = b.id
15 and a.val2 = 'a'
16 and b.val2 = 'b';
17 begin
18 for rData in cData loop
19 null;
20 end loop;
21* end;
PL/SQL procedure successfully completed.
cheers -
Table with more than 35 columns
Hello All.
How can one work with a table with more than 35 columns in JDev 9.0.3.3?
My other question is related to this.
Setting an Entity Bean's properties from a Session Bean brought up the error, but when setting them from inside the EJB, the bug stays clear.
Is this right?
Thank you
Thank you all for the replies.
Here's my problem:
I have an AS400/DB2 Database, a huge and an old one.
There are many COBOL programs used to communicate with this DB.
My project is to transfer the database with the same structure and the same contents to a Linux/ORACLE System.
I will not remake the COBOL Programs. I will use the existing one on the Linux System.
So the tables of the new DB should be the same as the old one.
That’s why I can not make a relational DB. I have to make an exact migration.
Unfortunately I have some tables with more than 5000 COLUMNS.
Now my question is:
Can I modify the parameters of the ORACLE DB to make it accept tables and views with more than 1000 columns? If not, is it possible to make a PL/SQL function that simulates a table? This function would insert/update/select data from many other small tables (<1000 columns). I mean a method that makes the ORACLE DB act as if it had a table with a huge number of columns.
I know it's crazy, but any ideas please. -
Compressed tables with more than 255 columns
hi,
Would anyone have a SQL query to find compressed tables with more than 255 columns?
Thank you
Jonu
SELECT table_name,
Count(column_name)
FROM user_tab_columns utc
WHERE utc.table_name IN (SELECT table_name
FROM user_tables
WHERE compression = 'ENABLED')
GROUP BY table_name
HAVING Count(column_name) > 255 -
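The column-counting half of that query can be tried against any SQLite database with Python's stdlib (SQLite has no compression flag, so this only demonstrates the wide-table filter; the toy tables and the threshold of 5 are illustrative stand-ins for Oracle's 255):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Two toy tables: one narrow, one wider than our threshold.
con.execute("CREATE TABLE narrow (a, b)")
con.execute("CREATE TABLE wide (%s)" % ", ".join(f"c{i}" for i in range(10)))

THRESHOLD = 5  # stand-in for Oracle's 255-column row-piece limit

def wide_tables(con, threshold):
    """Return names of user tables with more than `threshold` columns."""
    names = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    return [n for n in names
            if len(con.execute(f"PRAGMA table_info({n})").fetchall()) > threshold]

print(wide_tables(con, THRESHOLD))  # ['wide']
```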
Can I create a table with more than 40 columns ?
Hello,
I need to organize my work with a table and need about 66 columns. I am not very good with Numbers but I know a little bit of Pages. I cannot find a way to create a table with more than 40 columns... Is it hopeless ? (Pages '08 v. 3.0.3)
Thank you all
It's never too late to try to teach users that the Search feature is not only for helpers.
The number of columns allowed in Numbers is not a relevant value when the question is about Pages '08 tables.
I tried to copy a 256 columns Numbers table into a Pages '09 document.
I didn't get a table, but values separated by TABs.
I tried to convert this text range into a table and got this clear message:
If I remember well, in Pages '08 the limit is 40 columns.
It seems that you are a specialist in inaccurate responses, but I'm not a moderator so I'm not allowed to urge you to double-check what you wrote before posting.
Yvan KOENIG (VALLAURIS, France) vendredi 29 avril 2011 14:57:58
Please :
Search for questions similar to your own before submitting them to the community