Issues while populating data in a Tree Table
Hi,
I am using the Tree Table component to display hierarchical data.
I created data controls based on a web service proxy.
While creating the Tree Table I selected the Display Attributes to show in it.
Now I am stuck with two requirements:
1. At run time the Display Attributes in the Tree Table run together.
For Example:
Andrew
--Phone
----Work 123456789
----Home 987654321
--Email
----work [email protected]
Notice that two attribute values (Phone Type and Phone Number) are displayed with only a space between them. I need a separator character between them, e.g. "Work - 123456789". Is that possible?
2. I need to show a popup on click of the root node of the Tree Table. In the example above, I want to handle a click on "Andrew" so that I can show a popup with some details.
But when I try to insert a command link in the node, both parent and child nodes get the command link. How can I have the command link only on the parent node?
The code I am using:
<af:treeTable value="#{bindings.contact.treeModel}" var="node"
              selectionListener="#{bindings.contact.treeModel.makeCurrent}"
              rowSelection="single"
              binding="#{backingBeanScope.EditValidationDetails.tt1}"
              id="tt1" width="920">
  <f:facet name="nodeStamp">
    <af:column id="c1" width="800" filterable="true">
      <af:commandLink text="#{node}" id="cl2"/>
    </af:column>
  </f:facet>
  <f:facet name="pathStamp">
    <af:outputText value="#{node}"
                   binding="#{backingBeanScope.EditValidationDetails.ot3}"
                   id="ot3"/>
  </f:facet>
</af:treeTable>
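A possible approach for both requirements (a sketch only, not verified against this model: it assumes the tree node rows expose attributes such as phoneType and phoneNumber, and that the node depth is available via #{node.depth}, which ADF's JUCtrlHierNodeBinding exposes in many versions) is to stamp the two attributes yourself with an explicit separator instead of relying on the default display-attribute rendering, and to guard the command link with a rendered expression so it appears only on top-level rows:

```xml
<f:facet name="nodeStamp">
  <af:column id="c1" width="800" filterable="true">
    <!-- Root rows only: a link that can launch the popup.
         "name" is an assumed attribute of the root row. -->
    <af:commandLink text="#{node.name}" id="cl2"
                    rendered="#{node.depth == 0}"/>
    <!-- Child rows: join the attributes with an explicit " - " separator.
         "phoneType" and "phoneNumber" are assumed attribute names. -->
    <af:outputText value="#{node.phoneType} - #{node.phoneNumber}" id="ot4"
                   rendered="#{node.depth != 0}"/>
  </af:column>
</f:facet>
```

The popup itself could then be wired with an af:showPopupBehavior inside the command link, pointing at an af:popup defined elsewhere on the page.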
Thanks in Advance...
Regards
Thoom
Similar Messages
-
Access issues while inserting data into a table in the same schema
Hi All.
I have a script that first creates and then populates a table. The script used to run fine in the production environment until a few hours ago, but all of a sudden it is raising an error while inserting data into the table.
Error message - "Insufficient Privileges".
Please suggest what the reasons for this kind of error might be.
Thanks in advance
Sonika wrote:
Hi All.
I have a script that first creates and then populates a table. The script used to run fine in the production environment until a few hours ago, but all of a sudden it is raising an error while inserting data into the table.
Error message - "Insufficient Privileges".
Please suggest what the reasons for this kind of error might be.
1) something changed
2) you are hitting a bug -
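To narrow down the "Insufficient Privileges" error in the thread above (ORA-01031), a first diagnostic step, sketched here under the assumption that the script runs as a normal database user, is to compare the privileges the user currently holds against what the script needs (CREATE TABLE plus INSERT on the target). The standard data-dictionary views show this:

```sql
-- Privileges granted directly to the current user
SELECT privilege FROM user_sys_privs;

-- Privileges the user receives through roles
-- (note: role-granted privileges do not apply inside definer's-rights PL/SQL)
SELECT role, privilege FROM role_sys_privs;

-- Object-level grants on the target table ('MY_TABLE' is illustrative)
SELECT grantee, privilege
FROM   all_tab_privs
WHERE  table_name = 'MY_TABLE';
```

If a grant or role was recently revoked ("something changed"), this output will differ from a working environment.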
Performance issues while querying data from a table with a large number of records
Hi all,
I have performance issues with queries on the mtl_transaction_accounts table, which has around 48,000,000 rows. One of the queries is below:
SQL ID: 98pqcjwuhf0y6 Plan Hash: 3227911261
SELECT SUM (B.BASE_TRANSACTION_VALUE)
FROM
MTL_TRANSACTION_ACCOUNTS B , MTL_PARAMETERS A
WHERE A.ORGANIZATION_ID = B.ORGANIZATION_ID
AND A.ORGANIZATION_ID = :b1
AND B.REFERENCE_ACCOUNT = A.MATERIAL_ACCOUNT
AND B.TRANSACTION_DATE <= LAST_DAY (TO_DATE (:b2 , 'MON-YY' ) )
AND B.ACCOUNTING_LINE_TYPE != 15
call count cpu elapsed disk query current rows
Parse 1 0.00 0.00 0 0 0 0
Execute 3 0.02 0.05 0 0 0 0
Fetch 3 134.74 722.82 847951 1003824 0 2
total 7 134.76 722.87 847951 1003824 0 2
Misses in library cache during parse: 1
Misses in library cache during execute: 2
Optimizer mode: ALL_ROWS
Parsing user id: 193 (APPS)
Number of plan statistics captured: 1
Rows (1st) Rows (avg) Rows (max) Row Source Operation
1 1 1 SORT AGGREGATE (cr=469496 pr=397503 pw=0 time=237575841 us)
788242 788242 788242 NESTED LOOPS (cr=469496 pr=397503 pw=0 time=337519154 us cost=644 size=5920 card=160)
1 1 1 TABLE ACCESS BY INDEX ROWID MTL_PARAMETERS (cr=2 pr=0 pw=0 time=59 us cost=1 size=10 card=1)
1 1 1 INDEX UNIQUE SCAN MTL_PARAMETERS_U1 (cr=1 pr=0 pw=0 time=40 us cost=0 size=0 card=1)(object id 181399)
788242 788242 788242 TABLE ACCESS BY INDEX ROWID MTL_TRANSACTION_ACCOUNTS (cr=469494 pr=397503 pw=0 time=336447304 us cost=643 size=4320 card=160)
8704356 8704356 8704356 INDEX RANGE SCAN MTL_TRANSACTION_ACCOUNTS_N3 (cr=28826 pr=28826 pw=0 time=27109752 us cost=28 size=0 card=7316)(object id 181802)
Rows Execution Plan
0 SELECT STATEMENT MODE: ALL_ROWS
1 SORT (AGGREGATE)
788242 NESTED LOOPS
1 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF
'MTL_PARAMETERS' (TABLE)
1 INDEX MODE: ANALYZED (UNIQUE SCAN) OF
'MTL_PARAMETERS_U1' (INDEX (UNIQUE))
788242 TABLE ACCESS MODE: ANALYZED (BY INDEX ROWID) OF
'MTL_TRANSACTION_ACCOUNTS' (TABLE)
8704356 INDEX MODE: ANALYZED (RANGE SCAN) OF
'MTL_TRANSACTION_ACCOUNTS_N3' (INDEX)
Elapsed times include waiting on following events:
Event waited on Times Max. Wait Total Waited
---------------------------------------- Waited ---------- ------------
row cache lock 29 0.00 0.02
SQL*Net message to client 2 0.00 0.00
db file sequential read 847951 0.40 581.90
latch: object queue header operation 3 0.00 0.00
latch: gc element 14 0.00 0.00
gc cr grant 2-way 3 0.00 0.00
latch: gcs resource hash 1 0.00 0.00
SQL*Net message from client 2 0.00 0.00
gc current block 3-way 1 0.00 0.00
********************************************************************************
On a 5-node RAC environment the program completes in 15 hours, whereas on a single-node environment it completes in 2 hours.
Is there any way I can improve the performance of this query?
Regards
Edited by: mhosur on Dec 10, 2012 2:41 AM
Edited by: mhosur on Dec 10, 2012 2:59 AM
Edited by: mhosur on Dec 11, 2012 10:32 PM
CREATE INDEX mtl_transaction_accounts_n0
ON mtl_transaction_accounts (
transaction_date
, organization_id
, reference_account
, accounting_line_type
)
/
-
Special character issue while loading data from SAP HR through VDS
Hello,
We have a special character issue, while loading data from SAP HR to IdM, using a VDS and following the standard documentation: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e09fa547-f7c9-2b10-3d9e-da93fd15dca1?quicklink=index&overridelayout=true
French accents (é, à, è, ù) are loaded correctly, but Turkish special characters (like Ş, İ, ł) are transformed into “#” in IdM.
The question is: does anyone know of a special setting in the VDS or in IdM for special characters that would solve this issue?
Our SAP HR version is ECC6.0 (ABA/BASIS7.0 SP21, SAP_HR6.0 SP54) and we are using a VDS 7.1 SP5 and SAP NW IdM 7.1 SP5 Patch1 on oracle 10.2.
Thanks
We are importing directly into the HR staging area, using the transactions/programs "HRLDAP_MAP", "LDAP" and "/RPLDAP_EXTRACT"; then we have a job which extracts data from the staging area to a CSV file.
So before the import, the character appears correctly in SAP HR, but by the time it comes through the VDS to the IDM's temporary table, it becomes "#".
Yes, our data is coming from a Unicode system.
So, could there be a Java parameter to change or add in the VDS?
Regards. -
Error while selecting date from external table
Hello all,
I am getting the following error while selecting data from an external table. Any idea why?
SQL> CREATE TABLE SE2_EXT (SE_REF_NO VARCHAR2(255),
2 SE_CUST_ID NUMBER(38),
3 SE_TRAN_AMT_LCY FLOAT(126),
4 SE_REVERSAL_MARKER VARCHAR2(255))
5 ORGANIZATION EXTERNAL (
6 TYPE ORACLE_LOADER
7 DEFAULT DIRECTORY ext_tables
8 ACCESS PARAMETERS (
9 RECORDS DELIMITED BY NEWLINE
10 FIELDS TERMINATED BY ','
11 MISSING FIELD VALUES ARE NULL
12 (
13 country_code CHAR(5),
14 country_name CHAR(50),
15 country_language CHAR(50)
16 )
17 )
18 LOCATION ('SE2.csv')
19 )
20 PARALLEL 5
21 REJECT LIMIT UNLIMITED;
Table created.
SQL> select * from se2_ext;
SQL> select count(*) from se2_ext;
select count(*) from se2_ext
ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04043: table column not found in external source: SE_REF_NO
ORA-06512: at "SYS.ORACLE_LOADER", line 19
It would appear that your external table definition and the external data file do not match up. Post a few input records so someone can duplicate the problem and determine the fix.
HTH -- Mark D Powell -- -
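The mismatch in the thread above is visible in the posted DDL itself: the table columns (SE_REF_NO, SE_CUST_ID, SE_TRAN_AMT_LCY, SE_REVERSAL_MARKER) differ from the fields named in the ACCESS PARAMETERS clause (country_code, country_name, country_language), which is exactly what KUP-04043 reports. A sketch of a corrected definition, assuming SE2.csv really contains the four SE_* fields comma-separated:

```sql
CREATE TABLE se2_ext (
  se_ref_no          VARCHAR2(255),
  se_cust_id         NUMBER(38),
  se_tran_amt_lcy    FLOAT(126),
  se_reversal_marker VARCHAR2(255)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_tables
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    -- the field list must name the same columns as the table definition
    ( se_ref_no          CHAR(255),
      se_cust_id         CHAR(40),
      se_tran_amt_lcy    CHAR(40),
      se_reversal_marker CHAR(255) )
  )
  LOCATION ('SE2.csv')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
```

Alternatively, omit the field list entirely and let the loader map fields to columns by position.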
Error while activating data from new table of DSO to active table
HI,
While activating data from the new table of the DSO to the active table, I am getting the
error message "Error occurred while deciding partition number".
Any idea how to resolve this?
thanks & regards
KPS MOORTHY
Hi
You are trying to update/upload records which are already in the DSO active data table, which has the partition.
Try to see the record numbers already activated and update/upload selectively, if possible.
You can trace the changes in the change log table.
Hope it helps
Edited by: Aduri on Jan 21, 2009 10:38 AM -
How to use the FOR ALL ENTRIES clause while fetching data from archived tables
How can I use the FOR ALL ENTRIES clause while fetching data from archived tables using the FM
/PBS/SELECT_INTO_TABLE?
I need to fetch data from an Archived table for all the entries in an internal table.
Kindly provide some inputs for the same.
thanks n Regards
Ramesh
Hi Ramesh,
I have a query regarding accessing archived data through PBS.
I have archived SAP FI data ( Object FI_DOCUMNT) using SAP standard process through TCODE : SARA.
Now please tell me: can I access this archived data through the PBS add-on FM '/PBS/SELECT_INTO_TABLE'?
Do I need to do something else to access data archived through the SAP standard process or not? If yes, please tell me, as I am not able to get the data using the above FM.
The call to the above FM is as follows :
CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
  EXPORTING
    archiv     = 'CFI'
    option     = ''
    tabname    = 'BKPF'
    schl1_name = 'BELNR'
    schl1_von  = belnr-low
    schl1_bis  = belnr-low
    schl2_name = 'GJAHR'
    schl2_von  = gjahr-low
    schl2_bis  = gjahr-low
    schl3_name = 'BUKRS'
    schl3_von  = bukrs-low
    schl3_bis  = bukrs-low
*   schl4_name =
*   schl4_von  =
*   schl4_bis  =
    clr_itab   = 'X'
*   max_zahl   =
  TABLES
    i_tabelle  = t_bkpf
*   schl1_in   =
*   schl2_in   =
*   schl3_in   =
*   schl4_in   =
  EXCEPTIONS
    eof        = 1
    OTHERS     = 2.
It gives me the following error :
Index for table not supported ! BKPF BELNR.
Please help ASAP.
Thanks and Regards
Gurpreet Singh -
How can we improve the performance while fetching data from RESB table.
Hi All,
Can anybody suggest the right way to improve the performance while fetching data from the RESB table? Below is the select statement:
SELECT aufnr posnr roms1 roanz
INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
FROM resb
WHERE kdauf = p_vbeln
AND ablad = itab-sposnr+2.
Here I am using 'KDAUF' and 'ABLAD' in the condition. Can we use a secondary index to improve performance in this case?
Regards,
Himanshu
Hi,
Declare an internal table with only those four fields
and try the below code:
SELECT aufnr posnr roms1 roanz
INTO table itab
FROM resb
WHERE kdauf = p_vbeln
AND ablad = itab-sposnr+2.
Yes, you can also use a secondary index to improve performance in this case.
Regards,
Anand .
Reward if it is useful.... -
Issue While sending data to Webserver
Hi All,
I am facing an issue while sending data to a web server. My scenario is SAP (Proxy) to XI to web server, and it is asynchronous. The same communication channel and the same WSDL are used to send many interfaces, but when I send data for one particular interface I get the error "SOAP: response message contains an error XIAdapter/HTTP/ADAPTER.HTTP_EXCEPTION - HTTP 400 Bad Request".
Few points for reference -
1. I am sending valid data to the web server (confirmed by the web server team).
2. The WSDL is the latest one, and the same WSDL is used at their end to receive the data.
3. The schema I am using to send the data is also the latest one.
4. The same payload, when sent through the SOAP UI tool, reaches the web server.
5. The same communication channel is used to send data for all the messages. All other messages reach the web server except this one.
Can anyone please provide me some solution for this issue.
Thanks in advance.
Regards,
Sarat
Since you are using only one receiver channel for all WSDL operations, I hope you are setting the SOAP action for each message type (operation) using dynamic configuration during mapping, or at module level (in the SOAP receiver channel using the DC bean).
>> 5. The same communcation channel is used to send data for all the messages. All other messages are reaching the webserver except this message.
Cross-verify all the design and configuration again. Check in MONI the dynamic configuration header for this particular message if you are setting it during mapping.
>> 4. The same payload when sent through SOAPUI tool reaches the web server.
Finally, you can use a sniffer tool to capture the requests sent from XI and compare them with the SOAP UI request, which is correct in your case.
Error while inserting data into a table.
Hi All,
I created a table. While inserting data into the table I am getting an error saying "Create data Processing Function Module". Can anyone help me with this?
Thanx in advance
Anirudh
Hi Anirudh,
It seems there is already an entry in the table with the same primary key.
The INSERT statement will short-dump if you try to insert data with the same key.
Why don't you use the MODIFY statement to achieve the same result?
Reward points if this Helps.
Manish -
Performance issue while extracting data from non-APPS schema tables.
Hi,
I have developed an ODBC application to extract data from Oracle E-Business Suite 12 tables.
e.g. I am trying to extract data from HZ_IMP_PARTIES_INT table of Receivables application(Table is in "AR" database schema) using this ODBC application.
The performance of extraction (i.e. the number of rows extracted per second) is very low if the table belongs to a non-APPS schema (e.g. HZ_IMP_PARTIES_INT belongs to the "AR" database schema).
Now if I create same table (HZ_IMP_PARTIES_INT) with same data in "APPS" schema, the performance of extraction improves a lot. (i.e. the number of rows extracted per second increases a lot) with the same ODBC application.
The "APPS" user is used to connect to the database in both scenarios.
Also note that my ODBC application creates multiple threads, and each thread creates its own connection to the database. Each thread extracts different data using a SQL filter condition.
So, my question is:
Is APPS schema optimized for any data extraction?
I will really appreciate any pointer on this.
Thanks,
Rohit.
Hi,
Is the APPS schema optimized for any data extraction? I would say no, as data extraction performance should be the same for all schemas. Do you run the "Gather Schema Statistics" concurrent program for ALL schemas on a regular basis?
Regards,
Hussein -
Issue while checking data in table
I have written a program to check the data in the table. The program is:
REPORT ZTEST3.
tables zempdata.
data wa_zempdata type zempdata.
loop at zempdata into wa_zempdata.
WRITE: / 'Runtime 1', wa_zempdata-employee_number.
WRITE: / 'Runtime 2', wa_zempdata-employee_name.
ENDSELECT.
It's giving me an error saying "VERSION expected" after zempdata.
Is there any way to resolve it, or any other simple way to check the data in the table through a report program, as I don't have access to SE16?
Hi,
Try with the below code
DATA : it_empdata TYPE TABLE OF zempdata,
wa_empdata TYPE zempdata.
SELECT * FROM zempdata INTO TABLE it_empdata.
LOOP AT it_empdata INTO wa_empdata.
WRITE :/ wa_empdata-employee_number, wa_empdata-employee_name.
ENDLOOP.
Regards
Bala Krishna -
Issue while uploading data from flatfile to Transaction(VK13).
Hi All,
I am facing an issue while uploading data from a flat file to the transaction VK13. My flat file is shown below.
SalesOrganization DistributionChannel Customer Material Releasestatus Amount Currency Quantity UOM ValidFrom ValidTo
2000 01 0010029614 AT309739 A 20.00 USD 1 PC 09/11/2014 12/31/9999
If I upload this data using the RV_CONDITION_COPY FM, it is successfully uploaded to the relevant tables (KONH, KONP), but in transaction VK13 all values are fine except the UOM: instead of PC it shows as ***. I do not understand why this is happening; please give me an idea to solve my issue.
Regards,
Chakradhar.
Hi Raymond,
Thanks for your reply. Yes, if I use CONVERSION_EXIT_CUNIT_INPUT in my program, the issue is this: assume the user gives PC as the value for the UOM field in the flat file and uploads it. The value PC is successfully uploaded to the UOM field in transaction VK13, but in the database table (KONP) it shows as ST.
Regards,
Chakradhar. -
Deadlock issue while sending data from PI to JDBC !!!
Hi All,
We are getting below error while sending data from PI to JDBC (Database).
Error - JDBC message processing failed; reason Error processing request in sax parser: Error when executing statement for table/stored proc. 'store_order_details' (structure 'Order_details_Statement'): com.microsoft.sqlserver.jdbc.SQLServerException: Transaction (Process ID 61) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
This is happening when more than one PI message is trying to deliver data at the same time to database.
Can anyone let us know the root cause of this issue and how we can rectify it?
Is there some setting we can use for parallel processing? Delivering one message at a time to the database is causing performance issues and taking a lot of time.
Thanks
Neha Verma
Hello Neha,
Strange, but please get the information below.
Please check with the DB admin whether the user is getting locked or whether there are any hanging threads related to the user.
Also confirm with the DB admin whether the exclusive lock is on the table or on the row when you try inserting or updating information.
You can share the user from the receiver channel.
Regards,
Hiren A. -
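Since SQL Server error 1205 in the thread above explicitly says "Rerun the transaction", a common mitigation on the database side, sketched here in T-SQL under the assumption that 'store_order_details' is a stored procedure taking no parameters, is to retry the victim transaction a bounded number of times; alternatively, the PI side can serialize delivery (e.g. queue the messages EOIO) so concurrent statements stop competing for the same locks:

```sql
DECLARE @retries INT = 3;
WHILE @retries > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        EXEC store_order_details;      -- the proc named in the error (illustrative call)
        COMMIT TRANSACTION;
        SET @retries = 0;              -- success: leave the loop
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        IF ERROR_NUMBER() = 1205 AND @retries > 1
            SET @retries = @retries - 1;   -- deadlock victim: try again
        ELSE
        BEGIN
            SET @retries = 0;
            THROW;                         -- any other error: re-raise
        END
    END CATCH
END
```

Retrying trades latency for resilience; reducing lock contention (shorter transactions, consistent access order, appropriate indexes) is still the longer-term fix.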
Issue while accessing a SQL Server table over OTG
Hi,
I have been learning oracle for about 1.5 years and am just starting to learn some OTG pieces. I am wondering about an issue. The issue is:
"We need help with an issue we are having while accessing a SQL Server table over OTG. We are getting the following error message in Oracle :
ORA-28500: connection from ORACLE to a non-Oracle system returned this message:
[Oracle][ODBC SQL Server Driver]Unicode conversion failed {HY000}
The column it is failing on is "-----------" in the view --------------- in the SQL Server database pointed to by the Oracle DB Link ------------------- thats created in the Oracle instances ---- and -----.
This was working before, but is now failing, we suspect its due to new multi-byte data being added to the base table in the above column."
I took out the details and added ---- instead. I would appreciate your thoughts on fixing this issue and helping me learn along the way. Thanks
Hi Mike,
Thanks for the response, here are the details:
1. What is the character set of the Oracle RDBMS being used. What is returned by -
select * from nls_database_parameters;
NLS_CHARACTERSET
AL32UTF8
NLS_NCHAR_CHARACTERSET
UTF8
We get SQL_Latin1_General_CP1_C1_AS and 1252 as Collation Property and Code Page
The datatype of the column in question in SQL Server is nvarchar(100).
When I do a describe on the SQL Server view (desc CK_DATA_FOR_OPL@-------), I get the error below:
ERROR: object CK_DATA_FOR_OPL does not exist
Select * from CK_DATA_FOR_OPL@------ where rownum =1 does get me a row.
create table tmp_tab as
Select * from CK_DATA_FOR_OPL@----- where rownum =1;
desc tmp_tab shows the datatype of the said column in the table created in Oracle as NVARCHAR2(150).
Not sure why a column defined with size 100 in SQL Server should come across as 150 when seen over OTG. We see something similar in DB2 tables we access over OTG as well.
Edited by: 993950 on Mar 15, 2013 8:49 AM