XMLP in 11i - same data def, 1 to PDF and 1 template to eText
I must be missing something, but it seems I am unable to have two templates in XML Publisher Administrator point to the same Oracle executable (in my case the code is OEXOEACK).
What is a good way around this?
Hi,
Each template definition can only hold one template (e.g. one RTF file) per language and territory.
However, you can create more than one template definition against a single data definition.
Hope this makes sense.
Regards
Paul
Similar Messages
-
I don't even know if this is possible.
I'm using iwork '09
I hope I can clarify:
For our purposes here:
Table 1 = "Step 2 - Product Sizes"
Table 2 = "Option id Master"
Table 3 = "Export - Product Info"
Table 1:
The user would enter values for "productcode," "Horz," and "Vert."
"Size" would auto-fill based on the values in "Horz" and "Vert" (I have this taken care of already).
Table 2: This is a completely static table that I want to search against; data from other tables in the document does not alter or change the data in this table.
We just want to look at Table 2, find the value in the Table 2 "Size" column that matches the Table 1 "Size" column, and then pull the "optionids" and "productprice" from that row.
Can the value from "Table 1 : Size" be used as a search term in "Table 2 : Size?"
Table 3: The user does not enter any values on this table.
"productcode" is pulled from table 1 - "Table 1 :: A5" = "Table 3 :: A5"
"optionids" and "productprice" are pulled from Table 2 columns "D" and "E" - however we do not know which Table 2 row it is pulling from until we enter data in Table 1.
As I'm writing this I'm realizing that
A. This is probably really confusing to you.
B. This may be impossible inside of Numbers.
If you have some other method that would facilitate the same outcome but be structured differently, please let me know.
--- maybe to help you understand further what I am doing here is my current workflow:
I record the size of a piece of art.
Then I manually go to my "Option id Master" and find the same size.
I then copy the corresponding "optionids" and "productprice" cells. (These options control the prices displayed on my website.)
I go to my "Export - Product Info" table and paste the values in the corresponding cells.
I was hoping to automate this as it takes a long time when you have hundreds of products.
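What you describe is a classic lookup, and Numbers '09 can do it with VLOOKUP. A sketch only: the cell and range references below (row 2, and "Size" sitting in column C of "Option id Master" with "optionids" and "productprice" in the two columns to its right) are assumptions, so adjust them to your actual layout:

```
optionids for row 2:    =VLOOKUP(Step 2 - Product Sizes :: D2, Option id Master :: C2:E200, 2, FALSE)
productprice for row 2: =VLOOKUP(Step 2 - Product Sizes :: D2, Option id Master :: C2:E200, 3, FALSE)
```

FALSE requests an exact match; VLOOKUP searches the left-most column of the given range (here, the assumed "Size" column) and returns the value from the Nth column of that range, so the lookup you do by hand today is automated per row.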
Thanks for the help! -
Problem with different resultset with same data and same query in Oracle 8.1.7 and 9i
Hello,
I have been using this query in Oracle 8.1.7:
SELECT ID,
       AREA_NO
FROM   MANAGER_AREA MGR
WHERE  COMPANY_ID = :id
AND    (:value < (SELECT COUNT(ROWID)
                  FROM   MANAGER_WORK MW
                  WHERE  MW.AREA_ID = MGR.ID
                  AND    (MW.END_WORK IS NULL OR MW.END_WORK >= SYSDATE)))
ORDER BY AREA_NO;
In the above query I want to see rows from the MANAGER_AREA table depending on a date criterion in the MANAGER_WORK table and on the parameter :value. That is, if I pass 0, I only see records that have at least one MANAGER_WORK record meeting the date criterion; if I pass -1, I get all the records, because the minimum value the count can return is 0. The result set was as expected in 8.1.7.
A couple of days back I installed Personal 9i to test the basic functionality of our program with the same data. This query fails there: irrespective of whether I pass -1 or 0, it returns the same result set I would have got by passing 0.
I do not know whether this is a bug introduced in 9i. Can anybody help me with this problem? It would be difficult for me to change the parameter sent to this query, as the query is called from many different places.
Thanks in advance
Amol.
I cannot use a GROUP BY and a HAVING clause here. The problem with GROUP BY and HAVING is that I would have to join the two tables, and with a join I only get the rows that are linked to each other.
If I use an outer join to solve that, I then have to take the other date condition into consideration. My previous query effectively discarded the correlated subquery result by passing -1 as the value; that will not happen in the join query.
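If it ever becomes feasible to touch the query itself rather than its callers, one defensive rewrite is to short-circuit the -1 sentinel before the correlated subquery is evaluated at all, so the result no longer depends on how 9i evaluates the comparison. A sketch only, assuming :value is only ever -1 or 0:

```sql
SELECT ID,
       AREA_NO
FROM   MANAGER_AREA MGR
WHERE  COMPANY_ID = :id
AND    ( :value < 0   -- -1 means "return every area": skip the correlated count entirely
      OR :value < (SELECT COUNT(*)
                   FROM   MANAGER_WORK MW
                   WHERE  MW.AREA_ID = MGR.ID
                   AND    (MW.END_WORK IS NULL OR MW.END_WORK >= SYSDATE)) )
ORDER BY AREA_NO;
```

The OR branch makes the -1 case explicit instead of relying on "count is never below 0", while the 0 case keeps the original correlated-count semantics.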
Amol. -
I want to upgrade iOS from 5 to 6, but when I go to sync everything in preparation, it tells me I will lose all my stuff on the iPhone and iPad, which is unique. How can I back up or do a "real sync" (both sides have the same data at the end of the process) and then upgrade?
You should really read the manual.
"How do you restore from backup? "
Restore. When given the choice, choose to use backup.
"And how can I check to see if the pics and videos are on my computer somewhere first??"
They would only be where you put them. What program did you use to import them? Pics/vids taken with the iPod are not part of the sync process at all. You should be importing them just as you would with any digital camera.
If you did not import them, then they are not on your computer. -
Multiple Purchase Requisitions generated on the same date - why?
Hi All,
In transaction MD04, for one material number, I can see multiple PRs getting generated for the same date. I would like to know which settings cause the same date to appear, and also how to combine all these PRs into one PR. Please guide me.
Thanks & Regards,
Vinoth.
Hi,
Check these:
Re: Third party Drop Ship
Re: Single PO for Multiple PR
Re: MD04 + Auto PR generation
pherasath -
Why will my web form submit but my fillable form not submit with the same data in both?
I have developed a form in FormsCentral. When I test it as a web form, it submits all the data to my attached FormsCentral response form. When I copy and paste the same data into the fillable PDF form I generate from the Distribute Forms option with a submit button, the form will not submit; it gives an "unable to connect to server" error.
When I try an empty form, it will submit. When I delete data from text fields, I eventually reach a point where the form will submit.
Is there a file size limit on the fillable PDF for submission to the FormsCentral response form?
Hi Mark,
Sorry for not being clear.
There is a system limit of 10000 characters in a multiline text field. Say you designed a form where a multiline text field has a limit of 20 characters. The following text will submit without error in the HTML form but will fail in the PDF form:
1234567890
123456789
In the HTML form there is a total of 20 characters in the field value, 19 digits and a linefeed character representing the newline. In the PDF form there is a total of 21 characters in the field value, 19 digits, a carriage return, and a linefeed.
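The count can be seen in two Java string literals (a sketch; the literals simply model the submitted field value from the example above):

```java
public class FieldLengthDemo {
    public static void main(String[] args) {
        // HTML form: the newline is a single linefeed -> 19 digits + 1 = 20 characters
        String htmlValue = "1234567890\n123456789";
        // PDF form: the newline is carriage return + linefeed -> 19 digits + 2 = 21 characters
        String pdfValue = "1234567890\r\n123456789";
        System.out.println(htmlValue.length()); // 20
        System.out.println(pdfValue.length());  // 21
    }
}
```

So a field limited to 20 characters accepts the HTML submission but rejects the PDF one.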
Regards,
Brian -
Upload a PDF and make the same available
Hi All ,
My requirement is that I have to upload a PDF file into SAP, and it should be available on the right side of the selection screen, with some fields on the left side. Looking at the file, the user will enter values in the selection-screen fields, and then the report will execute.
I know we can upload a PDF with 'GUI_UPLOAD' using the 'BIN' filetype; now I have to display it on the selection screen. How do I do this?
Please help me on this ....
Thanks in advance.
Regards,
G. Anil Kumar.
The problem is solved:
// Read the uploaded file's bytes and transfer them into the form's context node
byte[] filebytearray = wdContext.currentPDFFileUploadElement().getFileData();
WDInteractiveFormHelper.transferPDFDataIntoContext(filebytearray, wdContext.nodeCustomerUpload());
This wdContext.nodeCustomerUpload must be exactly the same data node that was used when the PDF file was created.
Ldt files for different templates have same data
I was asked to get .ldt files for different templates. I used FNDLOAD to download the files using the xdotmpl.lct file. All the .ldt files, when opened as text, have the same data. Am I doing something wrong? If the files are the same, why do we need a separate .ldt file for each template?
But if you are developing something and you are going to version it, then:
In case of a single LDT
==================
It's easy, as you said; we can download all definitions into one file.
But if one of the template definitions changes, all the data gets downloaded unnecessarily.
A single file, so you could say it's easy.
In case of multiple LDTs
==================
If one of the template definitions changes, you download only that one into its LDT.
Multiple files, so a little more effort is needed to create them,
but they are much easier to maintain. -
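For completeness, the usual reason every .ldt comes out identical is that the download command did not name a specific template, so FNDLOAD pulled all matching definitions each time. A sketch from memory only (the entity and parameter names should be verified against your xdotmpl.lct, and all XX_* values are placeholders):

```
FNDLOAD apps/<apps_pwd> 0 Y DOWNLOAD \
  $XDO_TOP/patch/115/import/xdotmpl.lct XX_MY_TEMPLATE.ldt \
  XDO_DS_DEFINITIONS APPLICATION_SHORT_NAME='XX' \
  DATA_SOURCE_CODE='XX_DATA_DEF' \
  TMPL_APP_SHORT_NAME='XX' TEMPLATE_CODE='XX_MY_TEMPLATE'
```

With a distinct TEMPLATE_CODE per run, each .ldt should then contain only that template's definition.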
Error while adding a second template to the same data definition
Hi,
I am quite new to XML Publisher. I am developing a report using XML Publisher.
I have added one template to the data definition. When I tried to add another template (with a different name from the previous one, against the same concurrent program and application) to the same data definition, I got the following error.
Error: You entered a duplicate template entry. Please change the Name, Code and/or Application and reapply.
Can anyone help me solve this problem? I am approaching my deadline. Please suggest what can be done.
Regards,
Sita
Hi,
Try changing the template Code as well, and then check it.
Thanks
Sudheer -
CBO generating different plans for the same data in similar Environments
Hi All
I have been trying to compare a SQL statement across two different but similar environments built on the same hardware specs. The issue I am facing is that in environment A the query executes in less than 2 minutes, with a plan mostly showing full table scans and hash joins, whereas in environment B (the problematic one) it times out after 2 hours with an "unable to extend tablespace" error. The statistics are up to date in both environments for both tables and indexes. System parameters are exactly the same (Oracle defaults, except for db_file_multiblock_read_count).
Both environments have the same db parameters for db_file_multiblock_read_count (16), the optimizer (see below), hash_area_size (131072), pga_aggregate_target (1G), db_block_size (8192), etc. SREADTIM, MREADTIM, CPUSPEED and MBRC are all null in aux_stats$ in both environments, because workload statistics were never collected, I believe.
Attached are details about the SQL with table stats and index stats. My main concern is the CBO generating different plans for similar data and statistics on the same hardware and software specs. Is there anything else I should consider? I generally see environment B being very slow, with plans always tending toward nested loops and index scans, whereas what we really need in many cases is a sensible full table scan. One very surprising thing is METER_CONFIG_HEADER below, which has just 80 blocks of data yet is being driven through an index scan.
show parameter optimizer
optimizer_dynamic_sampling integer 2
optimizer_features_enable string 10.2.0.4
optimizer_index_caching integer 0
optimizer_index_cost_adj integer 100
optimizer_mode string ALL_ROWS
optimizer_secure_view_merging boolean TRUE
**Environment**
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Solaris: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
Note: there are slight differences in the number of records in the attached sheet. However, I wanted to mention that I have tested with the exact same data and got similar results, but I couldn't retain the data until collecting the details in the attachment.
TEST ITEM 1 COMPARE TABLE LEVEL STATS used by CBO
ENVIRONMENT A
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 3607425 167760 5/02/2013 22:11
METER_CONFIG_HEADER 3658 80 5/01/2013 0:07
METER_CONFIG_ITEM 32310 496 5/01/2013 0:07
NMI 1899024 33557 18/02/2013 10:55
REGISTER 4830153 101504 18/02/2013 9:57
SDP_LOGICAL_ASSET 1607456 19137 18/02/2013 15:48
SDP_LOGICAL_REGISTER 5110781 78691 18/02/2013 9:56
SERVICE_DELIVERY_POINT 1425890 42468 18/02/2013 13:54
ENVIRONMENT B
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 4133939 198570 16/02/2013 10:02
METER_CONFIG_HEADER 3779 80 16/02/2013 10:55
METER_CONFIG_ITEM 33720 510 16/02/2013 10:55
NMI 1969000 33113 16/02/2013 10:58
REGISTER 5837874 120104 16/02/2013 11:05
SDP_LOGICAL_ASSET 1788152 22325 16/02/2013 11:06
SDP_LOGICAL_REGISTER 6101934 91088 16/02/2013 11:07
SERVICE_DELIVERY_POINT 1447589 43804 16/02/2013 11:11
TEST ITEM 2 COMPARE INDEX STATS used by CBO
ENVIRONMENT A
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 9878 67 147 12982 869801 3553095
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 7291 2747 2 639 1755977 3597916
ASSET SYS_C00102592 UNIQUE 2 12488 3733831 1 1 3726639 3733831
METER_CONFIG_HEADER SYS_C0092052 UNIQUE 1 12 3670 1 1 3590 3670
METER_CONFIG_ITEM SYS_C0092074 UNIQUE 1 104 32310 1 1 32132 32310
NMI IDX_NMI_ID NONUNIQUE 2 6298 844853 1 2 1964769 1965029
NMI IDX_NMI_ID_NK NONUNIQUE 2 6701 1923072 1 1 1922831 1923084
NMI IDX_NMI_STATS NONUNIQUE 1 106 4 26 52 211 211
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 12498 795 15 2899 2304831 4711808
REGISTER SYS_C00102653 UNIQUE 2 16942 5065660 1 1 5056855 5065660
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 3667 1607968 1 1 1607689 1607982
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 3811 668727 1 2 1606204 1607982
SDP_LOGICAL_ASSET SYS_C00102665 UNIQUE 2 5116 1529606 1 1 1528136 1529606
SDP_LOGICAL_REGISTER SYS_C00102677 UNIQUE 2 17370 5193638 1 1 5193623 5193638
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 4406 676523 1 2 1423247 1425890
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 7374 676523 1 2 1458238 1461108
SERVICE_DELIVERY_POINT SYS_C00102687 UNIQUE 2 4737 1416207 1 1 1415022 1416207
ENVIRONMENT B
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 8606 121 71 16428 1987833 4162257
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 8432 1780146 1 1 2048170 4162257
ASSET SYS_C00116157 UNIQUE 2 13597 4162263 1 1 4158759 4162263
METER_CONFIG_HEADER SYS_C00116570 UNIQUE 1 12 3779 1 1 3734 3779
METER_CONFIG_ITEM SYS_C00116592 UNIQUE 1 107 33720 1 1 33459 33720
NMI IDX_NMI_ID NONUNIQUE 2 6319 683370 1 2 1970460 1971313
NMI IDX_NMI_ID_NK NONUNIQUE 2 6597 1971293 1 1 1970771 1971313
NMI IDX_NMI_STATS NONUNIQUE 1 98 48 2 4 196 196
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 15615 1273 12 2109 2685924 5886582
REGISTER SYS_C00116748 UNIQUE 2 19533 5886582 1 1 5845565 5886582
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 4111 1795084 1 1 1758441 1795130
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 4003 674249 1 2 1787987 1795130
SDP_LOGICAL_ASSET SYS_C004520 UNIQUE 2 5864 1795130 1 1 1782147 1795130
SDP_LOGICAL_REGISTER SYS_C004539 UNIQUE 2 20413 6152850 1 1 6073059 6152850
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 3227 660649 1 2 1422572 1447803
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 6399 646257 1 2 1346948 1349993
SERVICE_DELIVERY_POINT SYS_C00128706 UNIQUE 2 4643 1447946 1 1 1442796 1447946
TEST ITEM 3 COMPARE PLANS
ENVIRONMENT A
Plan hash value: 4109575732
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 13 | 2067 | | 135K (2)| 00:27:05 |
| 1 | HASH UNIQUE | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 2 | HASH JOIN | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 3 | HASH JOIN | | 6 | 900 | | 135K (2)| 00:27:04 |
|* 4 | HASH JOIN ANTI | | 1 | 137 | | 135K (2)| 00:27:03 |
|* 5 | TABLE ACCESS BY INDEX ROWID| NMI | 1 | 22 | | 5 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | | 95137 (2)| 00:19:02 |
|* 7 | HASH JOIN | | 1 | 109 | | 95132 (2)| 00:19:02 |
|* 8 | TABLE ACCESS FULL | ASSET | 36074 | 1021K| | 38553 (2)| 00:07:43 |
|* 9 | HASH JOIN | | 90361 | 7059K| 4040K| 56578 (2)| 00:11:19 |
|* 10 | HASH JOIN | | 52977 | 3414K| 2248K| 50654 (2)| 00:10:08 |
|* 11 | HASH JOIN | | 39674 | 1782K| | 40101 (2)| 00:08:02 |
|* 12 | TABLE ACCESS FULL | REGISTER | 39439 | 1232K| | 22584 (2)| 00:04:32 |
|* 13 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 675K| 12M| | 9412 (2)| 00:01:53 |
|* 15 | TABLE ACCESS FULL | SDP_LOGICAL_ASSET | 1178K| 15M| | 4262 (2)| 00:00:52 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | | 2 (0)| 00:00:01 |
| 17 | VIEW | | 39674 | 232K| | 40101 (2)| 00:08:02 |
|* 18 | HASH JOIN | | 39674 | 1046K| | 40101 (2)| 00:08:02 |
|* 19 | TABLE ACCESS FULL | REGISTER | 39439 | 500K| | 22584 (2)| 00:04:32 |
|* 20 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 21 | TABLE ACCESS FULL | METER_CONFIG_HEADER | 3658 | 47554 | | 19 (0)| 00:00:01 |
|* 22 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 7590 | 68310 | | 112 (2)| 00:00:02 |
Predicate Information (identified by operation id):
2 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
3 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
4 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
5 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
7 - access("ASSET_CD"="EQUIP_CD" AND "SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
8 - filter("ROW_CURRENT_IND"='Y')
9 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
10 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
11 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
12 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='4' OR
SUBSTR("REGISTER_ID_CD",1,1)='5' OR SUBSTR("REGISTER_ID_CD",1,1)='6') AND "ROW_CURRENT_IND"='Y')
13 - filter("ROW_CURRENT_IND"='Y')
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y')
16 - access("NMI_SK"="NMI_SK")
18 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
19 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='1' OR
SUBSTR("REGISTER_ID_CD",1,1)='2' OR SUBSTR("REGISTER_ID_CD",1,1)='3') AND "ROW_CURRENT_IND"='Y')
20 - filter("ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
22 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
ENVIRONMENT B
Plan hash value: 2826260434
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 181 | 103K (2)| 00:20:47 |
| 1 | HASH UNIQUE | | 1 | 181 | 103K (2)| 00:20:47 |
|* 2 | HASH JOIN ANTI | | 1 | 181 | 103K (2)| 00:20:47 |
|* 3 | HASH JOIN | | 1 | 176 | 56855 (2)| 00:11:23 |
|* 4 | HASH JOIN | | 1 | 163 | 36577 (2)| 00:07:19 |
|* 5 | TABLE ACCESS BY INDEX ROWID | ASSET | 1 | 44 | 4 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | 9834 (2)| 00:01:59 |
| 7 | NESTED LOOPS | | 1 | 87 | 9830 (2)| 00:01:58 |
| 8 | NESTED LOOPS | | 1 | 74 | 9825 (2)| 00:01:58 |
|* 9 | HASH JOIN | | 1 | 52 | 9820 (2)| 00:01:58 |
|* 10 | TABLE ACCESS BY INDEX ROWID| METER_CONFIG_HEADER | 1 | 14 | 1 (0)| 00:00:01 |
| 11 | NESTED LOOPS | | 1 | 33 | 116 (2)| 00:00:02 |
|* 12 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 1 | 19 | 115 (2)| 00:00:02 |
|* 13 | INDEX RANGE SCAN | SYS_C00116570 | 1 | | 1 (0)| 00:00:01 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 723K| 13M| 9699 (2)| 00:01:57 |
|* 15 | TABLE ACCESS BY INDEX ROWID | NMI | 1 | 22 | 5 (0)| 00:00:01 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | 2 (0)| 00:00:01 |
|* 17 | TABLE ACCESS BY INDEX ROWID | SDP_LOGICAL_ASSET | 1 | 13 | 5 (0)| 00:00:01 |
|* 18 | INDEX RANGE SCAN | IDX_SLA_SDP_SK | 2 | | 2 (0)| 00:00:01 |
|* 19 | INDEX RANGE SCAN | IDX_A_SAPINTLOGDEV_SK | 2 | | 2 (0)| 00:00:01 |
|* 20 | TABLE ACCESS FULL | REGISTER | 76113 | 2378K| 26743 (2)| 00:05:21 |
|* 21 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
| 22 | VIEW | | 90889 | 443K| 47021 (2)| 00:09:25 |
|* 23 | HASH JOIN | | 90889 | 2307K| 47021 (2)| 00:09:25 |
|* 24 | TABLE ACCESS FULL | REGISTER | 76113 | 966K| 26743 (2)| 00:05:21 |
|* 25 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
Predicate Information (identified by operation id):
2 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
3 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK" AND
"SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
4 - access("ASSET_CD"="EQUIP_CD")
5 - filter("ROW_CURRENT_IND"='Y')
9 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
10 - filter("ROW_CURRENT_IND"='Y')
12 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
13 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
16 - access("NMI_SK"="NMI_SK")
17 - filter("ROW_CURRENT_IND"='Y')
18 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
19 - access("SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
20 - filter((SUBSTR("REGISTER_ID_CD",1,1)='4' OR SUBSTR("REGISTER_ID_CD",1,1)='5' OR
SUBSTR("REGISTER_ID_CD",1,1)='6') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
23 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
24 - filter((SUBSTR("REGISTER_ID_CD",1,1)='1' OR SUBSTR("REGISTER_ID_CD",1,1)='2' OR
SUBSTR("REGISTER_ID_CD",1,1)='3') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
25 - filter("ROW_CURRENT_IND"='Y')
Edited by: abhilash173 on Feb 24, 2013 9:16 PM
Edited by: abhilash173 on Feb 24, 2013 9:18 PM
Hi Paul,
I misread your question initially. The system stats are outdated in both environments (same result, as seen from aux_stats$). I am not a DBA and do not have access to gather fresh system stats.
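For reference, whoever does have DBA access could refresh those null workload values with DBMS_STATS.GATHER_SYSTEM_STATS. A sketch only (it needs the appropriate privilege and should run during a representative load, on both environments):

```sql
-- Start/stop mode: bracket a representative workload window
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('START');
-- ... let the normal workload run for a while ...
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('STOP');

-- Or interval mode: collect automatically for N minutes
EXEC DBMS_STATS.GATHER_SYSTEM_STATS('INTERVAL', interval => 60);
```

That would populate SREADTIM, MREADTIM, CPUSPEED and MBRC, which are exactly the values the CBO falls back on NOWORKLOAD defaults for below.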
select * from sys.aux_stats$
SNAME PNAME PVAL1 PVAL2
SYSSTATS_INFO STATUS NULL COMPLETED
SYSSTATS_INFO DSTART NULL 02-16-2011 15:24
SYSSTATS_INFO DSTOP NULL 02-16-2011 15:24
SYSSTATS_INFO FLAGS 1 NULL
SYSSTATS_MAIN CPUSPEEDNW 1321.20523 NULL
SYSSTATS_MAIN IOSEEKTIM 10 NULL
SYSSTATS_MAIN IOTFRSPEED 4096 NULL
SYSSTATS_MAIN SREADTIM NULL NULL
SYSSTATS_MAIN MREADTIM NULL NULL
SYSSTATS_MAIN CPUSPEED NULL NULL
SYSSTATS_MAIN MBRC NULL NULL
SYSSTATS_MAIN MAXTHR NULL NULL
SYSSTATS_MAIN SLAVETHR NULL NULL -
G/L A/C OUTSTANDING SHOWING CLEARED IN ALL ITEMS ON SAME DATE
Dear Expert
G/L account outstanding documents are showing cleared transactions under the "all items" category
when using the same date in FBL1N.
To illustrate in detail: when we click the "open items" radio button with date 17.11.2009, I get certain uncleared documents; now, when I use the "all items" radio button with the same date, all those documents that were uncleared show as cleared, while in the database the documents appear only in the BSAK table.
What are the causes of this?
Thanks - Viral
The items that you are questioning most likely were cleared after 17.11.2009. When you run FBL1N for open items with key date 17.11.2009, SAP will show you all items that were open as of that date - even if the items have since been cleared. When you run FBL1N for all items, their clearing status is displayed as of the current date. For example, if an item was cleared on 18.11.2009, then it was open as of 17.11.2009, but is cleared as of today.
Regards,
Shannon -
Comparing two Calendar objects for the same Date object
hi,
I am having a problem comparing two Calendar objects that have the same date value but different time stamps. When I printed them with Calendar.getTime(), I found:
Cal1----->Wed Feb 28 20:26:29 IST 2007
Cal2------>Wed Feb 28 00:00:00 IST 2007
Now the condition:
while (cal1.getTime().compareTo(cal2.getTime()) == 0) is evaluating to false.
I don't want to be concerned with the time stamp; I want the condition to be true.
Please help me. Thanks in advance.
Cheers
Akash
My comment was directed at the fact that you gave the same answer I did, only faster. There is no other way I am aware of that will be any quicker. Of course, that is looking at a small piece of code out of context, so there may well be a better way.
~Tim
Message was edited by:
SomeoneElse -
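For the record, the usual way to get Akash's condition to hold is to strip the time-of-day fields before comparing. A sketch (the helper name truncateToDay is my own, not from the thread):

```java
import java.util.Calendar;

public class DateOnlyCompare {
    // Clone the Calendar and zero its time fields so only year/month/day matter
    public static Calendar truncateToDay(Calendar src) {
        Calendar c = (Calendar) src.clone();
        c.set(Calendar.HOUR_OF_DAY, 0);
        c.set(Calendar.MINUTE, 0);
        c.set(Calendar.SECOND, 0);
        c.set(Calendar.MILLISECOND, 0);
        return c;
    }

    public static void main(String[] args) {
        Calendar cal1 = Calendar.getInstance();
        cal1.set(2007, Calendar.FEBRUARY, 28, 20, 26, 29);
        Calendar cal2 = Calendar.getInstance();
        cal2.set(2007, Calendar.FEBRUARY, 28, 0, 0, 0);
        // Equal once the time portion is stripped
        System.out.println(truncateToDay(cal1).getTime()
                .compareTo(truncateToDay(cal2).getTime()) == 0); // true
    }
}
```

Comparing the truncated copies leaves the original Calendar objects untouched, which matters if the time stamps are still needed elsewhere.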
Same set of Records not in the same Data package of the extractor
Hi All,
I have a scenario. While extracting records from ECC, based on some condition, I want to add some more records. To be more clear: based on some condition I want to add additional lines of data by issuing APPEND c_t_data.
For eg.
I have a set of records with the same company code, the same contract, the same delivery leg and different pricing legs.
If the delivery leg and pricing leg are both 1, then I want to add one line of record.
There will be several records with the same company code, contract, delivery leg and pricing leg. In the extraction logic I extract with i_t_data[] = c_t_data[], then sort by company code, contract, delivery and pricing leg, then DELETE ADJACENT DUPLICATES to get one record; based on this record, with some condition, I populate the new line of record my business needs.
My concern is:
if the same set of records overshoots the data package size, how do I handle this? Is there any option?
My data package size is 50,000. Suppose I get a matching set of records (same company code, contract, delivery leg and pricing leg) at the 49,999th record, and there are 10 records with the same characteristics; the extraction will then happen across 2 data packages, the duplicates will be deleted per package, and the logic above will go wrong. How can I handle this scenario? Would a delta-enabled function module help me tackle it? I want to do it only in extraction, as a data source enhancement.
Anil.
Edited by: Anil on Aug 29, 2010 5:56 AM
Hi,
You will have to do an enhancement of the data source.
Please follow the link below.
You can write your logic to add the additional records in the CASE statement for your data source.
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035c402-3d1a-2d10-4380-af8f26b5026f?quicklink=index&overridelayout=true
Hope this will solve your issue. -
Regarding multiple schedule lines in sales order with same date
Hi All,
We create the orders via RFC, and the orders are getting multiple schedule lines for the same date.
period delivery date ordered quan confirmed quantity
D 10/11/2007 10 0
D 10/11/2007 0 10
Here, for the same date, the first line does not confirm the quantity and the second line confirms it.
Under what conditions is this possible?
Any one have the solution for this Please send it to me [email protected]
Regards,
Prasad
Just check the schedule line details, i.e. go to schedule line - shipping. You will find that although the delivery date is the same in both cases, the timings differ; e.g. the material availability time will be different in the two lines. So though the dates may be the same, it is the time difference that causes the second schedule line.
Reward points if useful -
Multiple MM schedule lines for the same date
Hello All Gurus,
I am facing a problem where the MM schedule lines are split into various quantities for the same date. For example, on 02/11/2010 the total quantity is 120, and it appears in the following form in ME38:
02/11/2010 - 40
02/11/2010 - 40
02/11/2010 - 40
Although this is not an error, my client is not happy with it. Can someone tell me why I am getting this behavior? Is it because of the rounding value that we maintain in the MRP 1 view of the material master record, or is there some other setting I am missing?
Please let me know. I would be very much thankful to you.
Thanks and Regards,
Umakanth
Hi,
Before changing the setup, note that it is sometimes beneficial to split one demand into multiple delivery schedules, especially if the demand is bigger than the equipment type (container/truck etc.). If you want to manage inbound transportation for such delivery schedules, it is much simpler when each delivery schedule is no bigger than one piece of equipment.
Regards,
Dominik Modrzejewski