Reload for the historical data
Hi All,
I have an R/3 extractor, 2LIS_04_P_ARBPL, which loads deltas every day. Some time back, on 16-06-09, master data (ZRESOURCE) was changed. Since then the delta loads have run and the extractor has fetched the correct transaction data for the changed master data, i.e. for the new resource. But transaction data for the period prior to 16-06-09 still refers to the old resource, and the client wants that historical data to refer to the new (changed) resource as well.
Please advise me on what I should do. Should I delete the existing data for the prior period and reload the historical data? If so, how should I go about it, and what precautions apply, given that this is a delta load?
Thanks & Regards,
Shreesh.
Hi Shreesh,
We face this kind of situation too. A typical example of such a master-data change is a PL/PC reorganization.
In these cases, if a field picks its value from master data, the master data present at load time is written to the cube, so if the master data changes later you will see both the old and the new values in the cube.
Possible solutions are:
1) Explain the whole scenario to the client; they may be convinced to leave the data as it is.
2) Do a selective deletion and reload the data into the cube.
3) Don't store the value for that field in the cube; instead model it as a navigational attribute. Then the master-data value current at report execution time is displayed in the report output.
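The trade-off between options 2 and 3 can be shown with a toy model (Python, purely illustrative; all names here are made up): a value stored in the cube is frozen at load time, while a navigational attribute is resolved at query time.

```python
# Toy model: stored characteristic vs. navigational attribute.
# All names (resource_master, fact_row) are invented for this sketch.

# Master data as it looked when the transaction was loaded
resource_master = {"WC01": "OLD_RESOURCE"}

# "Store in cube": the value is frozen into the fact row at load time
fact_row = {"work_center": "WC01",
            "resource": resource_master["WC01"],  # snapshot at load
            "qty": 100}

# Master data changes later (the 16-06-09 change in the question)
resource_master["WC01"] = "NEW_RESOURCE"

# The stored value still shows the old resource for historical data
print(fact_row["resource"])                      # OLD_RESOURCE

# "Navigational attribute": resolved at query time, so historical rows
# automatically follow the current master data
print(resource_master[fact_row["work_center"]])  # NEW_RESOURCE
```

This is why option 3 gives the client the behaviour they asked for without reloading history.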
Regards,
Saurabh
Similar Messages
-
Looking for a manual for the Historical Data Viewer Lookout 6.1
Is there a manual on the Historical Data Viewer (Measurement and Automation Explorer)?
I have never logged any analog data, but want to log (to historical data) the battery voltage at a remote site during a power outage.
I have the object established and have a display showing the real-time battery voltage. I have selected logging and have set the resolution and the deviation.
I am stuck when it comes to the Historical Data Viewer. I believe that I need to set up a trace to view the data, but cannot find a manual for the Historical Data Viewer (Measurement and Automation Explorer) so that I can read up on it.
Does anyone have a link to a manual?
Thanks, Alan
You have to configure the Scale parameter of the Pot object in Lookout. The Historical Data Viewer just reads data from the database and shows it; it can't scale it.
But I can't find a way to add a reference line to the view. Maybe this can be a new feature.
Ryan Shi
National Instruments -
How to extract the historical data from R/3
Hi,
I am extracting data from R/3 through LO extraction. The client asked me to enhance the DataSource by adding a field. I have enhanced it and written an exit to populate the data for that field.
How do I extract the historical data into BI for the enhanced field? The delta load is already running in BI.
regards
Hi Satish,
As per SAP standard practice, the best way is to delete all the data from the cube and then reload it from the setup tables, since you have enhanced the data source.
After a data-source enhancement, simply continuing the normal load will not bring any historical data for the new field.
It is best to take downtime from the users; normally we do this on weekends or during non-business hours.
Then fill the setup tables; if the data volume is huge, you can adopt a parallel mechanism, for example:
1. Fill the setup tables year by year as background jobs.
2. Fill the setup tables year by year, with posting periods from 1 Jan to 31 Dec of each year, as background jobs.
This makes the setup-table load easier and faster. After the setup tables are filled, you can unlock all users, as postings are no longer a worry.
After that you can load all the data into BI, first into the PSA and then into the cube.
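The year-wise split described above amounts to cutting the history into per-year selection ranges, one background job each. A sketch of that partitioning (Python, illustrative only; in practice the ranges are entered on the setup-table fill selection screen, not generated in code):

```python
from datetime import date

def yearly_ranges(start, end):
    """Split [start, end] into per-year (from, to) selections,
    e.g. one setup-table fill job per year."""
    ranges = []
    for year in range(start.year, end.year + 1):
        frm = max(start, date(year, 1, 1))
        to = min(end, date(year, 12, 31))
        ranges.append((frm, to))
    return ranges

# Example: three background jobs, one per year of history
for frm, to in yearly_ranges(date(2007, 6, 1), date(2009, 6, 16)):
    print(frm, "-", to)
# 2007-06-01 - 2007-12-31
# 2008-01-01 - 2008-12-31
# 2009-01-01 - 2009-06-16
```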
Regards,
Ravi Kanth. -
Insufficient Disk Space for logging historical data
I am receiving an error prompt which reads "Insufficient disk space for logging historical data," at which point I can no longer log any data. I am using LabVIEW version 6.1. I have deleted all data from my database as well as from my syslog folder (just in case one of them was corrupt), but the problem recurred shortly after this was done. I also changed the location (in DSC) where my data is stored, and this did not work either. Any suggestions?
Greetings,
The first thing you may want to check is that you have enough disk space on the drive you are logging to; running out of HD space would be the direct cause. Assuming that isn't the case, check whether you have any deadbands set up on the tags, or whether the deadbands are set to zero for the tags you are logging; the logging buffer can overflow if the tags are constantly changing and DSC tries to log every value. Lastly, make sure the Tag Engine is still running when you receive the error message. If it is, open it by double-clicking its icon in the system tray (in the bottom right-hand corner of Windows, by the clock) and make sure that "log data" is still checked. Hopefully the solution will present itself by checking these things. Have a good day. -
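The deadband behaviour described in the reply above can be sketched as follows (Python, illustrative only; the function is hypothetical, and DSC's actual logging engine is more involved):

```python
def deadband_filter(samples, deadband):
    """Log a sample only when it moves more than `deadband` away from
    the last logged value. A zero deadband logs every change and can
    flood the logging buffer with noisy signals."""
    logged = []
    last = None
    for value in samples:
        if last is None or abs(value - last) > deadband:
            logged.append(value)
            last = value
    return logged

samples = [12.0, 12.1, 12.4, 12.45, 11.9, 11.9]
print(deadband_filter(samples, deadband=0.3))       # [12.0, 12.4, 11.9]
print(len(deadband_filter(samples, deadband=0.0)))  # 5 (every change logged)
```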
CBO generating different plans for the same data in similar Environments
Hi All
I have been comparing one SQL statement across two different but similar environments built on the same hardware specs. In environment A the query executes in less than 2 minutes, with a plan showing mostly full table scans and hash joins, whereas in environment B (the problematic one) it times out after 2 hours with an "unable to extend tablespace" error. Statistics are up to date in both environments for both tables and indexes, and system parameters are practically identical (Oracle defaults except for multiblock_read_count).
Both environments have the same values for db_file_multiblock_read_count (16), the optimizer parameters (see below), hash_area_size (131072), pga_aggregate_target (1G), db_block_size (8192), etc. SREADTIM, MREADTIM, CPUSPEED and MBRC are all NULL in aux_stats in both environments, I believe because workload statistics were never collected.
Attached are details about the SQL, with table stats and index stats. My main concern is the CBO generating different plans for similar data and statistics on identical hardware and software specs. Is there anything else I should consider? I generally see environment B being very slow, with plans tending toward nested loops and index scans, whereas what we really need in many cases is a sensible FTS. One very surprising thing is that METER_CONFIG_HEADER below, which has just 80 blocks of data, is driven through an index scan.
show parameter optimizer
optimizer_dynamic_sampling integer 2
optimizer_features_enable string 10.2.0.4
optimizer_index_caching integer 0
optimizer_index_cost_adj integer 100
optimizer_mode string ALL_ROWS
optimizer_secure_view_merging boolean TRUE
**Environment**
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Solaris: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production
Note: There are slight differences in the number of records in the attached sheet. However, I did test with exactly the same data and got similar results; I just couldn't retain that data until collecting the details in the attachment.
TEST ITEM 1 COMPARE TABLE LEVEL STATS used by CBO
ENVIRONMENT A
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 3607425 167760 5/02/2013 22:11
METER_CONFIG_HEADER 3658 80 5/01/2013 0:07
METER_CONFIG_ITEM 32310 496 5/01/2013 0:07
NMI 1899024 33557 18/02/2013 10:55
REGISTER 4830153 101504 18/02/2013 9:57
SDP_LOGICAL_ASSET 1607456 19137 18/02/2013 15:48
SDP_LOGICAL_REGISTER 5110781 78691 18/02/2013 9:56
SERVICE_DELIVERY_POINT 1425890 42468 18/02/2013 13:54
ENVIRONMENT B
TABLE_NAME NUM_ROWS BLOCKS LAST_ANALYZED
ASSET 4133939 198570 16/02/2013 10:02
METER_CONFIG_HEADER 3779 80 16/02/2013 10:55
METER_CONFIG_ITEM 33720 510 16/02/2013 10:55
NMI 1969000 33113 16/02/2013 10:58
REGISTER 5837874 120104 16/02/2013 11:05
SDP_LOGICAL_ASSET 1788152 22325 16/02/2013 11:06
SDP_LOGICAL_REGISTER 6101934 91088 16/02/2013 11:07
SERVICE_DELIVERY_POINT 1447589 43804 16/02/2013 11:11
TEST ITEM 2 COMPARE INDEX STATS used by CBO
ENVIRONMENT A
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 9878 67 147 12982 869801 3553095
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 7291 2747 2 639 1755977 3597916
ASSET SYS_C00102592 UNIQUE 2 12488 3733831 1 1 3726639 3733831
METER_CONFIG_HEADER SYS_C0092052 UNIQUE 1 12 3670 1 1 3590 3670
METER_CONFIG_ITEM SYS_C0092074 UNIQUE 1 104 32310 1 1 32132 32310
NMI IDX_NMI_ID NONUNIQUE 2 6298 844853 1 2 1964769 1965029
NMI IDX_NMI_ID_NK NONUNIQUE 2 6701 1923072 1 1 1922831 1923084
NMI IDX_NMI_STATS NONUNIQUE 1 106 4 26 52 211 211
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 12498 795 15 2899 2304831 4711808
REGISTER SYS_C00102653 UNIQUE 2 16942 5065660 1 1 5056855 5065660
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 3667 1607968 1 1 1607689 1607982
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 3811 668727 1 2 1606204 1607982
SDP_LOGICAL_ASSET SYS_C00102665 UNIQUE 2 5116 1529606 1 1 1528136 1529606
SDP_LOGICAL_REGISTER SYS_C00102677 UNIQUE 2 17370 5193638 1 1 5193623 5193638
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 4406 676523 1 2 1423247 1425890
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 7374 676523 1 2 1458238 1461108
SERVICE_DELIVERY_POINT SYS_C00102687 UNIQUE 2 4737 1416207 1 1 1415022 1416207
ENVIRONMENT B
TABLE_NAME INDEX_NAME UNIQUENESS BLEVEL LEAF_BLOCKS DISTINCT_KEYS AVG_LEAF_BLOCKS_PER_KEY AVG_DATA_BLOCKS_PER_KEY CLUSTERING_FACTOR NUM_ROWS
ASSET IDX_AST_DEVICE_CATEGORY_SK NONUNIQUE 2 8606 121 71 16428 1987833 4162257
ASSET IDX_A_SAPINTLOGDEV_SK NONUNIQUE 2 8432 1780146 1 1 2048170 4162257
ASSET SYS_C00116157 UNIQUE 2 13597 4162263 1 1 4158759 4162263
METER_CONFIG_HEADER SYS_C00116570 UNIQUE 1 12 3779 1 1 3734 3779
METER_CONFIG_ITEM SYS_C00116592 UNIQUE 1 107 33720 1 1 33459 33720
NMI IDX_NMI_ID NONUNIQUE 2 6319 683370 1 2 1970460 1971313
NMI IDX_NMI_ID_NK NONUNIQUE 2 6597 1971293 1 1 1970771 1971313
NMI IDX_NMI_STATS NONUNIQUE 1 98 48 2 4 196 196
REGISTER REG_EFFECTIVE_DTM NONUNIQUE 2 15615 1273 12 2109 2685924 5886582
REGISTER SYS_C00116748 UNIQUE 2 19533 5886582 1 1 5845565 5886582
SDP_LOGICAL_ASSET IDX_SLA_SAPINTLOGDEV_SK NONUNIQUE 2 4111 1795084 1 1 1758441 1795130
SDP_LOGICAL_ASSET IDX_SLA_SDP_SK NONUNIQUE 2 4003 674249 1 2 1787987 1795130
SDP_LOGICAL_ASSET SYS_C004520 UNIQUE 2 5864 1795130 1 1 1782147 1795130
SDP_LOGICAL_REGISTER SYS_C004539 UNIQUE 2 20413 6152850 1 1 6073059 6152850
SERVICE_DELIVERY_POINT IDX_SDP_NMI_SK NONUNIQUE 2 3227 660649 1 2 1422572 1447803
SERVICE_DELIVERY_POINT IDX_SDP_SAP_INT_NMI_SK NONUNIQUE 2 6399 646257 1 2 1346948 1349993
SERVICE_DELIVERY_POINT SYS_C00128706 UNIQUE 2 4643 1447946 1 1 1442796 1447946
TEST ITEM 3 COMPARE PLANS
ENVIRONMENT A
Plan hash value: 4109575732
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 13 | 2067 | | 135K (2)| 00:27:05 |
| 1 | HASH UNIQUE | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 2 | HASH JOIN | | 13 | 2067 | | 135K (2)| 00:27:05 |
|* 3 | HASH JOIN | | 6 | 900 | | 135K (2)| 00:27:04 |
|* 4 | HASH JOIN ANTI | | 1 | 137 | | 135K (2)| 00:27:03 |
|* 5 | TABLE ACCESS BY INDEX ROWID| NMI | 1 | 22 | | 5 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | | 95137 (2)| 00:19:02 |
|* 7 | HASH JOIN | | 1 | 109 | | 95132 (2)| 00:19:02 |
|* 8 | TABLE ACCESS FULL | ASSET | 36074 | 1021K| | 38553 (2)| 00:07:43 |
|* 9 | HASH JOIN | | 90361 | 7059K| 4040K| 56578 (2)| 00:11:19 |
|* 10 | HASH JOIN | | 52977 | 3414K| 2248K| 50654 (2)| 00:10:08 |
|* 11 | HASH JOIN | | 39674 | 1782K| | 40101 (2)| 00:08:02 |
|* 12 | TABLE ACCESS FULL | REGISTER | 39439 | 1232K| | 22584 (2)| 00:04:32 |
|* 13 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 675K| 12M| | 9412 (2)| 00:01:53 |
|* 15 | TABLE ACCESS FULL | SDP_LOGICAL_ASSET | 1178K| 15M| | 4262 (2)| 00:00:52 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | | 2 (0)| 00:00:01 |
| 17 | VIEW | | 39674 | 232K| | 40101 (2)| 00:08:02 |
|* 18 | HASH JOIN | | 39674 | 1046K| | 40101 (2)| 00:08:02 |
|* 19 | TABLE ACCESS FULL | REGISTER | 39439 | 500K| | 22584 (2)| 00:04:32 |
|* 20 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 4206K| 56M| | 17490 (2)| 00:03:30 |
|* 21 | TABLE ACCESS FULL | METER_CONFIG_HEADER | 3658 | 47554 | | 19 (0)| 00:00:01 |
|* 22 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 7590 | 68310 | | 112 (2)| 00:00:02 |
Predicate Information (identified by operation id):
2 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
3 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
4 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
5 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
7 - access("ASSET_CD"="EQUIP_CD" AND "SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
8 - filter("ROW_CURRENT_IND"='Y')
9 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
10 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
11 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
12 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='4' OR
SUBSTR("REGISTER_ID_CD",1,1)='5' OR SUBSTR("REGISTER_ID_CD",1,1)='6') AND "ROW_CURRENT_IND"='Y')
13 - filter("ROW_CURRENT_IND"='Y')
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y')
16 - access("NMI_SK"="NMI_SK")
18 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
19 - filter("REGISTER_TYPE_CD"='C' AND (SUBSTR("REGISTER_ID_CD",1,1)='1' OR
SUBSTR("REGISTER_ID_CD",1,1)='2' OR SUBSTR("REGISTER_ID_CD",1,1)='3') AND "ROW_CURRENT_IND"='Y')
20 - filter("ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
22 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
ENVIRONMENT B
Plan hash value: 2826260434
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 181 | 103K (2)| 00:20:47 |
| 1 | HASH UNIQUE | | 1 | 181 | 103K (2)| 00:20:47 |
|* 2 | HASH JOIN ANTI | | 1 | 181 | 103K (2)| 00:20:47 |
|* 3 | HASH JOIN | | 1 | 176 | 56855 (2)| 00:11:23 |
|* 4 | HASH JOIN | | 1 | 163 | 36577 (2)| 00:07:19 |
|* 5 | TABLE ACCESS BY INDEX ROWID | ASSET | 1 | 44 | 4 (0)| 00:00:01 |
| 6 | NESTED LOOPS | | 1 | 131 | 9834 (2)| 00:01:59 |
| 7 | NESTED LOOPS | | 1 | 87 | 9830 (2)| 00:01:58 |
| 8 | NESTED LOOPS | | 1 | 74 | 9825 (2)| 00:01:58 |
|* 9 | HASH JOIN | | 1 | 52 | 9820 (2)| 00:01:58 |
|* 10 | TABLE ACCESS BY INDEX ROWID| METER_CONFIG_HEADER | 1 | 14 | 1 (0)| 00:00:01 |
| 11 | NESTED LOOPS | | 1 | 33 | 116 (2)| 00:00:02 |
|* 12 | TABLE ACCESS FULL | METER_CONFIG_ITEM | 1 | 19 | 115 (2)| 00:00:02 |
|* 13 | INDEX RANGE SCAN | SYS_C00116570 | 1 | | 1 (0)| 00:00:01 |
|* 14 | TABLE ACCESS FULL | SERVICE_DELIVERY_POINT | 723K| 13M| 9699 (2)| 00:01:57 |
|* 15 | TABLE ACCESS BY INDEX ROWID | NMI | 1 | 22 | 5 (0)| 00:00:01 |
|* 16 | INDEX RANGE SCAN | IDX_NMI_ID_NK | 2 | | 2 (0)| 00:00:01 |
|* 17 | TABLE ACCESS BY INDEX ROWID | SDP_LOGICAL_ASSET | 1 | 13 | 5 (0)| 00:00:01 |
|* 18 | INDEX RANGE SCAN | IDX_SLA_SDP_SK | 2 | | 2 (0)| 00:00:01 |
|* 19 | INDEX RANGE SCAN | IDX_A_SAPINTLOGDEV_SK | 2 | | 2 (0)| 00:00:01 |
|* 20 | TABLE ACCESS FULL | REGISTER | 76113 | 2378K| 26743 (2)| 00:05:21 |
|* 21 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
| 22 | VIEW | | 90889 | 443K| 47021 (2)| 00:09:25 |
|* 23 | HASH JOIN | | 90889 | 2307K| 47021 (2)| 00:09:25 |
|* 24 | TABLE ACCESS FULL | REGISTER | 76113 | 966K| 26743 (2)| 00:05:21 |
|* 25 | TABLE ACCESS FULL | SDP_LOGICAL_REGISTER | 5095K| 63M| 20245 (2)| 00:04:03 |
Predicate Information (identified by operation id):
2 - access("SERVICE_DELIVERY_POINT_SK"="TMP"."SERVICE_DELIVERY_POINT_SK")
3 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK" AND
"SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
4 - access("ASSET_CD"="EQUIP_CD")
5 - filter("ROW_CURRENT_IND"='Y')
9 - access("NETWORK_TARIFF_CD"="NETWORK_TARIFF_CD")
10 - filter("ROW_CURRENT_IND"='Y')
12 - filter("ROW_CURRENT_IND"='Y' AND "CONROL_REGISTER"='X')
13 - access("METER_CONFIG_HEADER_SK"="METER_CONFIG_HEADER_SK")
14 - filter("ROW_CURRENT_IND"='Y')
15 - filter("ROW_CURRENT_IND"='Y' AND ("NMI_STATUS_CD"='A' OR "NMI_STATUS_CD"='D'))
16 - access("NMI_SK"="NMI_SK")
17 - filter("ROW_CURRENT_IND"='Y')
18 - access("SERVICE_DELIVERY_POINT_SK"="SERVICE_DELIVERY_POINT_SK")
19 - access("SAP_INT_LOG_DEVICE_SK"="SAP_INT_LOG_DEVICE_SK")
20 - filter((SUBSTR("REGISTER_ID_CD",1,1)='4' OR SUBSTR("REGISTER_ID_CD",1,1)='5' OR
SUBSTR("REGISTER_ID_CD",1,1)='6') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
21 - filter("ROW_CURRENT_IND"='Y')
23 - access("SAP_INT_LOGICAL_REGISTER_SK"="SAP_INT_LOGICAL_REGISTER_SK")
24 - filter((SUBSTR("REGISTER_ID_CD",1,1)='1' OR SUBSTR("REGISTER_ID_CD",1,1)='2' OR
SUBSTR("REGISTER_ID_CD",1,1)='3') AND "REGISTER_TYPE_CD"='C' AND "ROW_CURRENT_IND"='Y')
25 - filter("ROW_CURRENT_IND"='Y')
Hi Paul,
I misread your question initially. The system stats are outdated in both environments (same result, as seen from aux_stats$). I am not a DBA and do not have access to gather fresh system stats.
select * from sys.aux_stats$
SNAME PNAME PVAL1 PVAL2
SYSSTATS_INFO STATUS NULL COMPLETED
SYSSTATS_INFO DSTART NULL 02-16-2011 15:24
SYSSTATS_INFO DSTOP NULL 02-16-2011 15:24
SYSSTATS_INFO FLAGS 1 NULL
SYSSTATS_MAIN CPUSPEEDNW 1321.20523 NULL
SYSSTATS_MAIN IOSEEKTIM 10 NULL
SYSSTATS_MAIN IOTFRSPEED 4096 NULL
SYSSTATS_MAIN SREADTIM NULL NULL
SYSSTATS_MAIN MREADTIM NULL NULL
SYSSTATS_MAIN CPUSPEED NULL NULL
SYSSTATS_MAIN MBRC NULL NULL
SYSSTATS_MAIN MAXTHR NULL NULL
SYSSTATS_MAIN SLAVETHR NULL NULL -
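Since SREADTIM, MREADTIM, CPUSPEED and MBRC are all NULL above, 10.2 falls back to noworkload costing, deriving read times from IOSEEKTIM, IOTFRSPEED and the block size. A rough sketch of that derivation using the values shown (Python; the formulas follow the commonly described noworkload model and are an approximation, not Oracle's exact implementation):

```python
# Noworkload CBO costing sketch using the aux_stats$ values above.
ioseektim = 10.0      # ms, from SYSSTATS_MAIN
iotfrspeed = 4096.0   # bytes/ms, from SYSSTATS_MAIN
block_size = 8192.0   # bytes, db_block_size from the question
mbrc = 16             # db_file_multiblock_read_count from the question

sreadtim = ioseektim + block_size / iotfrspeed          # single-block read, ms
mreadtim = ioseektim + mbrc * block_size / iotfrspeed   # multiblock read, ms

# Approximate full-scan I/O cost for METER_CONFIG_HEADER (80 blocks)
blocks = 80
fts_cost = (blocks / mbrc) * (mreadtim / sreadtim)
print(round(sreadtim, 1), round(mreadtim, 1), round(fts_cost, 1))
# 12.0 42.0 17.5
```

With these numbers a multiblock read costs only about 3.5 single-block reads, so a cheap full scan of an 80-block table should normally win; if it doesn't, the index-versus-scan cost comparison is worth examining in a 10053 trace.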
Multiple MM schedule lines for the same date
Hello All Gurus,
I am facing a problem where the MM schedule lines are divided into several quantities for the same date. For example, on 02/11/2010 the total quantity is 120, and it appears in ME38 in the form below:
02/11/2010 - 40
02/11/2010 - 40
02/11/2010 - 40
Although this is not an error, my client is not happy with it. Can someone tell me why this happens? Is it because of the rounding value that we maintain in the MRP 1 view of the material master record, or is there another setting that I am missing?
Please let me know. I would be very much thankful to you.
Thanks and Regards,
Umakanth
Hi,
Before changing the setup, note that it is sometimes beneficial to split one demand into multiple delivery schedules, especially if the demand is bigger than the equipment type (container/truck etc.). If you want to manage inbound transportation for such delivery schedules, it is much simpler when each delivery schedule is no bigger than one piece of equipment.
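The splitting the original poster describes (120 appearing as 40 + 40 + 40) is exactly what a rounding value of 40 would produce. A minimal sketch of that behaviour (Python, illustrative only; real MRP lot-sizing logic is far richer than this):

```python
def split_by_rounding(total_qty, rounding_value):
    """Split a requirement into schedule lines of at most
    `rounding_value` each, as a rounding/lot-size value in the
    material master (MRP 1 view) typically would."""
    lines = []
    remaining = total_qty
    while remaining > 0:
        qty = min(rounding_value, remaining)
        lines.append(qty)
        remaining -= qty
    return lines

print(split_by_rounding(120, 40))  # [40, 40, 40], all on the same date
```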
Regards,
Dominik Modrzejewski -
How to use Add Query Criteria for the MySQL data Base in Netbeans ?
How to use Add Query Criteria for the MySQL data Base in Netbeans Visual web pack.
When the Query Criteria is add like
SELECT ALL counselors.counselors_id, counselors.first_name, counselors.telephone,counselors.email
FROM counselors WHERE counselors.counselors_id = ?
When I run this query in the query window, I get an error message box saying:
Query Processing Error: Parameter metadata not available for the given statement
If I run the query without the query criteria, it works fine.
I am glad I am not the only one who has this problem. Part of the issue has been described above, but there is something more in my case.
Whenever I try to call ****_tabRowSet.setObject(1, userDropList.getSelected()); I get the error message shown below:
The Java codes are:
public void dropDown1_processValueChange(ValueChangeEvent event) {
    Object s = this.dropDown1.getSelected();
    try {
        this.User_tabDataProvider1.setCursorRow(this.User_tabDataProvider1.findFirst("User_Tab.User_ID", s));
        this.getSessionBean1().getTrip_tabRowSet1().setObject(1, s);
        this.Trip_tabDataProvider1.refresh();
    } catch (Exception e) {
        this.log("Error: ", e);
        this.error("Error: Cannot select user" + e.getMessage());
    }
}
SQL statement for Trip_tabRowSet:
SELECT ALL Trip_Tab.Trip_Date,
Trip_Tab.User_ID,
Trip_Tab.Destination
FROM Trip_Tab
WHERE Trip_Tab.User_ID = ?
Error messages are shown below:
phase(RENDER_RESPONSE 6,com.sun.faces.context.FacesContextImpl@5abf3f) threw exception: com.sun.rave.web.ui.appbase.ApplicationException: java.sql.SQLException: No value specified for parameter 1 java.sql.SQLException: No value specified for parameter 1
com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.cleanup(ViewHandlerImpl.java:559)
com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.afterPhase(ViewHandlerImpl.java:435)
com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:274)
com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:140)
javax.faces.webapp.FacesServlet.service(FacesServlet.java:245)
org.apache.catalina.core.ApplicationFilterChain.servletService(ApplicationFilterChain.java:397)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
com.sun.webui.jsf.util.UploadFilter.doFilter(UploadFilter.java:240)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
org.netbeans.modules.web.monitor.server.MonitorFilter.doFilter(MonitorFilter.java:368)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:276)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:240)
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:179)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:73)
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:182)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
com.sun.enterprise.web.VirtualServerPipeline.invoke(VirtualServerPipeline.java:120)
org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:137)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:239)
com.sun.enterprise.web.connector.grizzly.ProcessorTask.invokeAdapter(ProcessorTask.java:667)
com.sun.enterprise.web.connector.grizzly.ProcessorTask.processNonBlocked(ProcessorTask.java:574)
com.sun.enterprise.web.connector.grizzly.ProcessorTask.process(ProcessorTask.java:844)
com.sun.enterprise.web.connector.grizzly.ReadTask.executeProcessorTask(ReadTask.java:287)
com.sun.enterprise.web.connector.grizzly.ReadTask.doTask(ReadTask.java:212)
com.sun.enterprise.web.connector.grizzly.TaskBase.run(TaskBase.java:252)
com.sun.enterprise.web.connector.grizzly.WorkerThread.run(WorkerThread.java:75)
tandardWrapperValve[Faces Servlet]: Servlet.service() for servlet Faces Servlet threw exception
java.sql.SQLException: No value specified for parameter 1
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:910)
at com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:1674)
at com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:1622)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1332)
at com.sun.sql.rowset.internal.CachedRowSetXReader.readData(CachedRowSetXReader.java:193)
at com.sun.sql.rowset.CachedRowSetXImpl.execute(CachedRowSetXImpl.java:979)
at com.sun.sql.rowset.CachedRowSetXImpl.execute(CachedRowSetXImpl.java:1439)
at com.sun.data.provider.impl.CachedRowSetDataProvider.checkExecute(CachedRowSetDataProvider.java:1274)
at com.sun.data.provider.impl.CachedRowSetDataProvider.setCursorRow(CachedRowSetDataProvider.java:335)
at com.sun.data.provider.impl.CachedRowSetDataProvider.setCursorIndex(CachedRowSetDataProvider.java:306)
at com.sun.data.provider.impl.CachedRowSetDataProvider.getRowCount(CachedRowSetDataProvider.java:639)
at com.sun.webui.jsf.component.TableRowGroup.getRowKeys(TableRowGroup.java:1236)
at com.sun.webui.jsf.component.TableRowGroup.getFilteredRowKeys(TableRowGroup.java:820)
at com.sun.webui.jsf.component.TableRowGroup.getRowCount(TableRowGroup.java:1179)
at com.sun.webui.jsf.component.Table.getRowCount(Table.java:831)
at com.sun.webui.jsf.renderkit.html.TableRenderer.renderTitle(TableRenderer.java:420)
at com.sun.webui.jsf.renderkit.html.TableRenderer.encodeBegin(TableRenderer.java:143)
at javax.faces.component.UIComponentBase.encodeBegin(UIComponentBase.java:810)
at com.sun.webui.jsf.component.Table.encodeBegin(Table.java:1280)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:881)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at com.sun.faces.application.ViewHandlerImpl.doRenderView(ViewHandlerImpl.java:271)
at com.sun.faces.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:182)
at com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.renderView(ViewHandlerImpl.java:285)
at com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:133)
at com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:244)
at com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:140)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:245)
at org.apache.catalina.core.ApplicationFilterChain.servletService(ApplicationFilterChain.java:397)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
at com.sun.webui.jsf.util.UploadFilter.doFilter(UploadFilter.java:240)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
at org.netbeans.modules.web.monitor.server.MonitorFilter.doFilter(MonitorFilter.java:368)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:276)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
at org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:240)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:179)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:73)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:182)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at com.sun.enterprise.web.VirtualServerPipeline.invoke(VirtualServerPipeline.java:120)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:137)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
at org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:239)
at com.sun.enterprise.web.connector.grizzly.ProcessorTask.invokeAdapter(ProcessorTask.java:667)
at com.sun.enterprise.web.connector.grizzly.ProcessorTask.processNonBlocked(ProcessorTask.java:574)
at com.sun.enterprise.web.connector.grizzly.ProcessorTask.process(ProcessorTask.java:844)
at com.sun.enterprise.web.connector.grizzly.ReadTask.executeProcessorTask(ReadTask.java:287)
at com.sun.enterprise.web.connector.grizzly.ReadTask.doTask(ReadTask.java:212)
at com.sun.enterprise.web.connector.grizzly.TaskBase.run(TaskBase.java:252)
at com.sun.enterprise.web.connector.grizzly.WorkerThread.run(WorkerThread.java:75)
Also, when I tried to update my MySQL Connector/J driver to version 5.1.5 from 5.0.5 (NetBeans 5.5.1) and 5.0.7 (NetBeans 6.1), I could not get it to work (a very long time searching for some JDBC classes, with no response in the end) on either of my NetBeans 5.5.1 (PC) and NetBeans 6.1 (laptop) IDEs.
Could anybody look into this issue?
Many thanks
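The root cause shown in the stack trace ("No value specified for parameter 1") is that the RowSet's parameterized query is executed before its placeholder is bound. The same failure mode can be reproduced with any parameterized database API; here is a minimal sketch using Python's sqlite3 module (illustrative only; the table and data are made up):

```python
import sqlite3

# Reproducing the "no value for parameter 1" failure mode.
# sqlite3 is used purely for illustration; MySQL Connector/J raises an
# analogous SQLException for the same reason.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trip_tab (user_id TEXT, destination TEXT)")
conn.execute("INSERT INTO trip_tab VALUES ('u1', 'Oslo')")

query = "SELECT destination FROM trip_tab WHERE user_id = ?"

try:
    conn.execute(query).fetchall()  # placeholder never bound -> error
except sqlite3.ProgrammingError as e:
    print("fails without a parameter:", e)

# Binding the parameter first (the setObject(1, ...) step) fixes it
rows = conn.execute(query, ("u1",)).fetchall()
print(rows)  # [('Oslo',)]
```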
-
Dependent master data for the Transaction data
Hi,
Wish u Happy New Year to All.
I schedule some 250 master-data loads daily and then run the transaction-data loads. Because of this, system performance is affected. Where can I find the dependent master-data objects for a particular transaction-data load?
Thanks in Advance
Regards,
Siv
Dear Siva,
The InfoSource overview for the particular data target shows the dependent master-data objects.
Regards
Saravanan.ar -
Data Sources for the Master data of FI-CO,MM and SD from R3
Hi Gurus ,
Could you please tell me the main DataSources for FI-CO, MM and SD master data that need to be loaded from R/3 in a new BI implementation?
Could you please guide me step by step .
thanks in advance ,
Pratham
Hi,
It varies by client and project, but the most common SD and MM DataSources come from the LO Cockpit.
help.sap.com is the best source for this
Hope it helps -
How to see the historic data of CAT2
Dear All,
I have a question related to CATS.
From which table can I get the historical data of an employee as stored in transaction CAT2? I tried CATSDB but was not able to get the number of working hours as stored in CAT2.
I clicked on a field in CAT2 and checked the technical details. There I found the table name CATD, but when I run SE11 and enter this name, it is displayed as a structure, not a table.
Please provide your help in this regard.
Regards,
-Neha
The maximum number of columns allowed is 1023, so if there are more fields than that, I am afraid there is no way to see them all on one screen.
The better thing to do is to use the Settings > Format List > Choose Fields option from the selection screen of SE16 to choose just the fields you want in the output.
It is highly unlikely that you are using all 100 fields, so you can very well hide a few of them with no impact on your output.
As someone else has suggested, using a report such as CATSXT_DA will definitely be a much more useful way of viewing all relevant fields from CATSDB. -
[HELP] Error: "JDBC theme based FOI support is disabled for the given data"
Hi all,
I have already set up MapViewer version 10.1.3.3 on BISE1 10g OC4J server. I am current using JDK 1.6. I create a mvdemo/mvdemo user for demo data.
The MapViewer demos without charts run fine, but I get this error with some maps that have charts, like "Dynamic theme, BI data and Pie chart style" and "Dynamic theme and dynamic Bar chart style". The error is:
----------ERROR------------
Cannot process the FOI response from MapViewer server. Server message is: <?xml version="1.0" encoding="UTF-8" ?> <oms_error> MAPVIEWER-06009: Error processing an FOI request. Root cause: FOIServlet:MAPVIEWER-06016: JDBC theme based FOI support is disabled for the given data source. [mvdemo]</oms_error>.
----------END ERROR------
I searched many threads on this forum; some point to setting allow_jdbc_theme_based_foi="true" in mapViewerConfig.xml and restarting MapViewer.
<map_data_source name="mvdemo"
jdbc_host="localhost"
jdbc_sid="bise1db"
jdbc_port="1521"
jdbc_user="mvdemo"
jdbc_password="mvdemo"
jdbc_mode="thin"
number_of_mappers="3"
allow_jdbc_theme_based_foi="true"
/>
Error image: http://i264.photobucket.com/albums/ii176/necrombi/misc/jdbcerror.png
I have this configuration, but no luck. Could anyone show me how to resolve this problem?
Rgds,
Dung Nguyen
Oops, I managed to get this problem resolved!
My problem may have come from the fact that I used both the scott and mvdemo schemas for keeping the demo data (imported demo data).
The steps I took to resolve it were:
1) Undeploy MapViewer from Application Server Control (http://localhost:9704/em in my case)
2) Drop user mvdemo
3) Download mapviewer kit from Oracle Fusion Middleware MapViewer & Map Builder Version 11.1.1.2
4) Deploy MapViewer again
5) Recreate mvdemo and import demo data
6) Run mcsdefinition.sql, mvdemo.sql with mvdemo user (granted dba already)
7) Edit mapViewerConfig.xml
<map_data_source name="mvdemo"
jdbc_host="dungnguyen-pc"
jdbc_sid="bise1db"
jdbc_port="1521"
jdbc_user="mvdemo"
jdbc_password="!mvdemo"
jdbc_mode="thin"
number_of_mappers="3"
allow_jdbc_theme_based_foi="true"
/>
Save & Restart MapViewer
And now, all demos run fine. Hope this helpful for who meet prob like my case. -
Min,Max for a time Range and Sum of Records for the last date/time
Hi All,
I have a table with the following structure/data:
create table Events (
    [EventID]    int NOT NULL,
    [Title]      nvarchar(200) NOT NULL,
    [SourceName] nvarchar(20) NOT NULL,
    [Type]       int NOT NULL,
    [eDate]      datetime
)
insert into Events values(100, 'Event 1', 'S01', 3,'2014-01-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S07', 3,'2014-01-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S08', 3,'2014-01-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S09', 3,'2014-01-01 00:00:00.000')
insert into Events values(101, 'Event 2', 'S010', 3,'2014-01-01 00:00:00.000')
insert into Events values(102, 'Event 3', 'S03', 3,'2014-01-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S04', 3,'2014-01-01 00:00:00.000')
insert into Events values(101, 'Event 2', 'S05', 3,'2014-01-01 00:00:00.000')
insert into Events values(102, 'Event 3', 'S06', 3,'2014-01-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S01', 3,'2014-02-01 00:00:00.000')
insert into Events values(101, 'Event 2', 'S02', 3,'2014-02-01 00:00:00.000')
insert into Events values(102, 'Event 3', 'S03', 3,'2014-02-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S04', 3,'2014-02-01 00:00:00.000')
insert into Events values(101, 'Event 2', 'S05', 3,'2014-02-01 00:00:00.000')
insert into Events values(102, 'Event 3', 'S06', 3,'2014-02-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S01', 3,'2014-03-01 00:00:00.000')
insert into Events values(101, 'Event 2', 'S02', 3,'2014-03-01 00:00:00.000')
insert into Events values(102, 'Event 3', 'S03', 3,'2014-03-01 00:00:00.000')
insert into Events values(100, 'Event 1', 'S04', 3,'2014-03-01 00:00:00.000')
insert into Events values(101, 'Event 2', 'S05', 3,'2014-03-01 00:00:00.000')
insert into Events values(102, 'Event 3', 'S06', 3,'2014-03-01 00:00:00.000')
And I wrote the following query:
select EventID as [Event ID],
Title,
count(distinct(SourceName)) as [Instances],
Type,
min(eDate) as [First Detected],
max(eDate) as [Last Detected],
datediff(d,min(eDate),max(eDate)) as [Delta (days)]
from Events
where type = 3
group by EventID, Title, Type
having max(eDate) <> min(eDate)
and max(eDate) =(select top 1 eDate from Events order by eDate desc)
and I get the following results (see the instance number)
Event ID  Title    Instances  Type  First Detected           Last Detected            Delta (days)
====================================================================================================
100       Event 1  5          3     2014-01-01 00:00:00.000  2014-03-01 00:00:00.000  59
101       Event 2  3          3     2014-01-01 00:00:00.000  2014-03-01 00:00:00.000  59
102       Event 3  2          3     2014-01-01 00:00:00.000  2014-03-01 00:00:00.000  59
This is normal for this query; however, what I need is a little different.
In other words, while I still need to show when a specific event was recorded for the first and last time,
the instance count should reflect only the last date/time when it was recorded.
For example what I need to provide should look like this:
Event ID  Title    Instances  Type  First Detected           Last Detected            Delta (days)
====================================================================================================
100       Event 1  2          3     2014-01-01 00:00:00.000  2014-03-01 00:00:00.000  59
101       Event 2  2          3     2014-01-01 00:00:00.000  2014-03-01 00:00:00.000  59
102       Event 3  2          3     2014-01-01 00:00:00.000  2014-03-01 00:00:00.000  59
Could you please help me to fix this query?
TIA,
John
With cte As
(Select EventID as [Event ID],
Title,
SourceName,
Type,
min(eDate) Over(Partition By EventID, Title, Type) as [First Detected],
max(eDate) Over(Partition By EventID, Title, Type) as [Last Detected],
eDate,
datediff(d,min(eDate) Over(Partition By EventID, Title, Type),max(eDate) Over(Partition By EventID, Title, Type)) as [Delta (days)],
max(eDate) Over() As MaxEDate
from Events
where type = 3)
Select [Event ID],
Title,
COUNT(Distinct SourceName) As Instances,
Type,
[First Detected],
[Last Detected],
[Delta (days)]
From cte
Where eDate = MaxEDate And [First Detected] <> [Last Detected]
Group By [Event ID],
Title,
Type,
[First Detected],
[Last Detected],
[Delta (days)];
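This windowed approach can be sanity-checked outside SQL Server. Below is a minimal sketch using Python's built-in sqlite3 module (window functions need SQLite 3.25+), with a reduced subset of the thread's rows and ISO date strings standing in for datetime; the column aliases are mine, not from the thread:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Events (
    EventID    INTEGER NOT NULL,
    Title      TEXT    NOT NULL,
    SourceName TEXT    NOT NULL,
    Type       INTEGER NOT NULL,
    eDate      TEXT
);
-- a reduced subset of the thread's sample rows
INSERT INTO Events VALUES
    (100,'Event 1','S07',3,'2014-01-01'),
    (101,'Event 2','S010',3,'2014-01-01'),
    (102,'Event 3','S06',3,'2014-01-01'),
    (100,'Event 1','S01',3,'2014-03-01'),
    (100,'Event 1','S04',3,'2014-03-01'),
    (101,'Event 2','S02',3,'2014-03-01'),
    (101,'Event 2','S05',3,'2014-03-01'),
    (102,'Event 3','S03',3,'2014-03-01'),
    (102,'Event 3','S06',3,'2014-03-01');
""")

rows = conn.execute("""
WITH cte AS (
    SELECT EventID, Title, SourceName, Type, eDate,
           MIN(eDate) OVER (PARTITION BY EventID, Title, Type) AS FirstDetected,
           MAX(eDate) OVER (PARTITION BY EventID, Title, Type) AS LastDetected,
           MAX(eDate) OVER ()                                  AS MaxEDate
    FROM Events
    WHERE Type = 3
)
SELECT EventID, Title,
       COUNT(DISTINCT SourceName) AS Instances,  -- counts sources on the last date only
       Type, FirstDetected, LastDetected,
       CAST(julianday(LastDetected) - julianday(FirstDetected) AS INTEGER) AS DeltaDays
FROM cte
WHERE eDate = MaxEDate AND FirstDetected <> LastDetected
GROUP BY EventID, Title, Type, FirstDetected, LastDetected
ORDER BY EventID
""").fetchall()

for r in rows:
    print(r)
```

Because the WHERE clause keeps only rows from the overall last date while the window functions were computed over all dates, each event reports 2 instances here, matching the desired output shape.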
Tom
P.S. Thanks for providing the DDL and data. That is always very helpful. -
Table name for the transaction data
Hello Gurus,
what is the table name for the transaction data which we have created through t-code PR05? We are generating a report of the total kms per vehicle type for an employee. In which tables can I find this data, i.e. the data related to trips?
thanks in advance
regds
ramachandra
Hi,
Please use the RPR_TRIP_HEADER_DATA program to get general trip data; it has details such as the number of KM travelled by the employee for each trip, the amount to be reimbursed to the employee, etc.
Thanks,
Nandagopal C -
JCo connection for the application data
Dear All,
Now we are configuring the BP for MSS. When we set the JCo connection for the application data, we selected the "Ticket" method, following this link: http://help.sap.com/saphelp_erp2005/helpdata/en/29/d7844205625551e10000000a1550b0/frameset.htm
When we run the page in the "Team" area, we find the following errors:
- You do not have the authorization to start service sap.com/pcui_gp~isr/IsrStatusoverview
- You do not have the authorization to start service sap.com/mssreccand/CandidateStatusOverview
- Pages about "Employee Information" display Blank iViews.
But if we set the JCo connection for application data using the "User/Password" method (with a user ID that has the "SAP_ALL" profile in the back-end system), the application works fine.
Kindly advise us which method should be used for the JCo connection (Ticket or User/Password).
If the method should be "User/Password", please advise the authorization and user type for the user ID that is mapped to the JCo connection.
Thank you very much,
Anek.
Hi,
Go through the contents at:
http://help.sap.com/saphelp_ep60sp2/helpdata/en/8f/67d27676ace84080964d4c4223bb3c/content.htm
Hope this helps !
Regards
Srinivasan T -
BPS You have no authorization for the requested data
We are implementing Hierarchy node based security for our BPS.
When the user tries to display the planning layout, they get the error message "You have no authorization for the requested data "
I have given authorization to the relevant InfoCubes, checked all the authorization-relevant InfoObjects, and added these InfoObjects to the custom authorization created in RSECADMIN.
I also added the InfoObjects 0TCAACTVT, 0TCAIPROV, 0TCAVALID to the custom authorization.
In PFCG, this authorization has been added to S_RS_AUTH. I have also given activity values 02, 03, 16 and a * for planning areas, functions, packages, groups, levels, folders, ... to the objects R_AREA
R_BUNDLE
R_METHOD
R_PACKAGE
R_PARAM
R_PLEVEL
R_PM_NAME
R_PROFILE
But still we get the same error.
Has anyone encountered this problem? Can you please provide me some clues to resolve this issue?
Thank you very much Grevaz, but that template does not help.
I ran both the ST01 trace and the BI RSECADMIN trace. The RSECADMIN trace shows the authorization failure below:
Subselection (Technical SUBNR) 1
Supplementation of Selection for Aggregated Characteristics
No Check for Aggregation Authorization Required
Following Set Is Checked Comparison with Following Authorized Set Result Remaining Quantity
Characteristic Contents
0FUNDS_CTR
0TCAACTVT
SQL Format:
FUNDS_CTR BETWEEN '4012001000'
AND '4012001999'
AND TCAACTVT = '03'
Characteristic Contents
0FUNDS_CTR Node 1 I EQ #
I EQ :
0TCAACTVT I EQ 02
I EQ 03
Partially Authorized (Average) Characteristic Contents
0FUNDS_CTR
0TCAACTVT
SQL Format:
FUNDS_CTR > '4012001000'
AND FUNDS_CTR <= '4012001999'
AND NOT FUNDS_CTR IN ('4012001001','4012001002','4012001003','4012001004','4012001005','4012001006','4012001007','4012001008','4012001009','4012001010')
AND TCAACTVT = '03'
Value selection partially authorized. Check of remainder at end
Following Set Is Checked Comparison with Following Authorized Set Result Remaining Quantity
Characteristic Contents
0FUNDS_CTR
0TCAACTVT
SQL Format:
FUNDS_CTR > '4012001000'
AND FUNDS_CTR <= '4012001999'
AND NOT FUNDS_CTR IN ('4012001001','4012001002','4012001003','4012001004','4012001005','4012001006','4012001007','4012001008','4012001009','4012001010')
AND TCAACTVT = '03'
Characteristic Contents
0FUNDS_CTR Node 1 I EQ #
I EQ :
0TCAACTVT I EQ 02
I EQ 03
Not Authorized
All Authorizations Tested
Message EYE007: You do not have sufficient authorization
No Sufficient Authorization for This Subselection (SUBNR)
Following CHANMIDs Are Affected:
206 ( 0FUNDS_CTR )
Authorization Check Complete
We have created custom authorization and trying to restrict based on hierarchy node.
One point I observed is that when I give access to all nodes with a wildcard * in the custom authorization, the error disappears and the layout is visible. But our point here is to restrict based on the nodes, and we cannot give display access to all nodes.