Query issue with exception aggregation
Dear all,
I have to solve the following reporting issue with BEx:
Cube Structure:
Cube A:
Characteristics: Company Code, Article, Distribution Channel, Customer, FiscalYear/Period
Key-Figures: Sales Val., Sales Qty.
Cube B:
Characteristics: Company Code, Article, FiscalYear/Period
Key-Figures: COGS
I simply want to multiply: Sales Qty@COGS = NODIM(Sales Qty) * COGS,
but this calculation should be valid for all characteristics of Cube A, even those that are not available in Cube B (like Customer and Distribution Channel). Additionally, the calculated totals of my characteristics must be correct, so that in a second step I can apply a margin calculation, Sales Val. - Sales Qty@COGS, which is valid at the single level as well as at the total level.
I started to set up calculated key figures:
COGS1 = NODIM(Sales Qty) * COGS with Exception aggregation TOTAL by Company Code
COGS2 = COGS1 with Exception Aggregation TOTAL by Article
and this worked fine for both characteristics. If I use COGS2 in my report I get correct figures in a drilldown by Company Code and Article.
I enhanced the calculation by
COGS3 = COGS2 with Exception Aggregation TOTAL by Distribution Channel, but the result in this case is 0. I guess the result is 0 because the characteristic Distribution Channel is not available in Cube B.
Any ideas how to solve this? Is there perhaps a different (more elegant) approach? I fear that with all these exception aggregations my query runtime/resource consumption will be awful.
Thanks for any idea,
Andreas
Hi,
You should define a new selection for COGS with Constant Selection on Distribution Channel, as described in the following link for PRICE with CUSTOMER:
[http://help.sap.com/saphelp_nw70/helpdata/en/46/91f0f090ea13e8e10000000a155369/content.htm]
and then apply your formulas....
hope it will solve the problem...
Regards.
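The effect of the constant-selection approach can be sanity-checked outside BEx. Below is an illustrative Python sketch (field names and sample values are invented, not from either cube): COGS, which exists only at (Company Code, Article, Period) granularity, is broadcast across Cube A rows that also carry Customer and Distribution Channel, so per-row products and totals stay consistent.

```python
# Cube A: one row per (company, article, channel, customer, period)
cube_a = [
    {"comp": "1000", "art": "A1", "dc": "10", "cust": "C1", "per": "001.2010", "qty": 5},
    {"comp": "1000", "art": "A1", "dc": "20", "cust": "C2", "per": "001.2010", "qty": 3},
    {"comp": "1000", "art": "A2", "dc": "10", "cust": "C1", "per": "001.2010", "qty": 2},
]

# Cube B: COGS per unit, keyed only by (company, article, period)
cogs = {
    ("1000", "A1", "001.2010"): 4.0,
    ("1000", "A2", "001.2010"): 7.0,
}

# "Constant selection" effect: every Cube A row finds its COGS regardless of
# the characteristics (dc, cust) that Cube B does not know about.
def qty_at_cogs(rows, cogs):
    out = []
    for r in rows:
        unit = cogs.get((r["comp"], r["art"], r["per"]), 0.0)
        out.append({**r, "qty_at_cogs": r["qty"] * unit})
    return out

rows = qty_at_cogs(cube_a, cogs)
total = sum(r["qty_at_cogs"] for r in rows)
print(total)  # 5*4 + 3*4 + 2*7 = 46.0
```

Because the multiplication happens row by row before any totaling, the drilldown values and the overall total agree, which is what the margin calculation needs.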
Similar Messages
-
Query Performance with Exception aggregation
Hello,
My query key figures have exception aggregation at order-line level, as per the requirement.
Currently the cube holds 5M records; when we run the query it runs for more than 30 minutes.
We can't remove the exception aggregation.
The cube is already modeled correctly and we don't want to use the cache.
Can anybody please advise if there is a better approach to improve query performance with exception aggregation?
Thanks
Hi,
We have the same problem and raised an OSS ticket. They pointed us to Note 1257455, which covers all the ways of improving performance in such cases. I guess there's nothing else to do but to precalculate this exception-aggregated formula in the data model via transformations or ABAP.
By the way, the cache cannot help you in this case, since exception aggregation is calculated after cache retrieval.
Hope this helps,
Sunil -
Query Problem With Exception Aggregation
Hello,
I have two InfoObjects, 0Customer and 0Cont_Acct, and also an InfoSet made out of both. Here is the problem I am trying to solve:
I want to find the customers who have more than five contract accounts associated with their customer number.
Here is the approach I took:
1) Created a formula variable counter on the key figure '# of data records' (which the InfoSet provides automatically) with exception aggregation Total, referencing the contract account InfoObject.
2) Kept my customer # and contract acct in the rows area.
The results I am getting are less than satisfying. I am expecting the output below.
Customer #   Cont Acct   Total No of Cont.Acct
A            1001        1
             1002        1
             1003        1
                         3
B            1007        1
I would appreciate anybody's help in this. Thanks a lot.
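The count being asked for can be prototyped outside BEx. Below is an illustrative Python sketch (the customer/account pairs are invented sample data, not from the InfoSet): count distinct contract accounts per customer and keep only customers with more than five.

```python
from collections import defaultdict

pairs = [  # (customer, contract_account) rows as they would come from the InfoSet
    ("A", "1001"), ("A", "1002"), ("A", "1003"),
    ("B", "1007"),
    ("C", "2001"), ("C", "2002"), ("C", "2003"),
    ("C", "2004"), ("C", "2005"), ("C", "2006"),
]

counts = defaultdict(set)
for cust, acct in pairs:
    counts[cust].add(acct)  # sets deduplicate repeated rows

# customers with more than five distinct contract accounts
over_five = sorted(c for c, accts in counts.items() if len(accts) > 5)
print(over_five)  # ['C']
```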
-A
-
Formula Variable with Exception Aggregation
Hello,
I have a CKF in BEx; in this CKF there is a replacement-path formula variable calculating the number of documents.
The formula variable replaces the value of the Document Number InfoObject.
So the definition of the CKF is: replacement-path formula variable * 1.
The CKF is set to Summation with exception aggregation on Document Number.
This implementation is working fine, but the issue is that it takes a lot of time in the query: checking RSRT OLAP, the Data Transfer event ID takes the majority of the query execution time, and the counter goes somewhere around 100K.
Do you know is there a way we can implement this count in transformation?
Thanks in advance!
Hi,
Doing the SUM in a transformation takes a long time, as you need to read all the content in the target.
So it is better to fine-tune the query.
1) Is the query running a long time even for small selections?
2) Trace the query in RSRT and check where it is taking a long time: database level or application level.
Thanks
Jagan -
Issue with Exceptions in Reporting
Hi BW Gurus,
I am still new to BW, and I have an issue with exceptions in reporting.
Let me explain clearly: I created a report which gives me two MRP values, for example V1 and PD (there are other MRP types, but I am focusing only on these two). In the report I can see those two values and the overall result of all MRP types (there are 6 different types). My report looks like this, for example:
V1 46,782,434(this are rough figures)
PD 56,232,453
Overall Result 102,434,111
Here my requirement is: this overall result should show RED, YELLOW or GREEN according to threshold values.
I have given the threshold values in exceptions, like:
0 - 45,000,000 (Red)
45,000,001 - 65,000,000 (Yellow)
65,000,001 - 500,000,000 (Green)
But according to these values I should get Red (Overall Result), yet I can't see any change in the overall result.
Can someone please advise how to reflect these threshold values in my report, or do I have to select other options in exceptions?
Thanks in advance.
Regds
Sam
Not really possible with BEx functions, but if you provide a workbook to users you can just type this information into a non-BEx cell and save the workbook.
Else you can try to name your exception such that it shows the value ranges. Then in the report, from the toolbar Layout > Display Exceptions.
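The threshold logic itself is simple to express; here is an illustrative sketch of the intended classification using the boundaries from the post (this only demonstrates the banding, not BEx exception behavior):

```python
# Classify an overall result into the poster's threshold bands.
def traffic_light(overall):
    if overall <= 45_000_000:
        return "RED"
    if overall <= 65_000_000:
        return "YELLOW"
    if overall <= 500_000_000:
        return "GREEN"
    return None  # outside all defined ranges

# The reported overall result of 102,434,111 falls in the third band.
print(traffic_light(102_434_111))  # GREEN per the stated ranges
```

Note that by the ranges as given, 102,434,111 lands in the Green band rather than Red, which may be worth rechecking in the exception definition.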
Hope this helps... -
Non Cumulative KF VS Normal KF with exception aggregation
What is the purpose that is served by "non-cumulative key figures" and is not served by "normal key figures with exception aggregation"?
Inflow and Outflow can also be mapped into a single key figure at update rules level.
Why have unnecessary complications like two additional KFs, marker update, etc.?
Why can't we store actual inventory value on daily basis in a NORMAL key figure with exception aggregation on time?
Also, I heard that data amount will be huge if you don't use non-cumulative key figure. I am not able to understand why?
I have already gone through all the help documentation and all the previous threads, but I did not get an answer to these questions.
THANKS IN ADVANCE,
Anup
Anup,
Q1: Why can we not store the actual inventory against the person -
In this case you will end up storing day-wise stock information for the person / material - this will lead to an increase in your data volumes depending on the scenario.
Example :
Day Level stock only
Record Structure :
Material | End-of-day snapshot stock | Date
pipe | 100 | 20.02.2008
pipe | 100 | 21.02.2008
pipe | 100 | 22.02.2008
pipe | 80 | 23.02.2008
Here you can notice that the material pipe was sold on 23.02.2008 and no sale happened previously - however, since your extractor will give you the end-of-day stock for all materials, you will get 2 additional records.
In an opening and closing stock scenario:
Marker (opening stock):
pipe | 100 | 20.02.2008
Opening stock | Closing stock:
pipe | 100 | 100
pipe | 100 | 100
pipe | 100 | 80
Here the values get overwritten and compared with the marker, which calculates that the stock is 80 or 100 depending on the date. I am not able to give a proper example, but I hope I have addressed it in part.
Also, it is a non-cumulative KF because if you look at the stock for pipe over 3 days you will get 300 in the first scenario - this is wrong.
For the next scenario, with opening and closing stock,
it will ideally be
100 (your marker) - 100 + 100 - 100 + 100 - 100 + 80 = 80
This calculation is done at runtime and gives the right output.
In the other case you can check stock only on a date basis, with the assumption that the stock data is current for that day.
Hope it addresses your question.
Arun
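Arun's two models can be sketched side by side in Python (illustrative only, with the numbers from his example): the snapshot model stores one row per day regardless of movement, while the marker-plus-deltas model stores only the reference stock and the movements, reconstructing stock at any date at "runtime".

```python
# Snapshot model: one row per material per day, even when nothing moved.
snapshots = {  # date -> end-of-day stock for material "pipe"
    "2008-02-20": 100,
    "2008-02-21": 100,
    "2008-02-22": 100,
    "2008-02-23": 80,
}

# Delta model: a marker (reference stock) plus movements only.
marker = 100                       # stock as of 2008-02-20
movements = {"2008-02-23": -20}    # single movement: sale of 20 pipes

def stock_on(date, marker, movements):
    # apply all movements up to and including the requested date
    return marker + sum(v for d, v in movements.items() if d <= date)

# The delta model reproduces every snapshot from far fewer stored rows.
for d in snapshots:
    assert stock_on(d, marker, movements) == snapshots[d]
print(stock_on("2008-02-23", marker, movements))  # 80
```

This also shows why naively summing the snapshot column over 3 days gives a meaningless 300: the snapshot is a status, not a flow, which is exactly what exception aggregation (or the non-cumulative KF machinery) has to compensate for.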
-
Problem with Exceptional Aggregation
Hi Gurus
I have a problem with exception aggregation. I have a cube which has the following data:
Plant | FromCostCenter | ToCostCenter | G/L | FiscalYear/Period | Allocation % | Amount
1 | 101234 | 102355 | 888881 | 001/2006 | 10 | 1000
1 | 101234 | 102355 | 888881 | 002/2006 | 10 | 1000
1 | 101234 | 102345 | 888882 | 001/2006 | 10 | 1000
1 | 101234 | 102346 | 888893 | 001/2006 | 90 | 1000
1 | 101234 | 102346 | 888893 | 002/2006 | 90 | 1000
1 | 101234 | 102346 | 888984 | 001/2006 | 90 | 1000
I need summarized report as follows
Fromcostcenter tocostcenter allocation%
101234 102355 10
101234 102346 90
But I am getting the report as follows:
101234 102355 30
101234 102346 270
The Allocation % key figure should not be accumulated, so I specified exception aggregation for it, but it did not work.
Any suggestions? Please help me.
Thanks
Raghu
Hi,
I am also having the same problem. Did you get a solution? Can you please tell me how you managed to display 10 and 90 instead of 30 and 270?
I also tried exception aggregation 'if more than one value', but it is not working.
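What both posters want amounts to "take one Allocation % per (FromCostCenter, ToCostCenter) pair instead of summing across G/L accounts and periods". An illustrative Python sketch of that behavior, using the rows from the post (this mimics a 'first value' style exception aggregation, not the BEx engine itself):

```python
rows = [  # (from_cc, to_cc, allocation_pct) from the cube
    ("101234", "102355", 10), ("101234", "102355", 10),
    ("101234", "102345", 10),
    ("101234", "102346", 90), ("101234", "102346", 90),
    ("101234", "102346", 90),
]

# Keep the first allocation % seen per (from, to) pair; ignore further rows
# instead of adding them up (which is what produced 30 and 270).
first_value = {}
for frm, to, pct in rows:
    first_value.setdefault((frm, to), pct)

for (frm, to), pct in sorted(first_value.items()):
    print(frm, to, pct)
```

If the percentages could differ within a pair, a 'first value' rule would silently pick one of them, so the modeling question is whether Allocation % is really constant per pair.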
Thanks -
Problem with exception aggregation
Hi forums,
Can any one help in this issue?
We checked the data in the ECOM cube for order 63349312, where it gives 6 records even though we are deleting the adjacent duplicates of the order number in the start routine. (Does the deletion of adjacent duplicates happen at the data-packet level instead of across the entire load? Because the same order might be present in different packets.)
Hence the data is aggregating for the key figure 'Number of Days' in the ECOM report. But if we check the key figure properties in RSA1, we selected the exception aggregation 'First Value' with aggregation reference characteristic ZN_CORD (order number). Still it is aggregating the number-of-days value in the report.
Regards,
Charan.
Hi Rohit,
We are doing the same procedure as you mentioned. We assigned the order number as the reference characteristic for the key figure 'No. of days' at InfoObject level, but this is not working at report level.
Order number ZCUDATE ZN_GSTRP ZN_CORDKF ZN_DAYSKF
63349312 18.01.2009 01.10.2008 1,000 109
63349312 18.01.2009 01.10.2008 1,000 109
63349312 18.01.2009 01.10.2008 1,000 109
63349312 18.01.2009 01.10.2008 1,000 109
63349312 18.01.2009 01.10.2008 1,000 109
63349312 18.01.2009 01.10.2008 1,000 109
I want the output as
Order number ZCUDATE ZN_GSTRP ZN_CORDKF ZN_DAYSKF
63349312 18.01.2009 01.10.2008 1,000 109
but it is showing as
Order number ZCUDATE ZN_GSTRP ZN_CORDKF ZN_DAYSKF
63349312 18.01.2009 01.10.2008 1,000 676 -
Query Issue with select level from dual
Hi,
I have a question regarding this query. The problem seems to be that when selecting LEVEL from dual while including another table, the number of rows returned increases exponentially.
I can add DISTINCT and get the correct number of rows, but I am worried that this will cause a performance issue. Am I using the LEVEL option wrong?
I have included details below.
There are 4 rows in tbl_incidents
When I run the following queries I get rows returned based on the total number of rows
select start_date + level - 1, tbl_incidents.incident_id, level
from dual, tbl_incidents
where incident_id = 6
connect by level <= 1;
returns 1 row
select start_date + level - 1, tbl_incidents.incident_id, level
from dual, tbl_incidents
where incident_id = 6
connect by level <= 2;
returns 5 rows
select start_date + level - 1, tbl_incidents.incident_id, level
from dual, tbl_incidents
connect by level <= 3 and incident_id = 6;
returns 21 rows
select start_date + level - 1, tbl_incidents.incident_id, level
from dual, tbl_incidents
connect by level <= 4 and incident_id = 6;
returns 85 rows
select start_date + level - 1, tbl_incidents.incident_id, level
from dual, tbl_incidents
connect by level <= 5 and incident_id = 6;
returns 341 rows
So with
r being the number of rows in tbl_incidents and
l being the number used in the connect by for level and
q being the number of rows returned by the query
it appears that
q(l) = r * q(l-1) + 1
level 2: 4 * 1 + 1 = 5
level 3: 4 * 5 + 1 = 21
level 4: 4 * 21 + 1 = 85
level 5: 4 * 85 + 1 = 341
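Nora's recurrence can be checked numerically; it also has a closed form, since q(l) = r*q(l-1) + 1 with q(1) = 1 is a geometric series: q(l) = (r^l - 1) / (r - 1). An illustrative Python check:

```python
# q(l): rows returned when a table of r rows is joined to dual
# with CONNECT BY LEVEL <= l (per the observed counts in the post).
def q(l, r=4):
    return 1 if l == 1 else r * q(l - 1, r) + 1

for l, expected in [(1, 1), (2, 5), (3, 21), (4, 85), (5, 341)]:
    assert q(l) == expected
    assert q(l) == (4**l - 1) // (4 - 1)  # geometric-series closed form

print([q(l) for l in range(1, 6)])  # [1, 5, 21, 85, 341]
```

The exponential term r^l is exactly why the row count explodes: each extra level of the hierarchical query multiplies the candidate rows by the joined table's row count.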
Thanks much,
Nora
Hi,
The dual table is used when you want to do something in SQL without otherwise using a table.
Generating a "counter table" of the integers 1, 2, 3, ..., x is an example:
SELECT LEVEL AS n
FROM dual
CONNECT BY LEVEL <= x;
There is never any point in joining dual to another table, as in
select start_date + level - 1
, tbl_incidents.incident_id
, level
from dual
, tbl_incidents
where incident_id = 6
connect by level <= x;
You will always get the same more easily by just eliminating dual:
select start_date + level - 1
, incident_id
, level
from tbl_incidents
where incident_id = 6
connect by level <= x;
It is quite useful and common to join a counter table to a real table, like this cross-join:
WITH counter_table AS
(
SELECT LEVEL AS n
FROM dual
CONNECT BY LEVEL <= x
)
select start_date + n - 1
, incident_id
, n
from tbl_incidents
, counter_table
where incident_id = 6 -
Query: Issue with Output
Does anyone by anychance have any suggestions or thoughts?
Oracle: 10.2g
Data:
with t as (
select '39' time, to_date('11/14/2003 7:39:00 PM','MM/DD/RRRR hh:mi:ss pm') game_date,11122 team_id, 11122 home_team_id,null away_team_id,'parks' home_team_name, null away_team_name,'Steve' offensive_player_moniker, 'Smith' offensive_player_last_name, '7' play_result_id, '11' event_code_num, '11111' game_code from dual union all
select '32', to_date('11/14/2003 7:32:00 PM','MM/DD/RRRR hh:mi:ss pm'),24555, null, 24555, null, 'bombers', 'Chris', 'Carter', '7','11','11111' from dual union all
select '35',to_date('12/14/2008 1:35:00 PM','MM/DD/RRRR hh:mi:ss pm'),33211, null, 33211, null, 'carts', 'Marty', 'Booker','7','17','22222' from dual union all
select '30',to_date('12/14/2008 1:30:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555, 15555, null, 'sharks', null, 'Bob', 'Sanders','7','17','22222' from dual union all
select '38',to_date('12/14/2008 1:38:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555,15555, null, 'sharks', null, 'Marvin','Harrison','7','19','22222' from dual union all
select '44',to_date('12/14/2008 1:44:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555,15555, null, 'sharks', null, 'Marvin','Harrison','7','17','22222' from dual union all
select '26',to_date('12/14/2008 1:26:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555,15555, null, 'sharks', null, 'Marvin','Harrison','7','19','22222' from dual union all
select '38',to_date('12/16/2008 1:38:00 PM','MM/DD/RRRR hh:mi:ss pm'),32111, 32111, null, 'carts', null, 'Marty', 'Booker','7','17','33333' from dual union all
select '39',to_date('12/16/2008 1:39:00 PM','MM/DD/RRRR hh:mi:ss pm'),15555, null, 15555, null, 'sharks', 'Marvin','Harrison','7','11','33333' from dual
)
Query
select substr(x, 1, instr(x, ',', -1) - 1) as game_stats,
play_result_id,
game_code
--Bulk Collect into player_results
from
(
select
(case when event_code_num = 17 then
team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| ' pen), ')).extract('//text()'))
else team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| '), ')).extract('//text()'))
end) x,
play_result_id,
game_code,
event_code_num
from
(
select
offensive_player_moniker,
offensive_player_last_name,
(case
when home_team_id = team_id then home_team_name
when away_team_id = team_id then away_team_name
end) as team_name,
time minute,
team_id team_id,
play_result_id,
game_code as game_code,
event_code_num as event_code_num,
time,
game_date
--Bulk Collect into player_results
from
t
where event_code_num in (11,17,19,28)
and play_result_id = 7
group by
offensive_player_moniker,
offensive_player_last_name,
home_team_id,
away_team_id,
team_id,
home_team_name,
away_team_name,
play_result_id,
game_code,
event_code_num,
time,
game_date
order by time asc
)
group by team_name, play_result_id, game_code, event_code_num
);
current output:
GAME_STATS PLAY_RESULT_ID GAME_CODE
1 carts: Marty Booker (35 pen) 7 22222
2 carts: Marty Booker (38 pen) 7 33333
3 parks: Steve Smith (39) 7 11111
4 sharks: Bob Sanders (30 pen), Marvin Harrison (44 pen) 7 22222
5 sharks: Marvin Harrison (26), Marvin Harrison (38) 7 22222
6 sharks: Marvin Harrison (39) 7 33333
7 bombers: Chris Carter (32) 7 11111
DESIRED OUTPUT
GAME_STATS PLAY_RESULT_ID GAME_CODE
1 carts: Marty Booker (38 pen) 7 33333
2 parks: Steve Smith (39) 7 11111
3 bombers: Chris Carter (32) 7 11111
4 sharks: Bob Sanders (30 pen), Marvin Harrison (26, 30 pen, 38,44 pen) 7 22222
5 carts: Marty Booker (35 pen) 7 22222
6 carts: Marty Booker (38 pen) 7 33333
7 sharks: Marvin Harrison (39) 7 33333
- One of the issues is that I'm forced to group by 'event_code_num', which is in turn causing the issue, but I need it in the output.
- The second issue is concatenating the results of players that scored more than once in the game, like Marvin Harrison above.
- The third issue is that I want the time sorted in ascending order in the game_stats output, like Marvin Harrison (26, 30 pen, 38, 44 pen), and I want the result set sorted by home team first, followed by away team, for each game.
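The desired string aggregation can be prototyped outside SQL before wrestling with XMLAGG. Below is an illustrative Python sketch (a subset of the sample rows, restructured as tuples): events are grouped by (team, game), then one entry per player is built with that player's minutes in ascending order and a 'pen' suffix on penalty minutes.

```python
from collections import defaultdict

events = [  # (team, game, player, minute, is_penalty)
    ("sharks", "22222", "Marvin Harrison", 26, False),
    ("sharks", "22222", "Bob Sanders", 30, True),
    ("sharks", "22222", "Marvin Harrison", 38, False),
    ("sharks", "22222", "Marvin Harrison", 44, True),
]

# collect each player's (minute, penalty) hits per team/game
per_player = defaultdict(list)
for team, game, player, minute, pen in events:
    per_player[(team, game, player)].append((minute, pen))

# build one "Player (m1, m2 pen, ...)" fragment per player, minutes ascending
lines = defaultdict(list)
for (team, game, player), hits in sorted(per_player.items()):
    minutes = ", ".join(f"{m} pen" if p else str(m) for m, p in sorted(hits))
    lines[(team, game)].append(f"{player} ({minutes})")

for (team, game), parts in lines.items():
    print(f"{team}: {', '.join(parts)}  {game}")
```

In Oracle the same grouping shape (player-level aggregate first, then team/game-level aggregate) maps to two nested aggregations, which is why a single GROUP BY including event_code_num cannot produce it.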
Edited by: user652714 on Mar 17, 2009 9:51 AM
Edited by: user652714 on Mar 17, 2009 10:20 AM
I've gone ahead and added notes as you had suggested, Frank; thanks for the suggestions regarding general coding!
Table Columns/Definitions:
time : the minute during which a player scored
game_date : the date of the game and time at which a player scored
team_id : the teams id
home_team_id : the home teams id
away_team_id : the away teams id
home_team_name : the home team name
away_team_name : the away team name
offensive_player_moniker : offensive player's first name
offensive_player_last_name : offensive player's last name
play_result_id : the number of play that resulted
7 = Scored
event_code_num : the code number for a type of an event
11 = Scored / 17 = Penalty / 19 = Scored in extra time
game_code : the code of the game
DESIRED OUTPUT
You were right, my desired output was incorrect; here it is again:
bombers: Chris Carter (32) 7 11111
parks: Steve Smith (39) 7 11111
sharks: Marvin Harrison (26, 30 pen, 38, 44 pen), Bob Sanders (30 pen) 7 22222
carts: Marty Booker (35 pen) 7 22222
carts: Marty Booker (38 pen) 7 33333
sharks: Marvin Harrison (39) 7 33333
Breakdown of output:
sharks: Marvin Harrison (26, 30 pen, 38, 44 pen), Bob Sanders (30 pen) 7 22222
sharks = team name
Marvin Harrison = player name
(26, 30 pen, 38, 44 pen) = ascending order of the minutes at which the player scored/committed a penalty
***Note*** the 'pen' after the minute value signifies that the player committed a penalty at that minute (event_code_num = 17)
'7' = play_result_id
'22222' = game_code
Explanation of code
--Outputs data and gets rid of extra comma in the output
select substr(x, 1, instr(x, ',', -1) - 1) as game_stats,
play_result_id,
game_code
--Bulk Collect into player_results
from
--subquery that formats player name and time scored/penalty committed into one string
(
select
(case when event_code_num = 17 then
team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| ' pen), ')).extract('//text()'))
else team_name
|| ': '
||
(xmlagg(xmlelement(e, offensive_player_moniker
|| ' '
|| offensive_player_last_name
|| ' ('
|| to_char(time)
|| '), ')).extract('//text()'))
end) x,
play_result_id,
game_code,
event_code_num
from
--Driving query, populates data
(
select
offensive_player_moniker,
offensive_player_last_name,
(case
when home_team_id = team_id then home_team_name
when away_team_id = team_id then away_team_name
end) as team_name,
time minute,
team_id team_id,
play_result_id,
game_code as game_code,
event_code_num as event_code_num,
time,
game_date
--Bulk Collect into player_results
from
t
where event_code_num in (11,17,19,28)
and play_result_id = 7
group by
offensive_player_moniker,
offensive_player_last_name,
home_team_id,
away_team_id,
team_id,
home_team_name,
away_team_name,
play_result_id,
game_code,
event_code_num,
time,
game_date
order by time asc
)
group by team_name, play_result_id, game_code, event_code_num
); -
Query Issue with adding a static value
I have written a query in SAP that works the way I need it to. I am going to schedule it so that SAP does a file transfer of the output to another area where the data will be accumulated.
My problem is that I want the name of the system to appear at the end of each line, but it is not in any table I can find to match up with the user ID.
EX:
UserID Group System
JDOE HR Query Group P201
The field for system is not in any tables but I want to add it and give it a constant/static value all the time in the output.
Any ideas?
Thanks!
Justin Dauby -- SAP Security Analyst
Kimball International Inc.
The name of the system is not in a table, but rather in a structure called SYST. You can evaluate it using the system field sy-sysid.
Another option is to call function module RFC_SYSTEM_INFO. The export parameter RFCSI_EXPORT will deliver the system id from component RFCSYSID of the structure RFCSI.
Cheers,
Julius -
Issue with exception Handling in GG
Hi,
I have a bi-directional DML replication setup. I have written code in the replicat parameter file for handling exceptions. Exception handling is working fine and my replicat process is not getting ABENDED, but the issue is that I am not getting any rows in the exceptions table. Going through the replicat report, I saw that GoldenGate is trying to insert duplicate records into the exceptions table and failing because of that.
**Command for creating the exceptions table:**
create table ggs_admin.exceptions (
rep_name varchar2(8),
table_name varchar2(61),
errno number,
dberrmsg varchar2(4000),
optype varchar2(20),
errtype varchar2(20),
logrba number,
logposition number,
committimestamp timestamp,
CONSTRAINT pk_exceptions PRIMARY KEY (logrba, logposition, committimestamp)
USING INDEX TABLESPACE INDX1
)
TABLESPACE dbdat1;
My replicat parameter file is:
GGSCI (db) 1> view params rep2
-- Replicator parameter file to apply changes
REPLICAT rep2
ASSUMETARGETDEFS
USERID ggs_admin, PASSWORD ggs_admin
DISCARDFILE /u01/app/oracle/product/gg/dirdat/rep2_discard.dsc, PURGE
-- Start of the macro
MACRO #exception_handler
BEGIN
, TARGET ggs_admin.exceptions
, COLMAP ( rep_name = "REP2"
, table_name = @GETENV ("GGHEADER", "TABLENAME")
, errno = @GETENV ("LASTERR", "DBERRNUM")
, dberrmsg = @GETENV ("LASTERR", "DBERRMSG")
, optype = @GETENV ("LASTERR", "OPTYPE")
, errtype = @GETENV ("LASTERR", "ERRTYPE")
, logrba = @GETENV ("GGHEADER", "LOGRBA")
, logposition = @GETENV ("GGHEADER", "LOGPOSITION")
, committimestamp = @GETENV ("GGHEADER", "COMMITTIMESTAMP"))
, INSERTALLRECORDS
, EXCEPTIONSONLY;
END;
-- End of the macro
REPERROR (DEFAULT, EXCEPTION)
--REPERROR (-1, EXCEPTION)
--REPERROR (-1403, EXCEPTION)
MAP scr.order_items, TARGET scr.order_items;
MAP scr.order_items #exception_handler();
GGSCI (db) 2>view params rep2
MAP resolved (entry scr.order_items):
MAP "scr"."order_items" TARGET ggs_admin.exceptions , COLMAP ( rep_name = "REP2" , table_name = @GETENV ("GGHEADER", "TABLENAME") , errno = @GETENV ("LASTERR", "DB
ERRNUM") , dberrmsg = @GETENV ("LASTERR", "DBERRMSG") , optype = @GETENV ("LASTERR", "OPTYPE") , errtype = @GETENV ("LASTERR", "ERRTYPE") , logrba = @GETENV ("GGHEADER"
, "LOGRBA") , logposition = @GETENV ("GGHEADER", "LOGPOSITION") , committimestamp = @GETENV ("GGHEADER", "COMMITTIMESTAMP")) , INSERTALLRECORDS , EXCEPTIONSONLY;;
Using the following key columns for target table GGS_ADMIN.EXCEPTIONS: LOGRBA, LOGPOSITION, COMMITTIMESTAMP.
2012-08-30 09:09:00 WARNING OGG-01154 SQL error 1403 mapping scr.order_items to scr.order_items OCI Error ORA-01403: no data found, SQL <DELETE FROM "scr"."order_items" WHERE "SUBSCRIBER_ID" = :b0>.
2012-08-30 09:09:00 WARNING OGG-00869 OCI Error ORA-00001: unique constraint (GGS_ADMIN.PK_EXCEPTIONS) violated (status = 1). INSERT INTO "GGS_ADMIN"."EXCEPTIONS" ("R
EP_NAME","TABLE_NAME","ERRNO","DBERRMSG","OPTYPE","ERRTYPE","LOGRBA","LOGPOSITION","COMMITTIMESTAMP") VALUES (:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8).
2012-08-30 09:09:00 WARNING OGG-01004 Aborted grouped transaction on 'GGS_ADMIN.EXCEPTIONS', Database error 1 (OCI Error ORA-00001: unique constraint (GGS_ADMIN.PK_EX
CEPTIONS) violated (status = 1). INSERT INTO "GGS_ADMIN"."EXCEPTIONS" ("REP_NAME","TABLE_NAME","ERRNO","DBERRMSG","OPTYPE","ERRTYPE","LOGRBA","LOGPOSITION","COMMITTIMES
TAMP") VALUES (:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8)).
2012-08-30 09:09:00 WARNING OGG-01003 Repositioning to rba 92383 in seqno 8.
2012-08-30 09:09:00 WARNING OGG-01154 SQL error 1403 mapping scr.order_items to scr.order_items OCI Error ORA-01403: no data found, SQL <DELETE FROM "scr"."order_items" WHERE "SUBSCRIBER_ID" = :b0>.
2012-08-30 09:09:00 WARNING OGG-01154 SQL error 1 mapping scr.order_items to GGS_ADMIN.EXCEPTIONS OCI Error ORA-00001: unique constraint (GGS_ADMIN.PK_EXCEPTIONS)
violated (status = 1). INSERT INTO "GGS_ADMIN"."EXCEPTIONS" ("REP_NAME","TABLE_NAME","ERRNO","DBERRMSG","OPTYPE","ERRTYPE","LOGRBA","LOGPOSITION","COMMITTIMESTAMP") VAL
UES (:a0,:a1,:a2,:a3,:a4,:a5,:a6,:a7,:a8).
2012-08-30 09:09:00 WARNING OGG-01003 Repositioning to rba 92383 in seqno 8.
When I run the command
select * from exceptions;
no rows are selected.
Please help. Why are duplicate rows being inserted into the exceptions table?
Remove (disable) the constraint on the exceptions table and see if the inserts take place. Do you really need that primary key?
-
SQL Query issue with large varchar column
I have a SQL query (a PL/SQL function body returning a SQL query) which contains a large varchar column. Even if I substring the column to 30 characters, it wraps when it displays on the page. I have tried putting nowrap="wrap" in the HTML table cell attributes and it still wraps. I have tried setting the width attributes on the column, even though it's not an updateable column. Does anyone have any ideas on how to prevent this wrapping? In some cases one line takes up three because of this, and it's not nice to look at. It seems that the column is somewhere set to a fixed width of less than 30 characters, and anything beyond this fixed width wraps.
Hi Netha,
Can you please provide the DDL of the three tables you are using?
Also, how many rows do you get as output for this query?
select * from dim.store st
where st.store_code = 'MAUR';
Also try running an update statement on this table as below, and then execute your query:
update dim.store
set store_code = ltrim(rtrim(store_code))
where store_code = 'MAUR';
Once you run this update, run your query again. Let us know the result.
Issue with nested aggregation in time series
Hi Experts,
I have a small requirement using time series,
e.g.: (sum(current year 4 quarters bookingamt) - sum(previous year 4 quarters bookingamt)) / sum(previous year 4 quarters bookingamt)
Bookingamt is my measure name.
How do we implement the above logic? Can you please help regarding this?
Thanks,
C Mahesh
Hi Mahesh,
These functions help to obtain data analyzed based on time. There are two types of time series functions in OBIEE:
1) Ago
2) To Date
Following are the prerequisites and procedure to create time series function in OBIEE RPD.
a) Create a dimensional hierarchy (ex: Year -> Qtr -> Month) and convert it to a time dimension: double-click (DC) on the hierarchy and check the 'Time dimension' box.
b) Define the chronological key. A chronological key should always be the lowest level in the table, and the data in it should be in a format like 20090102 or 200109 (year, month format). DC on the level (preferred/last-detail), go to the KEYS tab and check 'Chronological key'.
c) Duplicate the existing column, rename and specify the formula for the column.
1) Ago: This function, when specified in a column, displays data from a month ago, 2 months ago, etc., depending on the specified formula. DC on the new column, check 'Use logical columns' and click on Expression Builder. In the left column select Functions, then Time Series, then AGO. For the metric go to Logical Tables and select the measure; for the level go to the time dimension and select Month (for 1 month ago) and give the period as 1. Then transfer the two new columns to the presentation layer.
Ex: sales: 100, 200, 300 -> -, 100, 200 (1 month AGO).
2) To Date: This function allows us to display aggregated measures based on the specified formula. DC on the new column, check 'Use logical columns' and click on Expression Builder. In the left column select Functions, then Time Series, then TODATE. Select the metric from the logical tables (revenue), and as the level go to the time dimension and select Year (for YTD), then OK.
YTD (year to date) displays values aggregated up to date with respect to the month. Similarly for QTD and MTD.
Ex: sales: 100, 200, 300 -> YTD sales: 100, 300, 600.
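Mahesh's 4-quarter growth formula can be checked numerically before wiring it into the RPD. An illustrative Python sketch with invented bookingamt numbers:

```python
# (year, quarter) -> bookingamt; hypothetical sample values
booking = {
    (2011, 1): 100, (2011, 2): 120, (2011, 3): 110, (2011, 4): 130,
    (2012, 1): 140, (2012, 2): 150, (2012, 3): 145, (2012, 4): 165,
}

def four_q_sum(year):
    # sum of bookingamt over the four quarters of one year
    return sum(booking[(year, q)] for q in (1, 2, 3, 4))

# (current 4 quarters - previous 4 quarters) / previous 4 quarters
growth = (four_q_sum(2012) - four_q_sum(2011)) / four_q_sum(2011)
print(round(growth, 4))  # (600 - 460) / 460 ≈ 0.3043
```

In the RPD this corresponds to a measure built from the base sum and an AGO(measure, year_level, 1) column, combined as (base - ago) / ago.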
http://www.rittmanmead.com/2007/04/obi-ee-time-dimensions-and-time-series-calculations/ -- How to implement step by step with screen shots.
http://www.oraclebidwh.com/2009/12/time-series-wizard-in-obiee/ -- you can understand easily how to implement this with screen shot.
Hope it helps
Thanks,
satya -
Query issue with 9i database only
Hi,
I am getting this strange error in a 9i database only, for the following query. It works perfectly fine in any 8i database. The query is executed from schema user X, with all the referenced tables in schema user B.
ERROR at line 1:
ORA-03113: end-of-file on communication channel
SELECT COUNT (ref_id), gparent_id, ref_id, ref_value
FROM (SELECT (SELECT ref_id
FROM sams_sec_application_setup
WHERE setup_id = a.setup_id) ref_id,
(select ref_id from sams_sec_application_setup where setup_id =
(SELECT parent_id
FROM sams_sec_application_setup
WHERE setup_id =
(SELECT parent_id
FROM sams_sec_application_setup b
WHERE setup_id = a.setup_id))) gparent_id,
(SELECT ref_value
FROM sams_sec_application_setup
WHERE setup_id = a.setup_id) ref_value
FROM sams_sec_security a
WHERE setup_id IN (
SELECT setup_id
FROM sams_sec_application_setup a
WHERE application_id = (SELECT application_id
FROM sams_sec_application
WHERE application_name = 'SOFA')
AND GROUP_ID = (SELECT GROUP_ID
FROM sams_sec_group
WHERE group_name = 'Alliances'))
AND user_id = 'srikulka '
AND entity_id IN (
SELECT a.role_id
FROM sams_sec_role a,
sams_sec_permission b,
sams_sec_role_permission c
WHERE a.role_id = c.role_id
AND b.permission_id = c.permission_id
AND b.permission_name = 'Admin'
AND b.application_id =
(SELECT application_id
FROM sams_sec_application
WHERE application_name = 'SOFA')))
GROUP BY ref_id, gparent_id, ref_value;
The same query executed without any error in 8i databases.
The query is executed from schema user X with all the referenced tables in schema user B.
Thanks and regards,
Ambili
ORA-03113 is a generic error and there are many known reasons for encountering it. I suggest you search for related docs on Metalink.