Loading of data is taking much time.
Data is coming from an Oracle DB to a cube, and it is taking a huge amount of time. After 1 day it is still in yellow status, but it then gives a short dump <b>MESSAGE_TYPE_X</b>.
<u><b>In the error analysis it says:</b></u>
<b>Diagnosis
The information available is not sufficient for analysis. You must
determine whether further IDocs exist in SAP BW that have not yet been
processed, and could deliver additional information.
Further analysis:
Check the SAP BW ALE inbox. You can check the IDocs using the Wizard or
the "All IDocs" tabstrip</b>
<u><b>Current status :</b></u>
No selection information arrived from the source system.
I have checked the system log and found the same error. Moreover, the RFC connection is OK.
Please suggest.
Rajib,
What I mean is: load to the PSA (only PSA in the InfoPackage) and do not check the box "update subsequently...".
When all data is in the PSA, then load it to the cube
(manually, by clicking on the button in the monitor's "Status" tab).
This way the system has more resources available to do just one step.
Udo
Similar Messages
-
Taking much time for loading data
Dear All
While loading data in the <b>PRD system</b> (for master data and transaction data), it is taking much time. For example, for 2LIS_05_QOITM I am loading delta data only into the <b>PSA</b> from R/3. Sometimes (yesterday) it takes 2 minutes, and sometimes (today) it has taken 5 hours and is still not completed (it is yellow). Yesterday we actually went to SM58 on the R/3 side and executed the LUW for some other DataSources. We could do that now as well, but we don't want to, because we are expecting a permanent solution. Could you please advise me? I am getting the below message in the status tab:
Errors while sending packages from OLTP to BI
Diagnosis
No IDocs could be sent to BI using RFC.
System Response
There are IDocs in the source system ALE outbox that did not arrive in the ALE inbox of BI.
Further analysis:
Check the TRFC log.
You can access this log using the wizard or the menu path "Environment -> Transact. RFC -> In source system".
Error handling:
If the TRFC is incorrect, check whether the source system is fully connected to BI. In particular, check the authorizations of the background user in the source system.
I am loading data through Process chain and user is <b>BWREMOTE (authorized user).</b>
Please help me.
Thanks a lot in advance
Raja
Dear Karthik,
No, I could not resolve it till now,
but everything else is fine.
Now the status is yellow only (209 from 209). What should I do now?
I am getting the below message <b>in the status tab</b>:
Missing data packages for PSA Table
Diagnosis
Data packets are missing from PSA Table . BI processing does not return any errors. The data transport from the source system to BI was probably incorrect.
Procedure
Check the tRFC overview in the source system.
You access this log using the wizard or following the menu path "Environment -> Transact. RFC -> Source System".
Error handling:
If the tRFC is incorrect, resolve the errors listed there.
Check that the source system is connected properly to BI. In particular, check the remote user authorizations in BI.
<b>In the details tab</b>, I am getting the below message:
Info IDoc 2 : sent, not arrived ; IDoc ready for dispatch (ALE service)
Thanks in advance
Raja -
Query taking much time in Oracle 9i
Hi,
**How can we tune an SQL query in Oracle 9i?**
The select query is taking more than 1 hour 30 minutes to return the result.
Because of this,
we have created a materialized view on the select query, and we also submitted a job in dba_jobs to refresh the materialized view daily.
When we try to retrieve the data from the materialized view, we get the result very quickly.
But the job assigned in dba_jobs takes just as long to complete as the query used to take.
Since the job takes much time in the test database, we feel it may cause load if we move the same scripts to the production environment.
Please suggest how to resolve the issue and also how to tune the SQL.
With Regards,
Srinivas
Edited by: Srinivas.. on Dec 17, 2009 6:29 AM
Hi Srinivas;
Please follow this search and see if it is helpful.
Regards,
Helios -
Taking much time when trying to drill down on a characteristic in the BW report.
Hi All,
When we execute the BW report it takes nearly 1 to 2 minutes, but when we then try to drill down on a characteristic it takes much time, nearly 30 minutes to 1 hour, and throws the error message:
"An error has occurred during loading. Please look in the upper frame for further information."
I have executed this query in RSRT and checked the query properties;
the query reads its data directly from aggregates, but some characteristics are not available in the aggregates.
So after execution, drilling down takes much time for the characteristics that are not available in the aggregates. For the characteristics that are available in the aggregates, it takes only 2 to 3 minutes.
How can we drill down on the characteristics that are not available in the aggregates without it taking much time and without the error?
Could you kindly give any solution for this?
Thanks & Regards,
Raju. E
Hi,
The only solution is to include all the characteristics used in the report in the aggregates; otherwise you will keep facing this issue.
Just create an aggregate proposal before creating any new aggregates, as it will give you an idea of which ones are used most.
Also you should make sure that all the navigation characteristics are part of the aggregates.
Thanks
Ajeet -
Database table AUFM access is taking much time even though a secondary index was created
Hi Friends,
There is a report for goods movements related to service orders + account indicator.
We have two testing systems (EBQ for developers and PEQ from the client side).
The EBQ system gets a replica of PEQ every month.
This report is not taking much time in EBQ, but it is taking much time in PEQ. For the selection criteria I have given, both systems have the same data (and give the same output).
The report has the following fields on the selection screen:
A_MJAHR Material Doc. Year (Mandatory)
S_BLDAT Document Date(Optional)
S_BUDAT Posting Date(Optional)
S_LGORT Storage Location(Optional)
S_MATNR Material(Optional)
S_MBLNR Material Document (Optional)
S_WERKS Plant(Optional)
The client is not agreeing to make Material Document mandatory.
The main (first) table hit is on table AUFM. As there are non-key fields in the WHERE condition as well, we have created a secondary index on table AUFM on the following fields:
BLDAT
BUDAT
MATNR
WERKS
LGORT
Even then, in the PEQ system the report takes a very long time, and sometimes we do not even get the ALV output.
What can be done to get the report executed fast?
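When the same index helps in EBQ but not in PEQ, one thing worth checking is whether PEQ's optimizer statistics for AUFM are current. A minimal sketch (the schema owner SAPR3 is an assumption, and on a live SAP system statistics would normally be refreshed via DB13/BRCONNECT rather than called directly):

```sql
-- Refresh optimizer statistics for AUFM and its indexes so the
-- cost-based optimizer can consider the new secondary index.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname => 'SAPR3',   -- hypothetical schema owner
    tabname => 'AUFM',
    cascade => TRUE);     -- include index statistics
END;
/
```

An SQL trace (ST05) of the PEQ run would then show whether the secondary index is actually being chosen.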
<removed by moderator>
The part of report Soure code is as below:
<long code part removed by moderator>
Thanks and Regards,
Rama chary.P
Moderator message: please stay within the 2500 character limit to preserve formatting, only post relevant portions of the code, also please read the following sticky thread before posting.
Please Read before Posting in the Performance and Tuning Forum
locked by: Thomas Zloch on Sep 15, 2010 11:40 AM -
ODS Activation is taking much time...
Hi All,
Sometimes ODS activation takes much time. Generally it takes 30 minutes, and sometimes it takes 6 hours.
When the activation takes long, I can see in SM50 that the following piece of SQL is taking most of the time:
SELECT
COUNT(*) , "RECORDMODE"
FROM
"/BIC/B0000814000"
WHERE
"REQUEST" = :A0 AND "DATAPAKID" = :A1
GROUP BY
"RECORDMODE"
Could you please let me know the possibilities for solving this issue?
thanks
Hello,
you have 2 options:
1) as already mentioned, clean up some old PSA data or change log data from this PSA table, or
2) create an additional index on RECORDMODE on this table via transaction SE11 -> Indexes.
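The index suggested in option 2 might look like the following sketch (the index name Z01 is hypothetical; in an SAP system it should be created via SE11 so the ABAP dictionary knows about it):

```sql
-- Covers the WHERE columns (REQUEST, DATAPAKID) and the grouped
-- column (RECORDMODE), so the COUNT(*)/GROUP BY statement above
-- can be answered from the index alone.
CREATE INDEX "/BIC/B0000814000~Z01"
  ON "/BIC/B0000814000" ("REQUEST", "DATAPAKID", "RECORDMODE");
```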
Regards, Patrick Rieken. -
Query taking much time.
Hi All,
I have one query which is taking much time in the dev environment, where the data size is very small, and I am planning to run this query on a production database where the data size is huge. Please let me know how I can optimize this query.
select count(*) from
(
select /*+ full(tls) full(tlo) parallel(tls, 2) parallel(tlo, 2) */
tls.siebel_ba, tls.msisdn
from
TDB_LIBREP_SIEBEL tls, TDB_LIBREP_ONDB tlo
where
tls.siebel_ba = tlo.siebel_ba (+) and
tls.msisdn = tlo.msisdn (+) and
tlo.siebel_ba is null and
tlo.msisdn is null
union
select /*+ full(tls) full(tlo) parallel(tls, 2) parallel(tlo, 2) */
tlo.siebel_ba, tlo.msisdn
from
TDB_LIBREP_SIEBEL tls, TDB_LIBREP_ONDB tlo
where
tls.siebel_ba (+) = tlo.siebel_ba and
tls.msisdn (+) = tlo.msisdn and
tls.siebel_ba is null and
tls.msisdn is null
);
The explain plan of the above query is:
| Id | Operation | Name | Rows | Bytes | Cost | TQ |IN-OUT| PQ Distrib |
| 0 | SELECT STATEMENT | | 1 | | 14 | | | |
| 1 | SORT AGGREGATE | | 1 | | | | | |
| 2 | SORT AGGREGATE | | 1 | | | 41,04 | P->S | QC (RAND) |
| 3 | VIEW | | 164 | | 14 | 41,04 | PCWP | |
| 4 | SORT UNIQUE | | 164 | 14104 | 14 | 41,04 | PCWP | |
| 5 | UNION-ALL | | | | | 41,03 | P->P | HASH |
|* 6 | FILTER | | | | | 41,03 | PCWC | |
|* 7 | HASH JOIN OUTER | | | | | 41,03 | PCWP | |
| 8 | TABLE ACCESS FULL| TDB_LIBREP_SIEBEL | 82 | 3526 | 1 | 41,03 | PCWP | |
| 9 | TABLE ACCESS FULL| TDB_LIBREP_ONDB | 82 | 3526 | 2 | 41,00 | S->P | BROADCAST |
|* 10 | FILTER | | | | | 41,03 | PCWC | |
|* 11 | HASH JOIN OUTER | | | | | 41,03 | PCWP | |
| 12 | TABLE ACCESS FULL| TDB_LIBREP_ONDB | 82 | 3526 | 2 | 41,01 | S->P | HASH |
| 13 | TABLE ACCESS FULL| TDB_LIBREP_SIEBEL | 82 | 3526 | 1 | 41,02 | P->P | HASH |
Predicate Information (identified by operation id):
6 - filter("TLO"."SIEBEL_BA" IS NULL AND "TLO"."MSISDN" IS NULL)
7 - access("TLS"."SIEBEL_BA"="TLO"."SIEBEL_BA"(+) AND "TLS"."MSISDN"="TLO"."MSISDN"(+))
10 - filter("TLS"."SIEBEL_BA" IS NULL AND "TLS"."MSISDN" IS NULL)
11 - access("TLS"."SIEBEL_BA"(+)="TLO"."SIEBEL_BA" AND "TLS"."MSISDN"(+)="TLO"."MSISDN")
user3479748 wrote: <full question and explain plan quoted above>
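If the intent of the UNION query above is simply to count keys present in exactly one of the two tables, a sketch of an alternative (assuming (siebel_ba, msisdn) is effectively unique in both tables, so the UNION's duplicate elimination is unnecessary) is a single FULL OUTER JOIN anti-join:

```sql
-- One pass over each table instead of two, and no SORT UNIQUE.
SELECT COUNT(*)
FROM   TDB_LIBREP_SIEBEL tls
       FULL OUTER JOIN TDB_LIBREP_ONDB tlo
              ON  tls.siebel_ba = tlo.siebel_ba
              AND tls.msisdn    = tlo.msisdn
WHERE  tls.siebel_ba IS NULL   -- key exists only in TDB_LIBREP_ONDB
   OR  tlo.siebel_ba IS NULL;  -- key exists only in TDB_LIBREP_SIEBEL
```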
I dunno, it looks like you are getting all the rows that are null from an outer join, so won't that decide to full scan anyway? Plus the UNION means it will do the work twice and then a DISTINCT to get rid of duplicates; see how it does a UNION-ALL and then a SORT UNIQUE. Somehow I have the feeling there might be a trickier way to do what you want, so maybe you should state exactly what you want in English. -
Saving the Ai file is taking much time
I am using Windows 7 and Ai CC, and when I am working on files, saving the Ai file takes much time. I had 4 GB of RAM; now I have a better system with 6 GB of RAM, and Ai still takes much time to save a 300 MB file.
Thank you Jdanek.
1. I am saving the file on my local HDD only.
2. The scratch disk is the F drive (100 GB free space), where no data is stored.
3. I even increased the virtual memory.
4. Now I have switched off "Save with PDF compatibility" too. No luck! -
Discoverer report is taking much time to open
Hi
All the Discoverer reports are taking much time to open; even a query in an LOV is taking 20-25 minutes. We have restarted the services, but no improvement was found.
Please suggest what can be done; my application is on 12.0.6.
Regards
This topic was discussed many times in the forum before; please see old threads for details and for the docs you need to refer to -- https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Slow&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
Thanks,
Hussein -
LOV is slow; after selecting a value it takes much time to default
Hi,
I have a dependent LOV. The master LOV executes fine and populates the field quickly. But the child LOV is very slow; after selecting a value, it takes much time to default.
Can anyone please help me? Is there any way to make the value default quickly after it is selected?
Thanks,
Mahesh
Hi Gyan,
Same issues in TST and PROD instances.
Even when my search criteria return just 1 record, after selecting that value it takes much time to default the value into the field.
Please advise. Thanks for your quick response.
Thanks,
Mahesh -
Hi, I have updated my Mac from 10.5 to 10.5.8 successfully, but the install is taking much time and the status bar is not showing any progress
If I remember correctly, one of the updates could hang after doing the update. That is fixed by a restart, but installing a combo update over the top of the others rarely does any harm, and it may be the best thing to do.
-
Adding column is taking much time. How to avoid?
ALTER TABLE CONTACT_DETAIL
ADD (ISIMDSCONTACT_F NUMBER(1) DEFAULT 0 NOT NULL
,ISREACHCONTACT_F NUMBER(1) DEFAULT 0 NOT NULL
);
Is there any way to speed up the execution of this statement?
It has been running for more than 24 hours since the script was started.
I do not know why it is taking so much time; the size of the table is 30 MB.
To add a column, the row directory of every record must be rewritten.
Obviously this will take time and produce redo.
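On Oracle 11g and later, adding a NOT NULL column with a DEFAULT is a metadata-only change and completes almost instantly; on earlier releases every row is physically updated. On an older release, a staged approach keeps each step controllable (a sketch, with column names taken from the statement above):

```sql
-- 1. Add the columns as nullable: a dictionary-only change, instant.
ALTER TABLE CONTACT_DETAIL ADD (ISIMDSCONTACT_F  NUMBER(1));
ALTER TABLE CONTACT_DETAIL ADD (ISREACHCONTACT_F NUMBER(1));

-- 2. Backfill existing rows (could also be done in batches).
UPDATE CONTACT_DETAIL
   SET ISIMDSCONTACT_F  = 0,
       ISREACHCONTACT_F = 0;
COMMIT;

-- 3. Now add the default and the NOT NULL constraints.
ALTER TABLE CONTACT_DETAIL MODIFY (ISIMDSCONTACT_F  DEFAULT 0 NOT NULL);
ALTER TABLE CONTACT_DETAIL MODIFY (ISREACHCONTACT_F DEFAULT 0 NOT NULL);
```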
Whenever something is slow, the first question you need to answer is
'What is it waiting for?' You can find out by investigating the various v$ views.
Also, after more than 200 'I can not be bothered to do any research on my own' questions, you should know not to post here without a four-digit version number and a platform,
as volunteers aren't mind readers.
If you want to continue to withhold information, please consider NOT posting here.
Sybrand Bakker
Senior Oracle DBA
Experts: those who did read the documentation and can be bothered to investigate their own problems.
Can you give clear steps on how to load 3 data targets at a time with the help of parallelism?
Can you give clear steps on how to load 3 data targets at a time with the help of parallelism?
hi,
create a load process type and specify the InfoPackage you need to load.
Create 3 similar process types.
The chain should have this flow:
start -> delete index (if a cube is the target) -> load targets (connect the load processes to the start process) in parallel -> AND process -> create index (if cube) / activation of ODS (if ODS)
Ramesh -
0CSM_USER_TEXT taking much time to load
It is related to 0CSM_USER_TEXT, which isn’t a Retail DataSource/object, so I assume it is an FI & General job. The job currently produces 200,000 records over an 8-hour period, whereas it used to finish in 8 seconds for record counts of not more than 10,000.
Please tell me why the loading is taking so much time. Because of this, the whole process chain is taking time to load.
Hi,
I assume it is taking more time while loading to the PSA.
If so, please check your job log on the source side.
Go to your InfoPackage monitor -> menu Environment -> Job overview -> Job in the source system; it will prompt for a source-system user ID and password. Enter them and you will get your job. Select the job and click on the job log icon in the toolbar; there you can see your job in detailed steps.
Thanks -
After Upgrade BI Initial load is taking much time
Dear Friends,
We had BW 3.5 on Windows 2003 (32-bit) & SQL 2000.
I upgraded it to BI 7.01 (EHP1 SR1) with Windows 2003 (64-bit) & SQL 2005 and completed all follow-up activities.
Now, when we do the initial load, it takes a long time. Please let me know your inputs as soon as possible.
Regards,
Sunil Maurya.
Hi,
I created the thread under the NetWeaver forum,
but I will still check and try to create it in the correct forum.
Regards,
Sunil Maurya