Data Load taking more than 24 hrs
Hi All,
We are in the process of loading data from R/3 to BW. It is taking more than 24 hours to load one year of data, because of the complexity of the ABAP code. We have tried every way we could think of to improve the performance of the code, but no luck.
If the same thing happens in the production system, how should I proceed with the data load, given that it takes more than 24 hours for one year of data?
I am planning to run an init without data transfer first, and then full loads.
Please correct me if I am wrong.
Thanks,
RS.
Hi,
Where is your ABAP code complexity located: in R/3 or in BW?
Are you talking about loading into an empty cube taking a long time? Extraction time?
Analyze the different steps in your monitor and tell us where the bottleneck is.
If you already know the above and have performed all the tunings (e.g. number range buffering when filling an empty cube), then you're correct: init without data transfer, then full loads.
As suggested, you could segment your full loads and even run them in parallel.
Hope this helps...
Olivier.
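Olivier's suggestion to segment the full loads can be sketched as follows. This is only an illustration of splitting one year into monthly selection ranges; the function name and structure are my own, not part of any SAP tool:

```python
from datetime import date, timedelta

def month_ranges(year):
    """Split a calendar year into (first_day, last_day) pairs,
    one per month, usable as selection ranges for separate full loads."""
    ranges = []
    for month in range(1, 13):
        first = date(year, month, 1)
        # Last day of the month is the day before the 1st of the next month.
        nxt = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
        ranges.append((first, nxt - timedelta(days=1)))
    return ranges

for first, last in month_ranges(2008):
    print(first, "->", last)
```

Each (first, last) pair would become the date selection of one full-load InfoPackage, so the twelve smaller loads can also be scheduled in parallel.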
Similar Messages
-
Sales data load taking more time
Hi ,
I am loading one year of sales data from the setup tables, but it is taking more than 3 hours.
Also, there are no ABAP routines; only extraction from R/3 and loading into the ODS.
Is there any way I can speed up loading the data into the ODS?
Please, can anyone help me?
Hi,
Make sure that you have deleted the indexes on the data target before you load. Also put different selection criteria in the InfoPackages and load them in parallel.
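The advice above (different selection criteria per InfoPackage, loaded in parallel) can be sketched like this; `load_chunk` is a hypothetical stand-in for triggering one load restricted to its own selection:

```python
from concurrent.futures import ThreadPoolExecutor

def load_chunk(selection):
    """Stand-in for one InfoPackage load restricted to `selection`.
    A real load would extract only the rows matching this range."""
    return f"loaded {selection}"

# One selection range per InfoPackage, e.g. one per quarter.
selections = ["2013-Q1", "2013-Q2", "2013-Q3", "2013-Q4"]

# Run the four loads side by side instead of one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(load_chunk, selections))

print(results)
```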
Make sure that no other major process is running in the system while the load is running. You can check this using transaction SM50. -
My iPhone 4 is taking more than 5 hours to erase all the data and the process is still running. How much longer do I have to wait for my phone to turn on?
I'm having this EXACT same problem with my iPhone 4, and I have the same computer stats (I have a Samsung Series 7)
-
I tried to reset my iPhone; it is taking more than 12 hrs and still going
I tried to reset my iPhone and it is taking more than 12 hours, still going on. How long does it usually take to reset the iPhone?
Yes, it is quite normal; simply don't worry.
-
Application is taking More than 1/2 Minute to Populate the data in the Form
Scenario
1. I have a form with 2 buttons to populate data.
2. When the 1st button is clicked, it performs an EXECUTE_QUERY on a block whose total data is around 200,000 (2 lakh) records, and it is fast.
3. When I click the 2nd button, it clears the block and performs an EXECUTE_QUERY on another block whose total data is around 300,000 (3 lakh) records. This takes more than half a minute.
Any way to improve performance?
Forms was never designed to handle 300k records in a speedy manner. It's a testament to its robustness that it works at all.
Who wants to look at more than a few hundred records in a block? I would suggest that you might need to rethink your design.
I vouch for that.
Jan is 100% right; nevertheless, I have seen Forms that handle thousands of records in a single query perfectly well: slow in performance, but it never crashes.
Unless you provide a filtering solution or specify your query array size, you can do nothing but wait until all the records are retrieved.
Regards,
Tony -
Can master data be loaded from more than one DataSource?
Dear gurus,
I have a master data object which is loaded every day from one particular DataSource. Now I have a requirement to load data from another DataSource as well. If we load data from more than one DataSource, will there be any problem with the data? Will it accept loading from more than one DataSource?
Regards
Rs
Hi Ram,
We can load data to a target from more than one DataSource; that is an advantage of BI compared to BW. It will accept loading from more than one DataSource, but you have to take care that the characteristics and attributes are the same in both DataSources.
Regards,
SVS -
Hi Friends,
SELECT
DATEPART(YEAR, SaleDate) AS [PrevYear],
DATENAME(MONTH, SaleDate) AS [PrevMonth],
SaleDate as SaleDate,
Sum(Amount) as PrevAmount
FROM TableA A
WHERE SaleDate >= DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 1, 0)
AND SaleDate <= DATEADD(dd, -1, DATEADD(yy, DATEDIFF(yy, 0, GETDATE()), 0))
-----'2013-12-31 00:00:00.000'
GROUP BY
SaleDate
This query is taking more than 2 minutes to pull the results. Basically I am passing last year's first date and last date (derived from GETDATE()).
If I pass static values like this: WHERE SaleDate >= '2013-01-01 00:00:00.000'
AND SaleDate <= '2013-12-31 00:00:00.000'
then it pulls the results in a fraction of a second.
Note: I am keeping this code in a view, and I have to use only a view (I know we could write a stored procedure for this, but I don't want an SP; I need only a view).
Any idea how to improve my view's performance?
Thanks,
RK
Do you have an index on the SaleDate column? If so, is it a nonclustered or clustered index? How much data does it return? Can you show an execution plan of the query?
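For reference, the two DATEADD/DATEDIFF expressions in the WHERE clause compute the first and last day of the previous year. The same boundaries can be checked in Python; this is only a hand calculation of what the expressions evaluate to, not a fix for the view:

```python
from datetime import date, timedelta

def prev_year_bounds(today):
    """First and last day of the year before `today`, mirroring
    DATEADD(yy, DATEDIFF(yy, 0, GETDATE()) - 1, 0) and
    DATEADD(dd, -1, DATEADD(yy, DATEDIFF(yy, 0, GETDATE()), 0))."""
    start = date(today.year - 1, 1, 1)
    end = date(today.year, 1, 1) - timedelta(days=1)
    return start, end

start, end = prev_year_bounds(date(2014, 6, 15))
print(start, end)  # 2013-01-01 2013-12-31
```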
Best Regards, Uri Dimant, SQL Server MVP
http://sqlblog.com/blogs/uri_dimant/
-
I am extracting data from ECC to BW, but the data load is taking a long time
Hi All,
I am extracting data from ECC to the BI system, but the data load is taking a long time. The InfoPackage has been running for the last 6 hours and is still showing yellow. I manually set it to red, deleted the request, and applied a repeat of the last delta, but the same problem occurs. In the status, it shows that the background job is not finished in the source system. We asked Basis, and they killed that job; we scheduled the chain again, and the same problem came back. How can I solve this issue?
Thanks ,
chandu
Hi,
There are different places to track your job. Once your job is triggered in BW, you can track where exactly the load is taking more time and why. Follow the steps below:
1) After the InfoPackage is triggered, take the request number and go to the source system to check your extraction job status.
You can get the job status by taking the request number from BW and going to transaction SM37 in ECC. Give the request number with a leading and trailing '*', and give '*' for the user name:
Job name: *REQ_XXXXXX*
User Name: *
Check the job status: whether the job is completed, cancelled, or short-dumped. If the job is still running, check in SM66 whether you can see any process; if not, check ST22 or SM21 in ECC accordingly. If the job is complete, then check the same on the BW side.
2) Check whether the data arrived in the PSA. If not, check whether the transfer routines or start routines have bad SQL or code; similarly for the update rules.
3) Once it is through in the source system (ECC), transfer rules, and update rules, the next task, updating the data targets, might sometimes take more time depending on certain parameters (e.g. the number of parallel processes updating the database). Check whether updating the database is taking more time; you may also want to check with the DBA.
At all times you should see at least one process running in SM66 until your job completes. If not, you will see a log in ST22.
Let me know if you still have questions.
Assigning points is the only way of saying thanks in SDN.
Thanks,
Kumar. -
I read that Oracle Data Integrator provides more than 100 KMs out-of-the-box
I read that
Oracle Data Integrator provides more than 100 KMs out-of-the-box.
Does anybody have any idea how I can reach, view, or use them?
I got it: it's under <Oraclehome>/oracledi/impexp
-
Sync and Create Project operations from DTR taking more than one hour
Hi All.
Recently the Basis team implemented the track for the ESS/MSS application. When we import the track into NWDS, it shows 500 DCs.
I successfully did the Sync and Create Project operations from DTR for 150 DCs, and it took about 5 minutes per DC.
However, after that, when I try to sync a DC or create a project from DTR, the operation takes more than 3 hours per DC. This should not be the case, because for the first 150 DCs the Sync and Create Project operations from DTR hardly took 5 minutes per DC. As the operation was taking so much time, I finally closed NWDS to stop it.
I am using NWDS 2.0.15, EP 7.0 portal SP15, and NWDI 7.0.
Can anybody tell me how to solve this issue so that I can Sync and Create Project from DTR for a DC within 5 minutes?
Thanks
Susmita
Hi Susmita,
If the DCs build fine in CBS, then I feel there is no need to test all of them locally in NWDS.
You can verify certain applications in these DCs: sync & create projects for those DCs and test-run the applications.
As I understand, you only need to check (no changes will be made), so you can verify them in small groups (say 20-25 DCs per group) in different workspaces so that no workspace is overloaded.
But why do you want to keep a local copy of them if you are not making any changes? You can Unsync & Remove these projects once verified and use the same workspace to work on the next set of DCs.
Hope this clarifies your concerns.
Kind Regards,
Nitin
Edited by: Nitin Jain on Apr 23, 2009 1:55 PM -
Query taking more than 1/2 hour for 80 million rows in fact table
Hi All,
I am stuck on this query, as it is taking more than 35 minutes to execute for 80 million rows. My SLA is less than 30 minutes for 160 million rows, i.e. double the number.
Below is the query and the Execution Plan.
SELECT txn_id AS txn_id,
acntng_entry_src AS txn_src,
f.hrarchy_dmn_id AS hrarchy_dmn_id,
f.prduct_dmn_id AS prduct_dmn_id,
f.pstng_crncy_id AS pstng_crncy_id,
f.acntng_entry_typ AS acntng_entry_typ,
MIN (d.date_value) AS min_val_dt,
GREATEST (MAX (d.date_value),
LEAST ('07-Feb-2009', d.fin_year_end_dt))
AS max_val_dt
FROM Position_Fact f, Date_Dimension d
WHERE f.val_dt_dmn_id = d.date_dmn_id
GROUP BY txn_id,
acntng_entry_src,
f.hrarchy_dmn_id,
f.prduct_dmn_id,
f.pstng_crncy_id,
f.acntng_entry_typ,
d.fin_year_end_dt
Execution Plan is as:
11 HASH JOIN Cost: 914,089 Bytes: 3,698,035,872 Cardinality: 77,042,414
9 TABLE ACCESS FULL TABLE Date_Dimension Cost: 29 Bytes: 94,960 Cardinality: 4,748
10 TABLE ACCESS FULL TABLE Position_Fact Cost: 913,693 Bytes: 2,157,187,592 Cardinality: 77,042,414
Kindly suggest how to make it faster.
Regards,
Sid
The above is just the part of the query that is taking the maximum time.
Kindly find the entire query and the plan below:
WITH MIN_MX_DT
AS
( SELECT
TXN_ID AS TXN_ID,
ACNTNG_ENTRY_SRC AS TXN_SRC,
F.HRARCHY_DMN_ID AS HRARCHY_DMN_ID,
F.PRDUCT_DMN_ID AS PRDUCT_DMN_ID,
F.PSTNG_CRNCY_ID AS PSTNG_CRNCY_ID,
F.ACNTNG_ENTRY_TYP AS ACNTNG_ENTRY_TYP,
MIN (D.DATE_VALUE) AS MIN_VAL_DT,
GREATEST (MAX (D.DATE_VALUE), LEAST (:B1, D.FIN_YEAR_END_DT))
AS MAX_VAL_DT
FROM
proj_PSTNG_FCT F, proj_DATE_DMN D
WHERE
F.VAL_DT_DMN_ID = D.DATE_DMN_ID
GROUP BY
TXN_ID,
ACNTNG_ENTRY_SRC,
F.HRARCHY_DMN_ID,
F.PRDUCT_DMN_ID,
F.PSTNG_CRNCY_ID,
F.ACNTNG_ENTRY_TYP,
D.FIN_YEAR_END_DT),
SLCT_RCRDS
AS (
SELECT
M.TXN_ID,
M.TXN_SRC,
M.HRARCHY_DMN_ID,
M.PRDUCT_DMN_ID,
M.PSTNG_CRNCY_ID,
M.ACNTNG_ENTRY_TYP,
D.DATE_VALUE AS VAL_DT,
D.DATE_DMN_ID,
D.FIN_WEEK_NUM AS FIN_WEEK_NUM,
D.FIN_YEAR_STRT AS FIN_YEAR_STRT,
D.FIN_YEAR_END AS FIN_YEAR_END
FROM
MIN_MX_DT M, proj_DATE_DMN D
WHERE
D.HOLIDAY_IND = 0
AND D.DATE_VALUE >= MIN_VAL_DT
AND D.DATE_VALUE <= MAX_VAL_DT),
DLY_HDRS
AS (
SELECT
S.TXN_ID AS TXN_ID,
S.TXN_SRC AS TXN_SRC,
S.DATE_DMN_ID AS VAL_DT_DMN_ID,
S.HRARCHY_DMN_ID AS HRARCHY_DMN_ID,
S.PRDUCT_DMN_ID AS PRDUCT_DMN_ID,
S.PSTNG_CRNCY_ID AS PSTNG_CRNCY_ID,
SUM (
DECODE (
PNL_TYP_NM,
:B5, DECODE (NVL (F.PSTNG_TYP, :B2),
:B2, NVL (F.PSTNG_AMNT, 0) * (-1),
NVL (F.PSTNG_AMNT, 0)),
0))
AS MTM_AMT,
NVL (
LAG (
SUM (
DECODE (
PNL_TYP_NM,
:B5, DECODE (NVL (F.PSTNG_TYP, :B2),
:B2, NVL (F.PSTNG_AMNT, 0) * (-1),
NVL (F.PSTNG_AMNT, 0)),
0)))
OVER (
PARTITION BY S.TXN_ID,
S.TXN_SRC,
S.HRARCHY_DMN_ID,
S.PRDUCT_DMN_ID,
S.PSTNG_CRNCY_ID
ORDER BY S.VAL_DT),
0)
AS YSTDY_MTM,
SUM (
DECODE (
PNL_TYP_NM,
:B4, DECODE (NVL (F.PSTNG_TYP, :B2),
:B2, NVL (F.PSTNG_AMNT, 0) * (-1),
NVL (F.PSTNG_AMNT, 0)),
0))
AS CASH_AMT,
SUM (
DECODE (
PNL_TYP_NM,
:B3, DECODE (NVL (F.PSTNG_TYP, :B2),
:B2, NVL (F.PSTNG_AMNT, 0) * (-1),
NVL (F.PSTNG_AMNT, 0)),
0))
AS PAY_REC_AMT,
S.VAL_DT,
S.FIN_WEEK_NUM,
S.FIN_YEAR_STRT,
S.FIN_YEAR_END,
NVL (TRUNC (F.REVSN_DT), S.VAL_DT) AS REVSN_DT,
S.ACNTNG_ENTRY_TYP AS ACNTNG_ENTRY_TYP
FROM
SLCT_RCRDS S,
proj_PSTNG_FCT F,
proj_ACNT_DMN AD,
proj_PNL_TYP_DMN PTD
WHERE
S.TXN_ID = F.TXN_ID(+)
AND S.TXN_SRC = F.ACNTNG_ENTRY_SRC(+)
AND S.HRARCHY_DMN_ID = F.HRARCHY_DMN_ID(+)
AND S.PRDUCT_DMN_ID = F.PRDUCT_DMN_ID(+)
AND S.PSTNG_CRNCY_ID = F.PSTNG_CRNCY_ID(+)
AND S.DATE_DMN_ID = F.VAL_DT_DMN_ID(+)
AND S.ACNTNG_ENTRY_TYP = F.ACNTNG_ENTRY_TYP(+)
AND SUBSTR (AD.ACNT_NUM, 0, 1) IN (1, 2, 3)
AND NVL (F.ACNT_DMN_ID, 1) = AD.ACNT_DMN_ID
AND NVL (F.PNL_TYP_DMN_ID, 1) = PTD.PNL_TYP_DMN_ID
GROUP BY
S.TXN_ID,
S.TXN_SRC,
S.DATE_DMN_ID,
S.HRARCHY_DMN_ID,
S.PRDUCT_DMN_ID,
S.PSTNG_CRNCY_ID,
S.VAL_DT,
S.FIN_WEEK_NUM,
S.FIN_YEAR_STRT,
S.FIN_YEAR_END,
TRUNC (F.REVSN_DT),
S.ACNTNG_ENTRY_TYP,
F.TXN_ID)
SELECT
D.TXN_ID,
D.VAL_DT_DMN_ID,
D.REVSN_DT,
D.TXN_SRC,
D.HRARCHY_DMN_ID,
D.PRDUCT_DMN_ID,
D.PSTNG_CRNCY_ID,
D.YSTDY_MTM,
D.MTM_AMT,
D.CASH_AMT,
D.PAY_REC_AMT,
MTM_AMT + CASH_AMT + PAY_REC_AMT AS DLY_PNL,
SUM (
MTM_AMT + CASH_AMT + PAY_REC_AMT)
OVER (
PARTITION BY D.TXN_ID,
D.TXN_SRC,
D.HRARCHY_DMN_ID,
D.PRDUCT_DMN_ID,
D.PSTNG_CRNCY_ID,
D.FIN_WEEK_NUM || D.FIN_YEAR_STRT || D.FIN_YEAR_END
ORDER BY D.VAL_DT)
AS WTD_PNL,
SUM (
MTM_AMT + CASH_AMT + PAY_REC_AMT)
OVER (
PARTITION BY D.TXN_ID,
D.TXN_SRC,
D.HRARCHY_DMN_ID,
D.PRDUCT_DMN_ID,
D.PSTNG_CRNCY_ID,
D.FIN_YEAR_STRT || D.FIN_YEAR_END
ORDER BY D.VAL_DT)
AS YTD_PNL,
D.ACNTNG_ENTRY_TYP AS ACNTNG_PSTNG_TYP,
'EOD ETL' AS CRTD_BY,
SYSTIMESTAMP AS CRTN_DT,
NULL AS MDFD_BY,
NULL AS MDFCTN_DT
FROM
DLY_HDRS D
Plan
SELECT STATEMENT ALL_ROWSCost: 11,950,256 Bytes: 3,369,680,886 Cardinality: 7,854,734
25 WINDOW SORT Cost: 11,950,256 Bytes: 3,369,680,886 Cardinality: 7,854,734
24 WINDOW SORT Cost: 11,950,256 Bytes: 3,369,680,886 Cardinality: 7,854,734
23 VIEW Cost: 10,519,225 Bytes: 3,369,680,886 Cardinality: 7,854,734
22 WINDOW BUFFER Cost: 10,519,225 Bytes: 997,551,218 Cardinality: 7,854,734
21 SORT GROUP BY Cost: 10,519,225 Bytes: 997,551,218 Cardinality: 7,854,734
20 HASH JOIN Cost: 10,296,285 Bytes: 997,551,218 Cardinality: 7,854,734
1 TABLE ACCESS FULL TABLE proj_PNL_TYP_DMN Cost: 3 Bytes: 45 Cardinality: 5
19 HASH JOIN Cost: 10,296,173 Bytes: 2,695,349,628 Cardinality: 22,841,946
5 VIEW VIEW index$_join$_007 Cost: 3 Bytes: 84 Cardinality: 7
4 HASH JOIN
2 INDEX FAST FULL SCAN INDEX (UNIQUE) proj_ACNT_DMN_PK Cost: 1 Bytes: 84 Cardinality: 7
3 INDEX FAST FULL SCAN INDEX (UNIQUE) proj_ACNT_DMN_UNQ Cost: 1 Bytes: 84 Cardinality: 7
18 HASH JOIN RIGHT OUTER Cost: 10,293,077 Bytes: 68,925,225,244 Cardinality: 650,237,974
6 TABLE ACCESS FULL TABLE proj_PSTNG_FCT Cost: 913,986 Bytes: 4,545,502,426 Cardinality: 77,042,414
17 VIEW Cost: 7,300,017 Bytes: 30,561,184,778 Cardinality: 650,237,974
16 MERGE JOIN Cost: 7,300,017 Bytes: 230,184,242,796 Cardinality: 650,237,974
8 SORT JOIN Cost: 30 Bytes: 87,776 Cardinality: 3,376
7 TABLE ACCESS FULL TABLE proj_DATE_DMN Cost: 29 Bytes: 87,776 Cardinality: 3,376
15 FILTER
14 SORT JOIN Cost: 7,238,488 Bytes: 25,269,911,792 Cardinality: 77,042,414
13 VIEW Cost: 1,835,219 Bytes: 25,269,911,792 Cardinality: 77,042,414
12 SORT GROUP BY Cost: 1,835,219 Bytes: 3,698,035,872 Cardinality: 77,042,414
11 HASH JOIN Cost: 914,089 Bytes: 3,698,035,872 Cardinality: 77,042,414
9 TABLE ACCESS FULL TABLE proj_DATE_DMN Cost: 29 Bytes: 94,960 Cardinality: 4,748
10 TABLE ACCESS FULL TABLE proj_PSTNG_FCT Cost: 913,693 Bytes: 2,157,187,592 Cardinality: 77,042,414 -
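One way to read the plan for the query above: the SORT GROUP BY produces roughly 77 million rows, and the MERGE JOIN in SLCT_RCRDS then expands each row across its value-date range, which is what drives the estimated cardinality to roughly 650 million. The arithmetic, using figures taken directly from the plan:

```python
grouped_rows = 77_042_414    # cardinality of the SORT GROUP BY step
expanded_rows = 650_237_974  # cardinality of the MERGE JOIN / VIEW step

# Average number of dates each grouped row is joined to in SLCT_RCRDS.
expansion_factor = expanded_rows / grouped_rows
print(round(expansion_factor, 2))  # about 8.44
```

This suggests the date-range non-equijoin, rather than the fact-table scan itself, generates most of the intermediate volume.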
XMLSerialize taking more than 10 hours to execute in Oracle
Hi All,
In my current project I first convert an Oracle query's result into XML format and then use XMLSerialize for printing, but execution takes more than 15 hours.
The basic Oracle query takes hardly 10 seconds to execute, and converting the query result to XML format takes around 1 minute, but when I use XMLSerialize with an ORDER BY clause it does not finish.
Can someone help fix this performance issue caused by XMLSerialize?
Thanks in advance.
After adding the clause below, the performance issue started:
select XMLSerialize(CONTENT rec_str as CLOB) as test_XML, 100 + rnum as ORDER_CLAUSE from xxtemp
Edited by: redrose1405 on May 1, 2012 12:45 AM
How much free space do you have on your boot drive?
OT -
Cancelling disc burn taking more than half an hour
I had just burned a CD before trying to burn another one, but that one wouldn't finish. The one before took less than two minutes, while this next one was taking more than ten. I decided to cancel it, but it is still in the cancelling sequence and I cannot get the CD out. Any advice?
Given that it's so new, and what you've already tried, it sounds like an issue that requires service.
You can arrange online service here.
Service request. -
Hello,
I know that there is an issue in BPEL with loading files of more than 7 MB. By increasing the JVM memory (using the mx parameter) we can bypass that (and it works fine). The problem is that we lose the execution trace of the BPEL process in the management console. Is there any workaround for that?
Thanks for your input / help on that issue.
By the way, since these are very large XML files with a lot of child nodes, we cannot split the XML file... ;-)
Message was edited by:
emarcoux -
While using the Status for Object button it takes more than 15 mins to open
Hi Gurus,
We are trying to attach documents to ZBOS and OR type sales documents. While opening the Status for Object button of the sales order, it takes more than 15 minutes to open; once it is open, it works as normal.
Can you please let us know whether it is the system functionality that causes it to take so much time to open, or whether the problem is something else?
Please also let us know whether it is a system-impacting process.
We are using 4.6C.
Thank You,
Boyeni.
Hi Syed,
Greetings!
Thank you very much for your swift response!
Could you be so kind as to let me know the program that needs to be refreshed?
Thank you once again for your assistance.
Boyeni.