Problem with update from ODS to Cube
Hi All,
I have an issue: when loading from ODS to Cube, everything was fine except one data package, which failed because of a locked object (user: myself, though I wasn't running anything else). But when I checked SM12, no locks existed for that particular time.
Apparently the load failed because of this single package. Since it is a load from ODS to Cube, there is no option for a manual update from the PSA.
Any suggestions? Do I have to repeat the whole load all over again? (It is 40 million records.)
Regards,
Robyn.
Hi Hoggard
Check the job log for your job in SM37 and check whether that data package failed with something like
"ARFCSTATE-SYSFAIL". If it is there, it means a deadlock occurred; please check the ST22 dump.
You can use the PSA:
1) Go to the 8ODS InfoSource and copy the delta InfoPackage.
2) In the Processing tab, check the option "PSA and then subsequently into the data targets".
3) Schedule the InfoPackage manually.
Hope it helps.
regards
AK
Similar Messages
-
Error in data update from ODS to Cube
hello Experts
We are working on BW 3.5
For FI-GL line items we load the data to an ODS and then to a Cube.
A process chain failure two months ago left the deltas stuck for two months. About 10 million records have now been uploaded into the ODS, but when I try to upload these records to the CUBE via delta upload it gives the error:
Value 'Ã
 ABC100711396 ' (hex. 'C30DC2A04C4344313030373131333936 ') of characteristic 0REF_DOC_NO c
I would like to know how to correct this and where this error is coming from, since the same data is loaded into the ODS and there it is OK. Secondly, as there is no PSA between ODS and Cube, do I need to delete the request and upload again?
What if the same error appears again? Should I write a start routine in the update rules between ODS and Cube for special characters?
Thanks for the replies.
Hi,
1) Go to All Elementary Tests in RSRV.
2) Go to Master Data.
3) Select the required test.
4) Select your InfoObject.
5) Execute the test.
You will get the result with errors or no errors.
If you get errors, click the "Correct Error" button.
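If the bad value keeps coming back with every load, the start routine the original poster asked about is the complementary fix: strip characters BW will not accept before the update. The real routine would be written in ABAP; here the cleansing logic is only sketched in Python, and the allowed-character set is an assumption approximating the BW default plus whatever is maintained in RSKC:

```python
# Sketch of the cleansing a start routine would perform (real BW routines
# are ABAP; this only illustrates the logic).
# Assumption: BW accepts roughly this set by default, plus any characters
# maintained in transaction RSKC.
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 !\"%&'()*+,-./:;<=>?_")

def clean_value(value: str, extra_allowed: str = "") -> str:
    """Drop characters BW rejects, e.g. the stray bytes (hex C3 0D C2 A0)
    in front of the document number from the error message."""
    allowed = ALLOWED | set(extra_allowed)
    return "".join(ch for ch in value.upper() if ch in allowed).strip()

print(clean_value("\u00c3\r\u00a0ABC100711396"))  # ABC100711396
```

In a real ODS-to-Cube start routine you would apply the same filter to the offending field (here 0REF_DOC_NO) in each DATA_PACKAGE record.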
Regards,
Suman -
Hi,
after the update from 7.6.00.37 to 7.6.03.15 in our live system, we noticed a lot of performance problems. We tested the new version on our test system beforehand, but did not notice effects like this there.
We have 2 identical systems (Opteron 64-bit, openSUSE 10.2) with log shipping between them. We updated the standby system first, switched from online to standby (with a copy of the cold log) and started the new server as the online system. After that we ran a complete backup (runtime: 1 hour) to start a new backup history and to activate autolog.
Then, with the update, we changed USE_OPEN_DIRECT to YES, but the performance of the system was very slow afterwards. After the backup it remained at a high load average (> 10; the previous version had about 2-4), with nearly 100% CPU usage for the db kernel process.
The next day we switched USE_OPEN_DIRECT back to NO. The system first ran better, but the load average periodically rises up to 6 and slows down the performance of various applications (some say about 10 times slower). Here we also noticed high usage (now 200-300%) by the db kernel process.
Our questions are:
1. Has something fundamentally changed from 7.6.00.37 to 7.6.03.15, so that our various applications (JDBC, ODBC and Perl/SQLDBC, partially on old Linux systems with drivers from 7.5.00.23) don't reach the same performance as before?
2. Are there any other (new) parameters which can help? Maybe reducing MAXCPU from 4 to 3 to reserve capacity for the operating system (there is only one MaxDB instance running)?
3. Is there a possibility to switch back to 7.6.00.37 (only for the worst case)?
I have made some first steps with x_cons, but don't see any anomalies at first glance.
Regards,
Thomas
Thomas Schulz wrote:
> > > Next day we switched USE_OPEN_DIRECT back to NO. The system first runs better, but
> >
> > What is it about this parameter that lets you think it may be the cause for your problems?
> After changing it back to NO, the system runs better (lower load average) than with YES (but much slower than with old version!)
Hmm... that is really odd. When using USE_OPEN_DIRECT there is actually less work to do for the operating system.
> > > Our questions are
> > >
> > > 1. Has something basically changed from 7.6.00.37 to 7.6.03.15, so that our various applications (JDBC, ODBC and Perl/SQLDBC partially on old linux systems with drivers from 7.5.00.23) don't reach same performance as before?
> >
> > Yes - of course. Changes are what Patches are all about!
>
> Are there any known problems with updating from 7.6.00.37 to 7.6.03.15?
Well, of course there are bugs that were found between the releases of both versions, but I am not aware of anything like a performance killer.
We will have to check this in detail here.
> > > I have made some first steps with x_cons, but don't see any anomalies on the first look.
> >
> > Ok, looking into what the system does when it uses CPU is a first good step.
> > But what would be "anomalies" to you?
>
> Good question! I don't really know.
Well - then I guess 'looking at the system' won't bring you far...
> > Do you use DBAnalyzer? If not -> activate it!
> > Does it give you any warnings?
> > What about TIME_MEASUREMENT? Is it activated on the system? If not -> activate it!
>
> OK, that will be our next steps.
Great - let's see the warnings you get.
Let us also see the DB Parameters you set.
> > What parameters have changed due to the patch installations (check the parameter history file)?
> >
> > What queries take longer now? What is the execution plan of them?
>
> It seems to happen for all selects on tables with a lot of rows (>10,000, partially without indexes because they were automatically generated). With the old version we had no problems with missing indexes or with general performance. Unfortunately it is very difficult to extract SQL statements out of the JBoss applications. But even simple queries (without any join) run slower when the load average rises over 4-5.
Hmm... the question here is still, if the execution plans are good enough to meet your expectations.
E.g. for tables that you access via the primary key it actually doesn't matter how many rows a table has (not for MaxDB at least).
> > BTW: how exactly do the tests look like that you've done on the testsystem?
> Usage over 6 weeks with our JDBC development environment (JBoss), backup and restore with various combinations of USE_OPEN_DIRECT and USE_OPEN_DIRECT_FOR_BACKUP.
Sounds like the I/O is your most suspect aspect for overall system performance...
> > Was the testsystem a 1:1 copy of the productive machine before the upgrade test?
> No - smaller hardware (32 bit), only 20% of data of the live system, few db users and applications.
>
> > How did you test the system performance with multiple parallel users?
> Only while permanent development with the 2-3 developers and some parallel tests of backup/restore. Unfortunately no tests with many users/applications.
Ok - so this is next to no testing at all when it comes to performance.
> An UPDATE STATISTICS over all db users seems to change nothing. At the moment the system remains markedly slow and we are searching for reasons and solutions. Another attempt will be the change of MAXCPU from 4 to 3.
Why do you want to do that? Have you observed any threads that don't get a CPU because all 4 cores are used by the MaxDB kernel?
regards,
Lars -
Updating the last date request from ODS to CUBE
Dear Friends,
Please can someone explain me.
I always update requests from ODS to cube. Sometimes there will be 3 to 4 requests in the ODS which don't have the data mart status (tick), and when I update them to the cube I can only see 1 request, which is the current date's request. I don't know whether the previous days' requests have been updated to the cube.
But I see in the ODS that the data mart tick is available for all the requests.
And please, can someone explain: if I have many requests in the ODS (current date, previous day, and so on), is there any way to update in such a way that I can see all the dates in the cube, instead of only 1 request with the current date?
When I delete the request from the cube, remove the tick from the ODS, and refresh, suddenly the tick for the current request and all the previous requests is gone...
Thanks for your help.
Will assign complete points.
Thank you so much.
Hi,
If you know how many requests get updated into the InfoCube as one request every day, then you can check the added records in the InfoCube: they should be equal to the sum of all those requests.
The transferred records can be more than or equal to the added ones; it also depends on the design of the update rules.
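The check described above is simple arithmetic; here is a hedged sketch in Python (the request names and record counts are made-up example numbers, not from the post):

```python
# Reconciling one cube request that aggregates several ODS requests.
# "Transferred" is what reaches the update rules; "added" is what survives
# them, so added <= transferred (hypothetical numbers below).
ods_request_records = {"REQU_1": 1200, "REQU_2": 800, "REQU_3": 500}

transferred = sum(ods_request_records.values())  # records sent to the cube
skipped_by_update_rules = 100                    # e.g. filtered by a routine
added = transferred - skipped_by_update_rules

print(transferred, added)  # 2500 2400
```

If added does not match transferred minus whatever your update rules deliberately drop, the routines in between are the first place to look.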
Hope this helps you ..
Regards,
shikha -
Error in updating data from ODS to CUBE.
Hi,
I am tryin to load data manually from ODS to CUBE in NW2004s.
This is a flat file load from the datasource to the ODS and then from the ODS to the CUBE.
In the CUBE, I am trying to populate fields by using the ODS fields.
For eg.
In the ODS, a CHAR InfoObject holds the data in timestamp format (i.e. mm/dd/yyyy hh:mm). I need to split this data and assign it to the two individual DATE and TIME InfoObjects in the CUBE.
For this, I have done the coding in the transfer structure, in the rule group.
The time field is getting populated, but the date field is not.
I get an error like:
<b>Value '04052007' for CHAR 0DATE is not plausible</b>
Because of this, the corresponding records are not getting displayed.
Also, for the records where the time is displayed, the date is not shown in spite of the date being correct.
Please help me with a solution for this.
<b><u><i>REMOVED</i></u></b>
Thanks In Advance.
Hitesh Shetty
Hello Hitesh,
SAP accepts dates in the format YYYYMMDD, so in the routine where you concatenate the day, month and year, just do it in reverse order (year first).
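In BW the split itself would be done in an ABAP routine in the transfer/update rules; purely as an illustration of the conversion logic (the function name and sample value are hypothetical), here it is sketched in Python:

```python
# Splitting an 'mm/dd/yyyy hh:mm' CHAR value into SAP-internal
# 0DATE (YYYYMMDD) and 0TIME (HHMMSS) values.
def split_timestamp(ts: str) -> tuple[str, str]:
    date_part, time_part = ts.strip().split(" ")
    month, day, year = date_part.split("/")
    hour, minute = time_part.split(":")
    # Year first: '05/04/2007' must become '20070504', not '04052007'.
    return year + month + day, hour + minute + "00"

print(split_timestamp("05/04/2007 13:45"))  # ('20070504', '134500')
```

The "not plausible" error above is exactly what happens when the pieces are concatenated in display order instead of year-month-day order.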
Thanks
Tripple k -
Nowadays many users have problems with the update to iOS 7 and needing to activate with an Apple ID. Maybe in the future, to escape these problems, people will have to stop using these products altogether, because ordinary users don't know about this technology and sometimes just hear from other users that it is difficult to use, which forces them to change to another phone they can handle.
It is a feature to discourage the theft of iPhones by making them useless if resold. It's not going anywhere. It's simple: just don't buy a phone until you make sure that the activation lock has been disabled.
-
I have an iPhone 4S and I updated to the new version 6.1, and I have had a problem with WiFi since I installed this update. What is this problem? Thanks, Andres
Move or Copy the ENTIRE iTunes folder from the old computer or the backup of the old computer to the new computer.
-
Update of data from ODS to cube
hi all,
I have 8 activated requests in an ODS. Now I want to load data into a cube from this ODS.
Is it possible to update the requests one by one, so that I have the same number of requests in the cube as in the ODS?
Thanks in advance.
Hi,
If you want to do it like that, you have to create an InfoPackage which updates data from ODS to cube. Go to the data mart ODS in your InfoSources, create an InfoPackage, and you can give the request ID in the selection and load the requests one by one.
Assign points if useful
Thanks
N Ganesh -
Error while loading data from ODS to CUBE.
Hi friends,
When I am loading data from ODS to Cube with the help of the data mart, I am getting an error in the QA system; in the DM system, everything went well. If I look at the Details tab in the monitor, under Processing,
it shows the following:
Transfer rules: missing message.
Update PSA: missing message.
Processing end: missing message.
I have checked the coding in the update rules; everything is OK.
Please, any inputs?
hari
Message was edited by:
hari reddy
It might mean that the IDoc flow is not defined properly in your QA system for the Myself source system.
Regards,
Vitaliy -
Push from ODS to cube taking a long time
Hi All,
I've created a new ODS that pushes data to a new cube. I am trying to load 3000 records and it's taking over 30 minutes, and I can never get it to complete successfully. When I look in SM50, there is nothing running. There are no short dumps in ST22. But when I look in SM21 I get the following:
Documentation for system log message R6 8 :
An error has caused an SAP rollback. All database updates are reset.
Technical details
File................ 009581
Position............ 0000072900
Entry type.......... m ( Error (Function,Module,Row) )
Message ID.......... R6 8
Variable parts...... ThIRollroll bathxxhead1248
My update rules between the ODS and cube are very straightforward: all fields are a direct mapping from the ODS, no routines. I have a start routine, but even when I comment out the entire start routine and run the update between ODS and cube, I still get the same issue. We are on version 3.5.
Any suggestions?
Thanks
Charla
Hi,
Try "Generate export DataSource" on the ODS and load the data again. Sometimes there is a problem in the DDIC tables and the structures are not generated properly.
Regards,
Kams -
Error when trying to load data from ODS to CUBE
hi,
I am getting a short dump when trying to load data from ODS to CUBE. The runtime error is 'TYPELOAD_NEW_VERSION' and the short text is 'A newer version of data type "/BIC/AZODS_CA00" was found than the one required'. Please help me out.
Hi,
Check this thread; Ajeet Singh has given a good solution there:
Re: Error With Data Load-Getting Canceled Right Away
Also check SAP Note 382480 for your reference:
Symptom
A DART extraction job terminates with runtime error TYPELOAD_NEW_VERSION and error message:
Data type "TXW_INDEX" was found in a newer version than required.
The termination occurs in the ABAP/4 program "SAPLTXW2 " in "TXW_SEGMENT_RECORD_EXPORT".
Additional key words
RTXWCF01, LTXW2U01, TXW_INDEX
Cause and prerequisites
This problem seems to happen when several DART extraction jobs are running in parallel, and both jobs access table TXW_INDEX.
Solution
If possible, avoid running DART extractions in parallel.
If you do plan to run such jobs in parallel, please consider the following points:
In the DART Extract configuration, increase the value of the parameter "Maximum memory allocation for index (MB)" if possible. You can estimate reasonable values with the "File size worksheet" utility.
Run parallel DART jobs on different application servers.
As an alternative, please apply note 400195.
It may help you.
Regards,
Debjani....... -
Data load error from ODS to Cube?
Guru's,
Here is the status of a data load; please advise.
We are loading data from ODS to Cube as a full load. The loading is done through a process chain and runs every half hour; each time, the earlier load is deleted and the next load is taken into the cube. Two days ago the load went wrong. When I tried to see the cause of the failure in the monitor details, I could see 'Update missing' and 'Processing missing'. When I look at the process chain's log view, I get info stating that some invalid characteristic has appeared. But when I search in the PSA, all of the data packets are green. If there were any invalid characteristics, the PSA should show a red status, but it does not. Please guide me on how I can solve this.
Points will definitely be awarded for proper answers.
Thanks in advance.
vasu.
Hi Vasu,
Once the edit is done and the data is reloaded, it won't show you the records in red.
Hope it helps!
Regards
KISHORE M REDDY
**Winners Don't Do Different things,They Do things Differently...!**
> I don't think this is about disk space, because after
> every full load we are deleting the data. -
Delta Load failure from ODS to Cube
Experts,
After a CRM upgrade, I had to reinitialize the delta process. For this, I reloaded the ODS with a first init delta without data, and then a delta load with the new data created in CRM. This worked well so far. After that, when I tried to load the cube with init delta without data, there were no issues. But the delta load from ODS to Cube doesn't work. Does anybody have any suggestions, please?
Thanks,
Nimesh
Following error observed in Status for delta load from ODS to Cube.
Error when transferring data; communication error when analyzing
Diagnosis
Data packets or info packets are missing in BW, but there were, as far as can be seen, no processing errors in the source system. It is
therefore probable that an error arose in the data transfer.
With the analysis, an attempt was made to read the ALE outbox of the source system, which led to error .
It is possible that no connection exists to the source system.
Procedure
Check the TRFC overview in the source system.
Check the connection of the source system for errors and check the
authorizations and profiles of the remote user in both the BW and
source systems.
Check the ALE outbox of the source system for IDocs that have not been
Hi,
As far as I understood, you have successful deltas loading to the ODS and you want to update the entire ODS data to the Cube, followed by daily deltas.
If this is the case:
Make sure that active update rules exist between the ODS and the Cube.
First go to your ODS, right-click, and select the option 'Update ODS data in data target'.
Select init update. It will take you to the init InfoPackage, where you select init <b>with data transfer</b> (init with data transfer will bring all the records from ODS to Cube).
Once the init is completed, you can schedule delta loads regularly from ODS to Cube by following the same steps.
Hope this helps
Praveen
Message was edited by:
Praveen Vujjini -
Delta records not updating from DSO to CUBE in BI 7
Hi Experts,
Delta records are not updating from DSO to CUBE.
In the DSO the key figure value shows '0', but in the CUBE the same record shows '-1'.
I checked the change log table of the DSO; it has 5 records:
ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - -1
ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0
ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0
ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1
ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0
but the active data table has one record: 0
How do I correct the delta load?
Regards,
Jai
Hi,
I think initially the value was 0 (ODSR_4LH8CXKUJPW2JDS0LC775N4MH - 0, new image in the change log) and this got loaded to the cube.
Then the value got changed to 1 (ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 0, before image, & ODSR_4LIF02ZV32F1M85DXHUCSH0DL - 1, after image). This pair updates the cube with value 1, so the cube has 2 records: one with value 0 and the other with 1.
The value got changed again to 0 (ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - (-1), before image, &
ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M - 0, after image). These records get aggregated and update the cube with (-1).
The cube has 3 records with values 0, 1 and -1; the effective total is 0, which is correct.
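The bookkeeping in this answer can be replayed mechanically. As an illustrative sketch (Python rather than ABAP; the request IDs and values are copied from the post):

```python
from collections import defaultdict

# Replaying the DSO change log into an additive cube. Each change writes a
# before image (old value negated) and an after image (new value); the very
# first load writes only a new image.
change_log = [
    ("ODSR_4LH8CXKUJPW2JDS0LC775N4MH", 0),   # new image: initial value 0
    ("ODSR_4LIF02ZV32F1M85DXHUCSH0DL", 0),   # before image of 0 -> 1 (i.e. -0)
    ("ODSR_4LIF02ZV32F1M85DXHUCSH0DL", 1),   # after image: new value 1
    ("ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M", -1),  # before image of 1 -> 0
    ("ODSR_4LKIX7QHZX0VQB9IDR9MVQ65M", 0),   # after image: new value 0
]

per_request = defaultdict(int)  # each delta request is aggregated first
for request, value in change_log:
    per_request[request] += value

cube_records = list(per_request.values())  # one additive cube record each
print(cube_records, sum(cube_records))     # [0, 1, -1] 0
```

So the cube really does carry records 0, 1 and -1, and a query summing them shows 0, matching the active table.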
Is this not what you see in the cube? were the earlier req deleted from the cube? -
Problem with updating the PC Updater software
Good evening (from Germany). I have a problem with updating my Nokia Software Updater. I can download the setup and execute it, but when the installation starts (0%) an error message appears. It's a problem with Windows Installer, but I don't know how to solve it. Here is a picture with the exact error message:
http://img-up.net/?up=Nokia_probES8HJXXg.jpg
I hope somebody knows the answer to this software problem...
Greetings
OS: Windows XP SP2
Phone: Nokia 3110 classic Firmware: V 5.50
Message Edited by dernamenlose on 24-Apr-2008 08:56 PM
Good morning (from Denmark).
I have the same problem.
OS: WinXP SP2
Phone: N73.
Greetings.