Data load management and CTS
Hi SDN,
Can anyone explain data load management and CTS?
Regards,
Andrea
Hi Dipika,
Whenever a delta fails, we have to run a repeat delta to bring the earlier delta records along with the current delta.
If your data load fails in the ODS, it won't have any impact on your cube: manually turn the status of the request to red, delete the failed request from the target, and trigger a repeat delta.
If the datamart load from this ODS to the cube fails, first analyze the reason for the failure. If the DSO updates a single cube, you can follow the same steps. If it updates multiple targets, delete the bad request from the targets, reset the datamart status in the ODS for that request, and trigger the load again. If it fails in only one target, you can load the same delta by creating a separate InfoPackage and selecting only that target for that instance.
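As a sketch, the decision flow above can be modeled like this (all names are hypothetical; in BW these steps are carried out manually in the Data Warehousing Workbench, not through any Python API):

```python
from dataclasses import dataclass

@dataclass
class FailedDeltaLoad:
    target_type: str      # "ODS" (load into the ODS failed) or
                          # "DATAMART" (the ODS -> cube step failed)
    n_targets: int = 1    # how many targets the DSO feeds
    failed_targets: int = 1

def recovery_steps(load: FailedDeltaLoad) -> list:
    """Return the manual recovery actions described in the post above."""
    repeat = ["set request status to red",
              "delete failed request from target",
              "trigger repeat delta"]
    if load.target_type == "ODS" or load.n_targets == 1:
        return repeat
    if load.failed_targets == 1:
        # Only one of several targets failed: reload just that one
        # with a separate InfoPackage restricted to it.
        return ["create separate InfoPackage for the failed target",
                "reload the same delta into that target only"]
    # Several targets failed: clean up and rerun the whole datamart load.
    return ["delete bad request from targets",
            "reset datamart status in the ODS for that request",
            "trigger the load again"]
```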
Thanks,
Sathish.
Similar Messages
-
Hi,
I am getting different record count for Full and Init GL - Data loads.
Count for Init is about 300 records less than the count for Full load.
What could be the reason ?
Thanks.
While posting a question, be clear about which cube/DataSource and which GL (old or new) you are working with, and what the background is; otherwise it is just speculation and beating around the bush.
http://help.sap.com/saphelp_erp2005vp/helpdata/en/45/757140723d990ae10000000a155106/content.htm
New Gl data flow-
Re: New GL cubes 0FIGL_V10, 0FIGL_V11
Hope it Helps
Chetan
@CP.. -
Data Load file and Rule file Creation
Hi,
I used to create rule files and data files for loading data into Essbase 7.1.6.
I haven't worked in Essbase for the past two years and have forgotten the options, the field properties, and how data load files are created.
Could you please advise me, or point me to a demo, on creating rule files as well as data files?
Two things I could suggest:
1. look at the Sample.basic application it has dimension and data load rules for all sorts of scenarios
2. Come to my session at Kaleidoscope, "Rules Files: Beginning to Advanced", where I go over some of the more interesting things you can do with rules files. -
Data Loading(Before and After Image)
I heard that if a DataSource delivers both before and after images, the data can be sent directly to the InfoCube, or from the DSO to the InfoCube, whereas
if a DataSource supports only after images, it first has to be loaded to the DSO and then to the InfoCube.
My question is: how do I find out the image type of a DataSource?
Hi Ravi,
Check the ROOSOURCE table in ECC. You can find the behavior options in the DELTA field; based on this table we can say whether the delta will support a Cube or an ODS directly.
Delta Only Via Full Upload (ODS or InfoPackage Selection)
A ALE Update Pointer (Master Data)
ABR Complete Delta with Deletion Flag Via Delta Queue(Cube-Comp)
ABR1 Like Method 'ABR' But Serialization Only by Requests
ADD Additive Extraction Via Extractor (e.g. LIS Info Structures)
ADDD Like 'ADD' But Via Delta Queue (Cube-Compatible)
AIE After-Images Via Extractor (FI-GL/AP/AR)
AIED After-Images with Deletion Flag Via Extractor (FI-GL/AP/AR)
AIM After-Images Via Delta Queue (e.g. FI-AP/AR)
AIMD After-Images with Deletion Flag Via Delta Queue (e.g. BtB)
CUBE InfoCube Extraction
D Unspecific Delta Via Delta Queue (Not ODS-Compatible)
E Unspecific Delta Via Extractor (Not ODS-Compatible)
FIL0 Delta Via File Import with After-Images
FIL1 Delta Via File Import with Delta Images
NEWD Only New Records (Inserts) Via Delta Queue (Cube-Compatible)
NEWE Only New Records (Inserts) Via Extractor (Cube-Compatible)
O
ODS ODS Extraction
X Delta Unspecified (Do Not Use!) -
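To connect the table above back to the question: the after-image-only methods (AIE, AIM, and their deletion-flag variants) cannot be summed straight into an additive cube, which is why they must be staged in a DSO first. A small lookup over a few of the values above (an illustrative sketch, not an SAP API):

```python
# Subset of the ROOSOURCE DELTA values listed above, with whether the
# delta records can go straight into an InfoCube. Illustrative only.
DELTA_METHODS = {
    "ABR": {"images": "after/before/reverse", "cube_ok": True},
    "ADD": {"images": "additive",             "cube_ok": True},
    "AIE": {"images": "after only",           "cube_ok": False},
    "AIM": {"images": "after only",           "cube_ok": False},
}

def first_staging_target(method: str) -> str:
    """After-image-only deltas would double-count in an additive cube,
    so they are overwritten by key in a DSO before reaching the cube."""
    if DELTA_METHODS[method]["cube_ok"]:
        return "InfoCube or DSO"
    return "DSO first, then InfoCube"
```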
Data load - Customer and quotation
Hi
I need to load customer data from the legacy system into SAP using LSMW. I found the standard program RFBIDE00 for this. Will this program help me load the data for all account groups?
My second question: is there any standard program, BAPI, or IDoc to load quotations (VA21) into SAP? How would this be accomplished?
Thanks
1) You enter the account group in BKN00-KTOKT, so I don't see any restriction on the account group.
2) See if:
BAPI_QUOTATION_CREATEFROMDATA Customer quotation: Create customer quotation
BAPI_QUOTATION_CREATEFROMDATA2 Customer Quotation: Create Customer Quotation
is what you want.
Rob -
Two issues: activation of transfer rules and data load performance
hi,
I have two problems I face very often and would like to get some more info on that topics:
1. Transfer rules activation. I just finished transporting my cubes, ETL, etc. to the productive system and started filling the cubes with data. Very often during a data load it turns out that the transfer rules need to be activated, even though I transported them active and (I think) did not change anything after the transport. Then I have to create transfer rule transports on dev again, transport the changes to prod, and execute the data load again.
It is very annoying. What do you suggest to do about this problem? Activate all transfer rules again before executing the process chain?
2. Differences between dev and prod systems in data load time.
On the dev system (a copy of production made about 8 months ago) I checked how long it takes to extract data from the source system: it was about 0.5 h for 50,000 records. But when I executed the load on production, it was 2 h for 200,000 records, so it felt twice as slow as dev!
I thought it would be at least as fast as the dev system. What can influence data load performance, and how can I predict it?
Regards,
AndrzejAksik
1. How frequently does this activation problem occur? If it is a one-off, replicate the DataSource and activate the transfer structure. (In general, as you know, activation of the transfer structure should happen automatically after transport of the object.)
2. One reason for the time difference is environmental: in a production system many jobs run at the same time, so system performance will obviously be slower compared to the dev system. In your case, though, both systems are performing equally: in dev, 50,000 records took half an hour; in production, 200,000 records took 2 hours. There are simply more records in production, so the load took longer. If it is really causing a problem, then you have to do some performance tuning.
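The arithmetic behind that last point is worth spelling out: the two systems have exactly the same per-record throughput.

```python
dev_rate  = 50_000 / 0.5   # records per hour on dev
prod_rate = 200_000 / 2.0  # records per hour on production

# Production is not "twice as slow": it simply processed four times
# as many records in four times the time.
print(dev_rate == prod_rate)  # True (both 100,000 records/hour)
```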
Hope this helps
Thanks
Sat -
Error 8 when starting the extracting the program-data load error:status 51
Dear all,
<b>I am facing a data extraction problem while extracting data from an SAP source system (Development Client 220).</b> The scenario and related settings are as follows:
A. Setting:
We have created 2 source system one for the development and another for the quality in BW development client
1. BW server: SAP NetWeaver 2004s BI 7
PI_BASIS: 2005_1_700 Level: 12
SAP_BW: 700 Level:13
Source system (Development Client 220)
2. SAP ERP: SAP ERP Central Component 6.0
PI_BASIS: 2005_1_700 Level: 12
OS: SunOS
Source system (Quality Client 300)
2. SAP ERP: SAP ERP Central Component 6.0
PI_BASIS: 2005_1_700 Level: 12
OS: HP-UX
B. The scenario:
I was able to load the InfoProvider from the source system (Development Client 220). Later we created another source system (Quality Client 300) and were able to load the InfoProvider from that as well.
After creating the second source system (Quality Client 300), I was initially able to load the InfoProvider from both source systems, but now I am unable to load the InfoProvider from Development Client 220.
Source system Creation:
For both 220 and 300, the background user in the source system is the same, of type system, with the same authorizations (SAP_ALL, SAP_NEW, S_BI-WX_RFC), and the user for the source-system-to-BW connection is of type dialog (S_BI-WX_RFC, S_BI-WHM_RFC, SAP_ALL, SAP_NEW).
1: Now, when I start the data load immediately from the InfoPackage, I get the e-mail sent by user RFCUSER with the following content:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Service API .
2:Error in the detail tab of the call monitor as under,
<b>BI data upload error: status 51 - error: Error 8 when starting the extraction program</b>
Extraction (messages): Errors occurred
Error occurred in the data selection
Transfer (IDocs and TRFC): Errors occurred
Request IDoc : Application document not posted (red)
bw side: status 03: "IDoc: 0000000000007088 Status: Data passed to port OK,
IDoc sent to SAP system or external program"
<b>r/3 side: status 51: IDoc: 0000000000012140 Status: Application document not posted
Error 8 when starting the extraction program</b>
Info IDoc 1 : Application document posted (green)
r/3 side: "IDoc: 0000000000012141 Status: Data passed to port OK
IDoc sent to SAP system or external program"
bw side: "IDoc: 0000000000007089 Status: Application document posted,
IDoc was successfully transferred to the monitor updating"
Have attached screen shots showing error at BW side.
The connection check is OK; I tried to restore the settings for the BW-R/3 connection, but the problem remains.
Have checked partner profile.
<b>what's wrong with the process? </b>
Best regards,
dushyant.
Hi,
Refer note 140147.
Regards,
Meiy -
4.2.3/.4 Data load wizard - slow when loading large files
Hi,
I am using the data load wizard to load csv files into an existing table. It works fine with small files up to a few thousand rows. When loading 20k rows or more the loading process becomes very slow. The table has a single numeric column for primary key.
The primary key is declared at "shared components" -> logic -> "data load tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
While loading data, this configuration leads to the execution of the following query for each row:
select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
which can be found in the v$sql view while loading.
It makes the loading process slow, because the upper function prevents any index from being used.
It seems that the setting of "case sensitive" is not evaluated.
Dropping the numeric index for the primary key and using a function based index does not help.
Explain plan shows an implicit "to_char" conversion:
UPPER(TO_CHAR(PK))=UPPER(:UK_1)
This is missing in the query but maybe it is necessary for the function based index to work.
Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
Best regards
Klaus
Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
If all of the CSV files are identical:
use the Excel2Collection plugin ( - Process Type Plugin - EXCEL2COLLECTIONS )
create a VIEW on the collection (makes it easier elsewhere)
create a procedure (in a Package) to bulk process it.
The most important thing is to have, somewhere in the package (i.e. in your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table or view, and to the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
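The index problem Klaus describes can be reproduced in miniature with SQLite (a sketch; Oracle's optimizer behaves analogously). Wrapping the column in upper() defeats the plain primary-key index, while an index on the same expression makes the predicate indexable again; in Klaus's case, the index expression would additionally have to include the implicit TO_CHAR to match the query.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE load_rows (pk TEXT PRIMARY KEY)")
con.executemany("INSERT INTO load_rows VALUES (?)",
                [(f"id{i}",) for i in range(1000)])

def plan(sql):
    """Concatenate the EXPLAIN QUERY PLAN detail strings for a query."""
    return " ".join(row[-1] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT 1 FROM load_rows WHERE upper(pk) = upper('id7')"

before = plan(q)   # a SCAN: upper() prevents an index seek on pk
con.execute("CREATE INDEX ix_upper ON load_rows (upper(pk))")
after = plan(q)    # a SEARCH using the expression index

print(before)
print(after)
```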
MK -
Data loader : Import -- creating duplicate records ?
Hi all,
has anyone else encountered the behavior with Oracle Data Loader where duplicate records are created (even with the option duplicatecheckoption=externalid set)? When I check the "import request queue" view, the request parameters of the job look fine:
Duplicate Checking Method == External Unique ID
Action Taken if Duplicate Found == Overwrite Existing Records
but Data Loader has created new records where the "External Unique ID" already exists.
Very strange: when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
I know the Data Loader has two methods, one for update and the other for import; however, I do not expect the import to create duplicates if the record already exists, rather than doing nothing!
Is anyone else experiencing the same? I hope this is not expected behavior! By the way, the "Update" method works fine.
thanks in advance, Juergen
Edited by: 791265 on 27.08.2010 07:25
Edited by: 791265 on 27.08.2010 07:26
Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before the full load; that is a best practice for data import that we recommend in our documentation and courses.
Sorry also to inform you that this is expected behavior --- Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
You should review all documentation on Oracle Data Loader On Demand before using it.
These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
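Pete's distinction can be modeled in a few lines (a toy model, not the real tool): the import action inserts unconditionally, while the update action is the only one that consults the external ID.

```python
def import_records(store, records, key="external_id"):
    """Data Loader 'import' (toy model): insert every record, never
    deduplicate. Re-importing an existing external ID therefore
    creates a duplicate, as Juergen observed."""
    store.extend(records)

def update_records(store, records, key="external_id"):
    """Data Loader 'update' (toy model): overwrite when the external
    ID matches an existing record, otherwise leave the store untouched."""
    index = {rec[key]: i for i, rec in enumerate(store)}
    for rec in records:
        if rec[key] in index:
            store[index[rec[key]]] = rec
```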
Pete -
Error installing DPM Agent in Update Rollup 5 for System Center 2012 R2 Data Protection Manager
I have updated my main DPM server to Update Rollup 5 for System Center 2012 R2 Data Protection Manager and all went well.
All Protected servers updated and rebooted and continued protection - EXCEPT one;
Physical Windows Server 2012 R2 Datacenter.
Tried Manual install - no luck
Removed Protection Agent / rebooted / re-installed - no luck
Installed all updates / re-install - no luck
this seems to be the only indicated problem in MSDPMAgentInstall.LOG.....
Property(S): PATCHMediaSrcProp = C:\Windows\Microsoft Data Protection Manager\DPM\ProtectionAgents\RA\4.2.1292.0\DPMProtectionAgent_KB3021791.msp
MSI (s) (04:34) [09:24:23:772]: Product: Microsoft System Center 2012 R2 DPM Protection Agent - Update 'Microsoft System Center 2012 R2 DPM Protection Agent Update - KB3021791' could not be installed. Error code 1603. Additional information is available in
the log file C:\Windows\\Temp\MSDPMAgentInstall.LOG.
MSI (s) (04:34) [09:24:23:772]: Windows Installer installed an update. Product Name: Microsoft System Center 2012 R2 DPM Protection Agent. Product Version: 4.2.1292.0. Product Language: 1033. Manufacturer: Microsoft Corporation. Update Name: Microsoft System
Center 2012 R2 DPM Protection Agent Update - KB3021791. Installation success or error status: 1603.
MSI (s) (04:34) [09:24:23:772]: Note: 1: 1729
MSI (s) (04:34) [09:24:23:772]: Note: 1: 2205 2: 3: Error
MSI (s) (04:34) [09:24:23:772]: Note: 1: 2228 2: 3: Error 4: SELECT `Message` FROM `Error` WHERE `Error` = 1729
MSI (s) (04:34) [09:24:23:772]: Note: 1: 2205 2: 3: Error
MSI (s) (04:34) [09:24:23:772]: Note: 1: 2228 2: 3: Error 4: SELECT `Message` FROM `Error` WHERE `Error` = 1709
MSI (s) (04:34) [09:24:23:772]: Product: Microsoft System Center 2012 R2 DPM Protection Agent -- Configuration failed.
MSI (s) (04:34) [09:24:23:772]: Windows Installer reconfigured the product. Product Name: Microsoft System Center 2012 R2 DPM Protection Agent. Product Version: 4.2.1292.0. Product Language: 1033. Manufacturer: Microsoft Corporation. Reconfiguration success
or error status: 1603.
Any help would be greatly appreciated.
Hotfix for a known issue with Update Rollup 5 for System Center 2012 R2 Data Protection Manager:
http://www.microsoft.com/en-us/download/details.aspx?id=45914&WT.mc_id=rss_alldownloads_all
Have a nice day !!!
DPM 2012 R2: Remove Recovery Points -
Troubleshooting 9.3.1 Data Load
I am having a problem with a data load into Essbase 9.3.1 from a flat, pipe-delimited text file, loading via a load rule. I can see an explicit record in the file but the results on that record are not showing up in the database.
* I made a special one-off file with the singular record in question and the data loads properly and is reflected in the database. The record itself seems to parse properly for load.
* I have searched the entire big file (230Mb) for the same member combination, but only come up with this one record, so it does not appear to be a "last value in wins" issue.
* Most other data (610k+ rows) appears to be loading properly, so the fields, in general, are being properly parsed out in the load rule. Additionally, months of a given item are on separate rows, and other rows of the same item are loading properly and being reflected in the database. As well as other items are being loaded properly in the months where this data loads to, so, it is not a metadata-not-existing issue.
* The load is 100% successful according to the non-existent error file. Also, loading the file interactively results in the file showing up under "loaded successfully" (no errors).
NOTE:
The file's last column contains item descriptions, which may include special characters such as periods and quotes. The load rule moves the description field earlier in the column order, but the file itself has it last.
QUESTION:
Is it possible that a special character (a quote?) in a preceding record is causing the field parsing to include the CR/LF, and therefore the next record, in one record? I keep thinking that if the record is fine alone, but not fine sitting amongst the other records, it may have to do with the preceding or subsequent records.
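One quick way to test that hypothesis is to scan the big file for rows whose quote count is odd, or whose field count is off; an unbalanced quote is exactly what would let a parser swallow the line break. A sketch, assuming the flat, pipe-delimited layout described above:

```python
def suspect_rows(path, delimiter="|", quote='"', expected_fields=None):
    """Return (line_number, reason) pairs for rows that could break
    field parsing: an odd number of quote characters, or an unexpected
    number of fields. Layout assumptions: flat file, pipe-delimited,
    free-text description in the last column."""
    problems = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for lineno, line in enumerate(fh, start=1):
            row = line.rstrip("\r\n")
            if row.count(quote) % 2 == 1:
                problems.append((lineno, "odd quote count"))
            elif expected_fields is not None and \
                    len(row.split(delimiter)) != expected_fields:
                problems.append((lineno, "unexpected field count"))
    return problems
```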
THOUGHTS??
Thanks Glenn. I was so busy looking for explicit members that I neglected to think through implicit members. I guess I assumed that implied members don't work if you have a rules file that parses out columns, and that a missing member would just error out a record instead of using the last known value. In fact, I thought that (last known value) only applied if you didn't use a load rule.
I would prefer some switch in Essbase that either requires keys in all fields in a load rule or allows last known value. -
Process Chain Data load fail /send email is not woking
Hello Experts,
I have created a process chain for the daily data load, and created a message variant to send an e-mail if the data load fails, but it's not working. I am not sure what I am missing. Here are the steps I took:
1. Right-clicked the InfoPackage in the process chain, selected Create Message, selected the Errors radio button, and then created a new variant.
2. In the variant screen, clicked Maintain Recipient List, entered the recipient names and mail options (express mail, send copy, etc.), and saved.
I didn't select anything under Recipient type; I'm not sure whether that's needed.
I greatly appreciate your help.
Thanks
Ram
Message was edited by: Sri Vani
Solved by myself.
Thanks a lot.
Message was edited by: Sri Vani
Did you configure the Basis process to send external e-mails and the automatic output process?
You can see this in transaction SCOT:
check whether there is any record under the tree (SAP SERVER) -> INT -> SMTP.
If the automatic output is not configured there, you can send the e-mails manually.
If this doesn't work, you need to talk to your Basis support about the e-mail configuration.
Regards -
I am having an issue where the data that drives a tilelist
works correctly when the tile list is not loaded on the first page
of the application. When it is put on a second page in a viewstack
then the tilelist displays correctly when you navigate to it. When
the tilelist is placed in the first page of the application I get
the correct number of items to display in the tilelist but the
information the item renderer is supposed to display, ie a picture,
caption and title, does not. The strange thing is that a Tree
populates correctly given the same situation. Here is the sequence
of events:
// get tree is that data for the tree and get groups is the
data for the tilelist
creationComplete="get_tree.send();get_groups.send();"
<mx:HTTPService showBusyCursor="true" id="get_groups"
url="[some xml doc]" resultFormat="e4x"/>
<mx:XMLListCollection id="myXMlist"
source="{get_groups.lastResult.groups}"/>
<mx:HTTPService showBusyCursor="true" id="get_tree"
url="[some xml doc]" resultFormat="e4x" />
<mx:XMLListCollection id="myTreeXMlist"
source="{get_tree.lastResult.groups}"/>
And then the data provider of the tilelist and tree are set
accordingly. I tried moving the data calls from the
creation complete to the initialize event thinking that it would
hit earlier in the process and be done by the time the final
completion came about but that didn't help either. I guess I'm just
at a loss as to why the tree works fine no matter where I put it
but the TileList does not. It's almost like the tree and the
tilelist will sit and wait for the data but the item renderer in
the tilelist will not wait. Which would explain why clicking on the
tile list still produces the correct sequence of events but the
visual component of the tilelist is just not working right. Anyone
have any ideas?
Ok, so if the ASO value is wrong, then it's a data load issue and there is no point messing around with the BSO app. You are loading two transactions to the exact same intersection. Make sure your data load is set to aggregate values, not overwrite.
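The difference between the two load settings can be shown with a toy model of cells keyed by intersection (illustrative only; the real setting lives in the Essbase load rule):

```python
def load_cells(records, mode="overwrite"):
    """Toy Essbase data load: 'records' are (intersection, value) pairs,
    where an intersection is a tuple of member names. With two
    transactions hitting the same intersection, 'overwrite' keeps only
    the last value, while 'aggregate' sums them."""
    cells = {}
    for intersection, value in records:
        if mode == "aggregate" and intersection in cells:
            cells[intersection] += value
        else:
            cells[intersection] = value
    return cells
```

With two transactions of 40 and 60 on the same intersection, overwrite stores 60 while aggregate stores 100, which is the discrepancy described above.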
-
Spread the Data Loads in a SAP BW System
Gurus,
I want to spread the data loads in our BW system. As a Basis person, how do I identify whether jobs are full loads or delta loads? Our goal is to distribute the load on the system evenly, as we see too many data loads starting and running around the same time. Can you suggest the right approach to achieve this?
Thanks in advance
SivaHello Siva,
As already mentioned, the solution is to include the different steps of the data flow (extraction, ODS activation, rollup, etc.)
in process chains and schedule these chains to run at different times so that they do not place too much load on the system.
If the problem is specific to the extraction step of the data loads, then I guess you may be seeing the resource problem on the
source system side. If you don't have load distribution switched on in the RFC connection to the source system, it is
possible to specify that the source system extraction jobs are executed on a particular application server;
please see the information in the 'Solution' part of note 147104 and read it carefully.
Best Regards,
Des -
Hi Friends,
I need some help with cube deployment using Essbase Studio with Oracle as my source. I am trying to deploy a cube; my dimensions load properly, but the data load fails and gives the following error:
*"Cannot get async process state. Essbase Error(1021001): Failed to Establish Connection With SQL Database Server. See log for more information"*
Kindly advice.
Thanks
Andy
Used custom data load settings with the group by function.