Troubleshooting data load procedure
Hi folks,
I am trying to load data into a model built in cube builder. My data load procedure, which loads a single variable, is failing. Here are the details:
Dim structure as follows:
DIM CHANNEL:
SALES_CHANNEL (Output)
SUB_CHANNEL (Input)
DIM BUSINESS_UNIT:
BUS_UNIT (Output)
BDM (Input)
Data is in an MS Access table called FACT_NETSALES, with columns SALES_CHANNEL, SUB_CHANNEL, BUS_UNIT, BDM, and one column per month from Oct 2009 to Nov 2010.
Data sample:
SALES_CHANNEL SUB_CHANNEL BUS_UNIT BDM Oct-09 Nov-09 Dec-09
CH1 CH1SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH2 CH2SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH2 CH2SUB2 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH2 CH2SUB3 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH3 CH3SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH4 CH4SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH5 CH5SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH5 CH5SUB2 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH5 CH5SUB3 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS2 BUS2BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS3 BUS3BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS3 BUS3BDM2 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS4 BUS4BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS5 BUS5BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS6 BUS6BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS7 BUS7BDM1 999999.99 999999.99 999999.99
Here's the procedure:
CLEAR STATUS
USE INITIAL RETAIN
SET Control DB_Name CLIENTTEST
USE &DB_Name UPDATE
CHE UPD
SET Control App_Periodicity MONTHLY
SET Control App_Period October 2009 - November 2010
SET DATE MDY
SET &App_Periodicity
SET PERIOD &App_Period
SET Control App_FACT_Table FACT_NETSALES
SET Control DW_Link LNK_CLIENT
SELECT VAR KPI1_ACT
SELECT CHANNEL
SELECT BUSINESS_UNIT
ACROSS TIME DOWN CHANNEL, BUSINESS_UNIT, VAR
ACCESS LSLINK
CONNECT &DW_Link
BEGIN
SELECT
SALES_CHANNEL,
SUB_CHANNEL,
BUS_UNIT,
BDM,
DATE
FROM &App_FACT_Table
END
Peek only 10
LSS CREATE CHANNEL = SALES_CHANNEL
READ
END
The error I get is:
[Microsoft][ODBC Microsoft Access Driver] Too few parameters. Expected 1.
SQLSTATE: 07001
SQL System code: -3010
What's wrong with the procedure?
Thanks in advance for your help!
I figured it out: I was selecting two levels of a single dimension in the SQL statement.
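For anyone hitting the same message: the Access ODBC driver reports "Too few parameters. Expected n" whenever the SQL references n identifiers that are not columns of the table, because it treats each unknown name as a query parameter. A corrected extract would reference only one level per dimension, for example (a sketch only; adjust to the levels your variable actually loads at):

```sql
SELECT
    SUB_CHANNEL,
    BDM,
    DATE
FROM FACT_NETSALES
```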
Similar Messages
-
What are the CRM Transaction data DataSources and Data load Procedure
Hi BI Gurus,
Can anybody provide the names of the CRM transaction data DataSources and the procedure for loading them into BI 7.0?
I already know the master data load procedure from CRM to BI 7.0.
Step-by-step documents would be even more helpful.
Thanks in Advance,
Venkat
Hi Venkat,
To find the transaction DataSources you want, log in to the CRM system and use transaction RSA6. There you can expand all the subtrees by clicking the first line and then the expand button, after which you can easily search for any DataSource you need.
Hope that helps
Rgds
John
-
Customer Data Loading Procedure from Legcy system to TCA
Hi to all,
Does anybody know how to upload customer data from a legacy system to TCA (Trading Community Architecture)?
Thanks
Zulqarnain
Hi,
As you are doing a full load from R/3 to BW, first check in RSA3 whether you are able to extract the data successfully.
You can get the error there as well.
Did you do the setup table fill before the transport or after it? If it was before the transport, delete the setup table and reload the data, as the extract structure would have changed after the transport.
In any case, you should be able to get data in RSA3.
Come back if the issue still persists.
Rgds
SVU -
Troubleshooting 9.3.1 Data Load
I am having a problem with a data load into Essbase 9.3.1 from a flat, pipe-delimited text file, loading via a load rule. I can see an explicit record in the file but the results on that record are not showing up in the database.
* I made a special one-off file with the singular record in question and the data loads properly and is reflected in the database. The record itself seems to parse properly for load.
* I have searched the entire big file (230 MB) for the same member combination, but only found this one record, so it does not appear to be a "last value in wins" issue.
* Most other data (610k+ rows) appears to be loading properly, so the fields are, in general, being parsed out correctly by the load rule. Months of a given item are on separate rows, and other rows of the same item load and are reflected in the database. Other items also load properly into the same months, so it is not a missing-metadata issue.
* The load is 100% successful according to the non-existent error file. Also, loading the file interactively results in the file showing up under "loaded successfully" (no errors).
NOTE:
The file's last column contains item descriptions which may include special characters, including periods and quotes. The load rule moves the description field earlier in the column order, but the file itself has it last.
QUESTION:
Is it possible that a special character (a quote?) in a preceding record is causing the field parsing to include the CR/LF, and therefore the next record, in one record? I keep thinking that if the record is fine alone, but not fine where it sits among other records, it may have to do with preceding or subsequent records.
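The suspected failure mode is easy to reproduce with any quoting-aware parser. A minimal Python sketch (field values are illustrative, not from the actual file):

```python
import csv
import io

# Two pipe-delimited records; the last field of the first record starts
# with a double quote that is never closed. A quoting-aware parser then
# treats the delimiters, the newline, and the entire second record as
# part of that one field, merging the two records.
raw = 'P100|Jan|500|"Widget Deluxe\nP200|Jan|750|Gadget Std\n'

quoted = list(csv.reader(io.StringIO(raw), delimiter='|', quotechar='"'))
plain = list(csv.reader(io.StringIO(raw), delimiter='|', quoting=csv.QUOTE_NONE))

print(len(quoted))  # the two physical lines collapse into one record
print(len(plain))   # with quoting disabled they stay separate
```

If this is what is happening, disabling quoting in the load rule (or scrubbing quotes out of the description column) would stop a stray quote from absorbing the record separator.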
THOUGHTS??
Thanks Glenn. I was so busy looking for explicit members that I neglected to think through implicit members. I guess I assumed implied members don't work if you have a rules file that parses out columns... that a missing member would just error out a record instead of using the last known value. In fact, I thought that (last known value) only worked if you didn't use a load rule.
I would prefer a switch in Essbase that either requires keys in all fields of a load rule or explicitly allows the last known value. -
What procedure to call to reject the Import data load?
Hi All,
When I import data, I have an AFTER INSERT trigger that validates the data being inserted. If the validation fails, I need to reject the data load. Is there a way to do that? I want the trigger to raise or return a message that the validation failed, and to display it in the data load screen as well.
I hope somebody has a solution to this problem.
Thanks in advance.
Cheers,
ROSY
Have you tried using
RAISE_APPLICATION_ERROR(-20123, 'Could not load data because of ...');
in your trigger? (The built-in is RAISE_APPLICATION_ERROR.) I am not sure how HTML DB will trap and display the error in the page.
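A minimal sketch of such a trigger (the table, column, and message are illustrative; the error number just has to be in the -20000..-20999 range Oracle reserves for user-defined errors):

```sql
CREATE OR REPLACE TRIGGER trg_validate_load
AFTER INSERT ON staging_table   -- hypothetical staging table
FOR EACH ROW
BEGIN
  -- hypothetical validation rule
  IF :NEW.amount < 0 THEN
    RAISE_APPLICATION_ERROR(-20123, 'Validation fails: amount must not be negative');
  END IF;
END;
/
```

Raising the error rolls back the offending insert; whether the message surfaces nicely in the data load screen depends on how the page traps the exception.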
Mike
-
CALL_FUNCTION_CONFLICT_TYPE Standard Data loading
Hi,
I am facing a data loading problem using Business content on CPS_DATE infocube (0PS_DAT_MLS datasource).
The R/3 extraction runs without any error, but the problem occurs in the update rules while updating the milestone date. Please find below the log from ST22.
The really weird thing is that the process works perfectly in the development environment but not in the integration one (the patch levels are exactly the same: BW 3.5 Patch #16).
I apologise for the long message below... this is part of the system log.
For information, routine_0004 is a standard one.
Thanks a lot in advance!
Cheers.
CALL_FUNCTION_CONFLICT_TYPE
Except. CX_SY_DYN_CALL_ILLEGAL_TYPE
Symptom: Type conflict when calling a function module.
Cause: Error in the ABAP application program.
The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
This is probably due to an error in the ABAP program.
A function module was called incorrectly.
Error analysis
An exception occurred. This exception is dealt with in more detail below
. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004"
"(FORM)" .
Since the caller of the procedure could not have expected this exception
to occur, the running program was terminated. The reason for the exception is:
The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
The field "RESULT" specified here is a different field type.
How to correct the error.
You may able to find an interim solution to the problem in the SAP note system. If you have access to the note system yourself, use the following search criteria:
"CALL_FUNCTION_CONFLICT_TYPE" "CX_SY_DYN_CALL_ILLEGAL_TYPE"
"GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
"ROUTINE_0004"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)". 2. A suitable printout of the system log To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs, supply the source code.
To do this, select the Editor function "Further Utilities-> Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
The exception must either be prevented, caught within the procedure
"ROUTINE_0004"
"(FORM)", or declared in the procedure's RAISING clause.
To prevent the exception, note the following:
Environment system SAP Release.............. "640"
Operating system......... "SunOS" Release.................. "5.9"
Hardware type............ "sun4u"
Character length......... 8 Bits
Pointer length........... 64 Bits
Work process number...... 2
Short dump setting....... "full"
Database type............ "ORACLE"
Database name............ "BWI"
Database owner........... "SAPTB1"
Character set............ "fr"
SAP kernel............... "640"
Created on............... "Jan 15 2006 21:42:36" Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
Database version......... "OCI_920 "
Patch level.............. "109"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
SAP Release.............. "640"
The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in
"ROUTINE_0004".
The main program was "RSMO1_RSM2 ".
The termination occurred in line 702 of the source code of the (Include)
program "GP420EQ35FHFOCVEBCR6RWPVQBR"
of the source code of program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was not handled locally, not declared in the RAISING clause of the procedure.
The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR ". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR ".
672 'ROUTINE_0003' g_s_is-recno
673 rs_c_false rs_c_false g_s_is-recno
674 changing c_abort.
675 catch cx_foev_error_in_function.
676 perform error_message using 'RSAU' 'E' '510'
677 'ROUTINE_0003' g_s_is-recno
678 rs_c_false rs_c_false g_s_is-recno
679 changing c_abort.
680 endtry.
681 endform.
682 ************************************************************************
683 * routine no.: 0004
684 ************************************************************************
685 form routine_0004
686 changing
687 result type g_s_hashed_cube-FISCPER3
688 returncode like sy-subrc
689 c_t_idocstate type rsarr_t_idocstate
690 c_subrc like sy-subrc
691 c_abort like sy-subrc. "#EC *
692 data:
693 l_t_rsmondata like rsmonview occurs 0 with header line. "#EC *
694
695 try.
696 * init variables
697 move-corresponding g_s_is to comm_structure.
698
699 * fill the internal table "MONITOR", to make monitor entries
700
701 * result value of the routine
>>>> CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
703 EXPORTING
704 I_TIMNM_FROM = '0CALDAY'
705 I_TIMNM_TO = '0FISCPER'
706 I_TIMVL = COMM_STRUCTURE-CALDAY
707 I_FISCVARNT = gd_fiscvarnt
708 IMPORTING
709 E_FISCPER = RESULT.
710 * if the returncode is not equal zero, the result will not be updated
711 RETURNCODE = 0.
712 * if abort is not equal zero, the update process will be canceled
713 ABORT = 0.
714
715 catch cx_sy_conversion_error
716 cx_sy_arithmetic_error.
717 perform error_message using 'RSAU' 'E' '507'
718 'ROUTINE_0004' g_s_is-recno
719 rs_c_false rs_c_false g_s_is-recno
720 changing c_abort.
721 catch cx_foev_error_in_function.
System zones content
Name Val.
SY-SUBRC 0
SY-INDEX 2
SY-TABIX 0
SY-DBCNT 0
SY-FDPOS 65
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY 0400
SY-UCOMM OK
SY-TITLE Moniteur - Atelier d'administration
SY-MSGTY E
SY-MSGID RSAU
SY-MSGNO 583
SY-MSGV1 BATVC 0000000000
SY-MSGV2 0PROJECT
SY-MSGV3
SY-MSGV4
Selected variables
Nº 23 Tpe FORM
Name ROUTINE_0004
GD_FISCVARNT
22
00 RS_C_INFO I
4
9
COMM_STRUCTURE-CALDAY
20060303
33333333
20060303
SYST-REPID GP420EQ35FHFOCVEBCR6RWPVQBR 4533345334444454445355555452222222222222 704205135686F365232627061220000000000000
RESULT
000
333
00
You have an update routine in which you are calling FM 'RS_BCT_TIMCONV_PS_CONV'. Parameter E_FISCPER must have the same type as the variable you use (you can see the data type in the FM definition, transaction SE37). You should do something like the following:
DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
EXPORTING
I_TIMNM_FROM = '0CALDAY'
I_TIMNM_TO = '0FISCPER'
I_TIMVL = COMM_STRUCTURE-CALDAY
I_FISCVARNT = gd_fiscvarnt
IMPORTING
E_FISCPER = var.
result = var.
Assigning points is appreciated.
-
Dear Experts,
If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where I needed to delete master data for an InfoObject (material). I did this through tcode "S14". After that, I tried to load the master data again, but the process broke and the load stopped halfway through the data.
This it is the error:
Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
Message no. RSDMD218
Diagnosis
During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
Procedure
• Check if the values of the master data record with the key specified in this message are updated correctly.
• Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
• Re-schedule the master data load process to avoid such situations in future.
• Read SAP note 668466 to get more information about master data update scheduling.
On the other hand, the SID table for the master data is empty.
Thanks for your help!
Luis
Dear Daya,
Thanks for your help; I applied your suggestion. I sent the following details to OSS:
We are on BI 7.0 (system ID DXX)
While loading master data for InfoObject XXXX00001 (the main characteristic in our system, similar to material) we are facing the following error:
Yellow warning "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
We are loading the Master data from data source ZD_BW_XXXXXXX (from APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001
The Master Data tables (S, P, X) are not updated properly.
The following repairing actions have been taken so far:
1. Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
2. Follow instructions from OSS 632931 (tcode RSRV)
3. Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both check and repair options).
After deleting all data, the previous tests were ok, but once we load new master data, the same problem appears again, and the report RSDMD_CHECKPRG_ALL gives the following error.
"Characteristic XXXX00001: error found during this test."
The RSRV check "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" is shown below:
Characteristic XXXX00001: tables /BIC/PXXXX00001, /BIC/XXXXX00001 are not consistent (351.196 derivation).
It seems that our problem is described in OSS note 1143433 (SP13), even though we are already on SP16.
Could somebody please help us and let us know how to solve the problem?
Thanks to all,
Luis
-
Material Master Data Loading Error
Hi All,
I am loading master data for material and got the following errors.
1. 0material_attr :
Record 1 :0MATERIAL : Data record 1 ('000000000000010220 '): Version '000000000000010220 ' is not valid
Process: If this message appears during a data load, maintain the attribute in the PSA maintenance screens. If this message appears in the master data maintenance screens, leave the transaction and call it again. This allows you to maintain your master data.
2.0material_text : Data record 1 ('000000000000010220E '): Version '000000000000010220 ' is not valid
Process : Same as above.
I have run tcode RSKC in BI with ALL_CAPITAL, ALL_CAPITAL_PLUS_HEX, and one special character string, but I am still getting the same error.
Regards,
Komik Shah
Hi All,
Thanks for the replies. I have solved most of the errors, but some like the following remain:
Diagnosis
Data record 261 & with the key '000000000000132559 &' is invalid in
value '10x8x5 &' of the attribute/characteristic 0SIZE_DIM &.
System Response
The system has recognized that the value mentioned above is invalid, and
has processed this general error message. A subsequent message may give
you more information on the error. This message refers to the same
value, even though it does not state this explicitly.
Procedure
If this message appears during a data load, maintain the attribute in
the PSA maintenance screens. If this message appears in the master data
maintenance screens, leave the transaction and call it again. This allows you to maintain your master data.
Which string should I add in RSKC to remove this error? I have already added ALL_CAPITAL and ALL_CAPITAL_PLUS_HEX.
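RSKC aside, a rough way to see which characters trip the check (the allowed set below is an approximation for illustration, not SAP's exact ALL_CAPITAL definition):

```python
# With ALL_CAPITAL, only uppercase letters, digits and a fixed punctuation
# set pass BW's character check, so lowercase letters are rejected.
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 !\"%&'()*+,-./:;<=>?_")

def invalid_chars(value: str) -> list[str]:
    """Return the distinct characters of value that are not in the allowed set."""
    return sorted({c for c in value if c not in ALLOWED})

print(invalid_chars("10x8x5"))  # ['x']
```

Here the culprit in '10x8x5' is the lowercase 'x', so either the value has to be upper-cased in the transfer rules or the lowercase letters have to be declared as permitted characters.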
I have assigned points.
Regards,
Komik Shah
-
How to tune data loading time in BSO using 14 rules files ?
Hello there,
I'm using Hyperion Essbase Administration Services v11.1.1.2 with the BSO option.
In a nightly process using MaxL I load new data into one Essbase cube.
In this nightly update process, 14 account members are updated by running 14 rules files one after another.
These rules files connect 14 times, via SQL connection, to the same Oracle database and the same table.
I use this procedure because I cannot load 2 or more data fields using one rules file.
It takes a long time to load 14 accounts one after the other.
Now my question: how can I minimise this data loading time?
This is what I found on Oracle Homepage:
What's New
Oracle Essbase V.11.1.1 Release Highlights
Parallel SQL Data Loads- Supports up to 8 rules files via temporary load buffers.
In an Older Thread John said:
As it is version 11 why not use parallel sql loading, you can specify up to 8 load rules to load data in parallel.
Example:
import database AsoSamp.Sample data
connect as TBC identified by 'password'
using multiple rules_file 'rule1','rule2'
to load_buffer_block starting with buffer_id 100
on error write to "error.txt";
But this is for the ASO option only.
Can I use it in my MaxL for BSO as well? Is there a sample?
What else can I do to tune the nightly update time?
Thanks in advance for every tip,
Zeljko
Thanks a lot for your support. I'm just a little confused.
I will use an example to illustrate my problem a bit more clearly.
This is the basic table, in my case a view, which is queried by all 14 rules files:
column1 --- column2 --- column3 --- column4 --- ... ---column n
dim 1 --- dim 2 --- dim 3 --- data1 --- data2 --- data3 --- ... --- data 14
Region -- ID --- Product --- sales --- cogs ---- discounts --- ... --- amount
West --- D1 --- Coffee --- 11001 --- 1,322 --- 10789 --- ... --- 548
West --- D2 --- Tea10 --- 12011 --- 1,325 --- 10548 --- ... --- 589
West --- S1 --- Tea10 --- 14115 --- 1,699 --- 10145 --- ... --- 852
West --- C3 --- Tea10 --- 21053 --- 1,588 --- 10998 --- ... --- 981
East ---- S2 --- Coffee --- 15563 --- 1,458 --- 10991 --- ... --- 876
East ---- D1 --- Tea10 --- 15894 --- 1,664 --- 11615 --- ... --- 156
East ---- D3 --- Coffee --- 19689 --- 1,989 --- 15615 --- ... --- 986
East ---- C1 --- Coffee --- 18897 --- 1,988 --- 11898 --- ... --- 256
East ---- C3 --- Tea10 --- 11699 --- 1,328 --- 12156 --- ... --- 9896
Following 3 out of 14 (load-) rules files to load the data columns into the cube:
Rules File1:
dim 1 --- dim 2 --- dim 3 --- sales --- ignore --- ignore --- ... --- ignore
Rules File2:
dim 1 --- dim 2 --- dim 3 --- ignore --- cogs --- ignore --- ... --- ignore
Rules File14:
dim 1 --- dim 2 --- dim 3 --- ignore --- ignore --- ignore --- ... --- amount
Is the table design above what GlennS described as a "Data" column concept which only allows a single numeric data value?
In this case I can't tag two or more columns as "Data fields". I can only tag one column as the "Data field"; the other data fields I have to tag as "ignore fields during data load". Otherwise, when I validate the rules file, the error "only one field can contain the Data Field attribute" occurs.
Or may I skip this error message and just try to tag all 14 fields as "Data fields" and load the data?
Please advise.
Am I right that the other way is to reconstruct the table/view (and the rules files) like follows to load all of the data in one pass:
dim 0 --- dim 1 --- dim 2 --- dim 3 --- data
Account --- Region -- ID --- Product --- data
sales --- West --- D1 --- Coffee --- 11001
sales --- West --- D2 --- Tea10 --- 12011
sales --- West --- S1 --- Tea10 --- 14115
sales --- West --- C3 --- Tea10 --- 21053
sales --- East ---- S2 --- Coffee --- 15563
sales --- East ---- D1 --- Tea10 --- 15894
sales --- East ---- D3 --- Coffee --- 19689
sales --- East ---- C1 --- Coffee --- 18897
sales --- East ---- C3 --- Tea10 --- 11699
cogs --- West --- D1 --- Coffee --- 1,322
cogs --- West --- D2 --- Tea10 --- 1,325
cogs --- West --- S1 --- Tea10 --- 1,699
cogs --- West --- C3 --- Tea10 --- 1,588
cogs --- East ---- S2 --- Coffee --- 1,458
cogs --- East ---- D1 --- Tea10 --- 1,664
cogs --- East ---- D3 --- Coffee --- 1,989
cogs --- East ---- C1 --- Coffee --- 1,988
cogs --- East ---- C3 --- Tea10 --- 1,328
discounts --- West --- D1 --- Coffee --- 10789
discounts --- West --- D2 --- Tea10 --- 10548
discounts --- West --- S1 --- Tea10 --- 10145
discounts --- West --- C3 --- Tea10 --- 10998
discounts --- East ---- S2 --- Coffee --- 10991
discounts --- East ---- D1 --- Tea10 --- 11615
discounts --- East ---- D3 --- Coffee --- 15615
discounts --- East ---- C1 --- Coffee --- 11898
discounts --- East ---- C3 --- Tea10 --- 12156
amount --- West --- D1 --- Coffee --- 548
amount --- West --- D2 --- Tea10 --- 589
amount --- West --- S1 --- Tea10 --- 852
amount --- West --- C3 --- Tea10 --- 981
amount --- East ---- S2 --- Coffee --- 876
amount --- East ---- D1 --- Tea10 --- 156
amount --- East ---- D3 --- Coffee --- 986
amount --- East ---- C1 --- Coffee --- 256
amount --- East ---- C3 --- Tea10 --- 9896
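The restructuring above can be sketched mechanically: each wide row with one column per measure becomes one narrow row per measure, tagged with the account member, so a single rules file can load every measure in one pass (only 4 of the 14 measures are shown; names come from the example table):

```python
# Measures that appear as separate columns in the wide table/view.
MEASURES = ["sales", "cogs", "discounts", "amount"]

def unpivot(rows):
    """Yield (account, region, id, product, value) tuples from wide rows."""
    for row in rows:
        for measure in MEASURES:
            yield (measure, row["region"], row["id"], row["product"], row[measure])

wide = [
    {"region": "West", "id": "D1", "product": "Coffee",
     "sales": 11001, "cogs": 1322, "discounts": 10789, "amount": 548},
]
narrow = list(unpivot(wide))
print(len(narrow))  # one narrow row per measure
```

In practice the same unpivot is usually done once in the database view (e.g. with UNION ALL or an UNPIVOT clause) rather than in client code.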
And the third way is to adjust the essbase.cfg parameters DLTHREADSPREPARE and DLTHREADSWRITE (and DLSINGLETHREADPERSTAGE).
I just want to be sure that I understand your suggestions.
Many thanks for awesome help,
Zeljko
-
4.2.3/.4 Data load wizard - slow when loading large files
Hi,
I am using the data load wizard to load CSV files into an existing table. It works fine with small files of up to a few thousand rows, but when loading 20k rows or more the process becomes very slow. The table has a single numeric column as its primary key.
The primary key is declared under "Shared Components" -> Logic -> "Data Load Tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
While loading data, these configuration leads to the execution of the following query for each row:
select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
which can be found in the v$sql view while loading.
It makes the loading process slow: because of the UPPER function, no index can be used.
It seems that the setting of "case sensitive" is not evaluated.
Dropping the numeric index for the primary key and using a function based index does not help.
Explain plan shows an implicit "to_char" conversion:
UPPER(TO_CHAR(PK)) = UPPER(:UK_1)
This is missing in the query but maybe it is necessary for the function based index to work.
Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
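One possible workaround (an untested sketch using the names from this post): index the exact expression the generated query evaluates, including the implicit TO_CHAR shown in the explain plan, so a function-based index can actually match:

```sql
CREATE INDEX pd_if_csv_row_upk_fbi
  ON "KLAUS"."PD_IF_CSV_ROW" (UPPER(TO_CHAR("PK")));
```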
Best regards
Klaus
Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
If all of the CSV files are identical:
use the Excel2Collection plugin (Process Type Plugin - EXCEL2COLLECTIONS),
create a VIEW on the collection (makes it easier elsewhere),
create a procedure (in a package) to bulk process it.
The most important thing is to have, somewhere in the package (i.e. in your code that is not part of APEX), information that clearly states which columns in the collection map to which columns in the table and view, and to the variables (APEX_APPLICATION.g_fxx()) used for tabular forms.
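The bulk-processing step could look roughly like this; the collection name and target columns are illustrative, and APEX exposes collection members through the APEX_COLLECTIONS view with generic C001..C050 columns:

```sql
-- Set-based load from the staged collection into the real table.
INSERT INTO target_table (pk, descr, amount)
SELECT TO_NUMBER(c001), c002, TO_NUMBER(c003)
FROM apex_collections
WHERE collection_name = 'CSV_ROWS';
```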
MK
-
Error while starting data loading on InfoPackage
Hi everybody,
I'm new to SAP BW and I'm working through the "Step-By-Step: From Data Model to the BI Application in the Web" document from SAP.
I'm having a problem at Chapter 9, item c (Starting Data Load Immediately).
If anyone can help me:
Thanks,
Thiago
Below is a copy of the error from my SAP GUI.
<><><><><><><><><><<><><><><><><><><><><><><><><><><><><><><><>
Runtime Errors MESSAGE_TYPE_X
Date and Time 19.01.2009 14:41:22
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact you SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Batch - Manager for BW Processes ***********
Long text of error message:
Technical information about the message:
Message class....... "RSBATCH"
Number.............. 000
Variable 1.......... " "
Variable 2.......... " "
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occures in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSBATCH" or "LRSBATCHU01"
"RSBATCH_START_PROCESS"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a problem of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "sun"
Network address...... "174.16.5.194"
Operating system..... "Windows NT"
Release.............. "5.1"
Hardware type........ "2x Intel 801586"
Character length.... 8 Bits
Pointer length....... 32 Bits
Work process number.. 2
Shortdump setting.... "full"
Database server... "localhost"
Database type..... "ADABAS D"
Database name..... "NSP"
Database user ID.. "SAPNSP"
Terminal.......... "sun"
Char.set.... "English_United State"
SAP kernel....... 701
created (date)... "Jul 16 2008 23:09:09"
create on........ "NT 5.2 3790 Service Pack 1 x86 MS VC++ 14.00"
Database version. "SQLDBC 7.6.4.014 CL 188347 "
Patch level. 7
Patch text.. " "
Database............. "MaxDB 7.6, MaxDB 7.7"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0"
Memory consumption
Roll.... 8112
EM...... 11498256
Heap.... 0
Page.... 65536
MM Used. 6229800
MM Free. 1085264
User and Transaction
Client.............. 001
User................ "THIAGO"
Language key........ "E"
Transaction......... "RSA1 "
Transactions ID..... "CD47E6DDD55EF199B4E6001B782D539C"
Program............. "SAPLRSBATCH"
Screen.............. "SAPLRSS1 2500"
Screen line......... 7
Information on where terminated
Termination occurred in the ABAP program "SAPLRSBATCH" - in
"RSBATCH_START_PROCESS".
The main program was "RSAWBN_START ".
In the source code you have the termination point in line 340
of the (Include) program "LRSBATCHU01".
Source Code Extract
Line SourceCde
310 endif.
311 l_lnr_callstack = l_lnr_callstack - 1.
312 endloop. " at l_t_callstack
313 endif.
314
315 *---- Eintrag für RSBATCHHEADER -
316 l_s_rsbatchheader-batch_id = i_batch_id.
317 call function 'GET_JOB_RUNTIME_INFO'
318 importing
319 jobcount = l_s_rsbatchheader-jobcount
320 jobname = l_s_rsbatchheader-jobname
321 exceptions
322 no_runtime_info = 1
323 others = 2.
324 call function 'TH_SERVER_LIST'
325 tables
326 list = l_t_server
327 exceptions
328 no_server_list = 1
329 others = 2.
330 data: l_myname type msname2.
331 call 'C_SAPGPARAM' id 'NAME' field 'rdisp/myname'
332 id 'VALUE' field l_myname.
333 read table l_t_server with key
334 name = l_myname.
335 if sy-subrc = 0.
336 l_s_rsbatchheader-host = l_t_server-host.
337 l_s_rsbatchheader-server = l_myname.
338 refresh l_t_server.
339 else.
>>>>> message x000.
341 endif.
342 data: l_wp_index type i.
343 call function 'TH_GET_OWN_WP_NO'
344 importing
345 subrc = l_subrc
346 wp_index = l_wp_index
347 wp_pid = l_s_rsbatchheader-wp_pid.
348 if l_subrc <> 0.
349 message x000.
350 endif.
351 l_s_rsbatchheader-wp_no = l_wp_index.
352 l_s_rsbatchheader-ts_start = l_tstamps.
353 l_s_rsbatchheader-uname = sy-uname.
354 l_s_rsbatchheader-module_name = l_module_name.
355 l_s_rsbatchheader-module_type = l_module_type.
356 l_s_rsbatchheader-pc_variant = i_pc_variant.
357 l_s_rsbatchheader-pc_instance = i_pc_instance.
358 l_s_rsbatchheader-pc_logid = i_pc_logid.
359 l_s_rsbatchheader-pc_callback = i_pc_callback_at_end.
Hi,
I am also getting an issue related to this; kindly see the short dump description below.
Short text
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Note down which actions and inputs caused the error.
To process the problem further, contact your SAP system
administrator.
Using Transaction ST22 for ABAP Dump Analysis, you can look
at and manage termination messages, and you can also
keep them for a long time.
Error analysis
Short text of error message:
Variant RSPROCESS0000000000705 does not exist
Long text of error message:
Diagnosis
You selected variant 00000000705 for program RSPROCESS.
This variant does not exist.
System Response
Procedure
Correct the entry.
Technical information about the message:
Message class....... "DB"
Number.............. 612
Variable 1.......... "&0000000000705"
Variable 2.......... "RSPROCESS"
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
If the error occurs in a non-modified SAP program, you may be able to
find an interim solution in an SAP Note.
If you have access to SAP Notes, carry out a search with the following
keywords:
"MESSAGE_TYPE_X" " "
"SAPLRSPC_BACKEND" or "LRSPC_BACKENDU05"
"RSPC_PROCESS_FINISH"
If you cannot solve the problem yourself and want to send an error
notification to SAP, include the following information:
1. The description of the current problem (short dump)
To save the description, choose "System->List->Save->Local File
(Unconverted)".
2. Corresponding system log
Display the system log by calling transaction SM21.
Restrict the time interval to 10 minutes before and five minutes
after the short dump. Then choose "System->List->Save->Local File
(Unconverted)".
3. If the problem occurs in a program of your own or a modified SAP
program: The source code of the program
In the editor, choose "Utilities->More
Utilities->Upload/Download->Download".
4. Details about the conditions under which the error occurred or which
actions and input led to the error.
System environment
SAP-Release 701
Application server... "CMCBIPRD"
Network address...... "192.168.50.12"
Operating system..... "Windows NT"
Release.............. "6.1"
Hardware type........ "16x AMD64 Level"
Character length.... 16 Bits
Pointer length....... 64 Bits
Work process number.. 0
Shortdump setting.... "full"
Database server... "CMCBIPRD"
Database type..... "MSSQL"
Database name..... "BIP"
Database user ID.. "bip"
Terminal.......... "CMCBIPRD"
Char.set.... "C"
SAP kernel....... 701
created (date)... "Sep 9 2012 23:43:54"
create on........ "NT 5.2 3790 Service Pack 2 x86 MS VC++ 14.00"
Database version. "SQL_Server_8.00 "
Patch level. 196
Patch text.. " "
Database............. "MSSQL 9.00.2047 or higher"
SAP database version. 701
Operating system..... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2, Windows
NT 6.0, Windows NT 6.1, Windows NT 6.2"
Memory consumption
Roll.... 16192
EM...... 4189840
Heap.... 0
Page.... 16384
MM Used. 2143680
MM Free. 2043536
User and Transaction
Client.............. 001
User................ "BWREMOTE"
Language Key........ "E"
Transaction......... " "
Transactions ID..... "9C109BE2C9FBF18BBD4BE61F13CE9693"
Program............. "SAPLRSPC_BACKEND"
Screen.............. "SAPMSSY1 3004"
Screen Line......... 2
Information on caller of Remote Function Call (RFC):
System.............. "BIP"
Database Release.... 701
Kernel Release...... 701
Connection Type..... 3 (2=R/2, 3=ABAP System, E=Ext., R=Reg. Ext.)
Call Type........... "asynchron without reply and transactional (emode 0, imode
0)"
Inbound TID.........." "
Inbound Queue Name..." "
Outbound TID........." "
Outbound Queue Name.." "
Information on where terminated
Termination occurred in the ABAP program "SAPLRSPC_BACKEND" - in
"RSPC_PROCESS_FINISH".
The main program was "SAPMSSY1 ".
In the source code you have the termination point in line 75
of the (Include) program "LRSPC_BACKENDU05".
Source Code Extract
Line SourceCde
45 l_t_info TYPE rs_t_rscedst,
46 l_s_info TYPE rscedst,
47 l_s_mon TYPE rsmonpc,
48 l_synchronous TYPE rs_bool,
49 l_sync_debug TYPE rs_bool,
50 l_eventp TYPE btcevtparm,
51 l_eventno TYPE rspc_eventno,
52 l_t_recipients TYPE rsra_t_recipient,
53 l_s_recipients TYPE rsra_s_recipient,
54 l_sms TYPE rs_bool,
55 l_t_text TYPE rspc_t_text.
56
57 IF i_dump_at_error = rs_c_true.
58 * ==== Dump at error? => Recursive Call catching errors ====
59 CALL FUNCTION 'RSPC_PROCESS_FINISH'
60 EXPORTING
61 i_logid = i_logid
62 i_chain = i_chain
63 i_type = i_type
64 i_variant = i_variant
65 i_instance = i_instance
66 i_state = i_state
67 i_eventno = i_eventno
68 i_hold = i_hold
69 i_job_count = i_job_count
70 i_batchdate = i_batchdate
71 i_batchtime = i_batchtime
72 EXCEPTIONS
73 error_message = 1.
74 IF sy-subrc <> 0.
>>> MESSAGE ID sy-msgid TYPE 'X' NUMBER sy-msgno
76 WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
77 ELSE.
78 EXIT.
79 ENDIF.
80 ENDIF.
81 * ==== Cleanup ====
82 COMMIT WORK.
83 * ==== Get Chain ====
84 IF i_chain IS INITIAL.
85 SELECT SINGLE chain_id FROM rspclogchain INTO l_chain
86 WHERE log_id = i_logid.
87 ELSE.
88 l_chain = i_chain.
89 ENDIF.
90 * ==== Lock ====
91 * ---- Lock process ----
92 DO.
93 CALL FUNCTION 'ENQUEUE_ERSPCPROCESS'
94 EXPORTING
If we do this: go to table RSSDLINIT, enter the name of the DataSource in OLTPSOURCE and the name of the source system in LOGSYS, and delete the entry for that InfoPackage. Will the process chain then run successfully and the short dump no longer occur? Kindly give a detailed explanation of this RSSDLINIT table.
Regards,
poluru -
Comparison of Data Loading Techniques - SQL*Loader & External Tables
Below are two techniques for loading data from flat files into Oracle tables.
1) SQL*Loader:
a. Place the flat file (.txt or .csv) in the desired location.
b. Create a control file, for example:
LOAD DATA
INFILE 'mytextfile.txt' -- file containing the table data; specify the path correctly, it could be a .csv as well
APPEND -- or TRUNCATE, based on the requirement
INTO TABLE oracle_table
FIELDS TERMINATED BY ',' -- or whatever delimiter the input file uses
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
c. Now run Oracle's sqlldr utility from the command prompt:
sqlldr username/password control=filename.ctl
d. The data can be verified by selecting it from the table:
Select * from oracle_table;
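As an aside, the field parsing the control file above describes (comma-delimited, optionally enclosed in double quotes) is the same job Python's csv module does, which is handy for sanity-checking a file before loading it. A minimal sketch; the sample data here is hypothetical:

```python
import csv
import io

# Sample input in the shape the control file expects:
# comma-delimited, optionally enclosed by double quotes.
raw = '1,"one, with comma",first\n2,two,second\n'

# quotechar plays the role of OPTIONALLY ENCLOSED BY '"'.
rows = list(csv.reader(io.StringIO(raw), delimiter=',', quotechar='"'))
for field1, field2, field3 in rows:
    print(field1, field2, field3)
```

Note how the enclosed comma in the first record survives as part of the field rather than splitting it.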
2) External Table:
a. Place the flat file (.txt or .csv) in the desired location, e.g.:
abc.csv
1,one,first
2,two,second
3,three,third
4,four,fourth
b. Create a directory
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c. After granting appropriate permissions to the user, we can create external table like below.
create table ext_table_csv (
i Number,
n Varchar2(20),
m Varchar2(20)
)
organization external (
type oracle_loader
default directory ext_dir
access parameters (
records delimited by newline
fields terminated by ','
missing field values are null
)
location ('abc.csv')
)
reject limit unlimited;
d. Verify data by selecting it from the external table now
select * from ext_table_csv;
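For what it's worth, the load-and-verify flow of steps a-d can be mimicked end to end with Python's standard library, with sqlite3 standing in for Oracle and the abc.csv sample above as input. This is only an illustrative sketch, not Oracle syntax:

```python
import csv
import io
import sqlite3

# Contents of the abc.csv sample above.
abc_csv = "1,one,first\n2,two,second\n3,three,third\n4,four,fourth\n"

conn = sqlite3.connect(":memory:")
conn.execute("create table ext_table_csv (i integer, n text, m text)")

# Load each comma-delimited record, much as the oracle_loader
# access driver would when the external table is queried.
for i, n, m in csv.reader(io.StringIO(abc_csv)):
    conn.execute("insert into ext_table_csv values (?, ?, ?)", (int(i), n, m))

# Step d: verify by selecting the data back.
for row in conn.execute("select * from ext_table_csv order by i"):
    print(row)
```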
The external tables feature is a complement to the existing SQL*Loader functionality.
It allows you to:
• Access data in external sources as if it were in a table in the database.
• Merge a flat file with an existing table in one statement.
• Sort a flat file on its way into a table you want compressed nicely.
• Do a parallel direct path load without splitting up the input file.
Shortcomings:
• External tables are read-only.
• No data manipulation language (DML) operations or index creation is allowed on an external table.
Using SQL*Loader you can:
• Load the data from a stored procedure or trigger (insert is not sqlldr).
• Do multi-table inserts.
• Flow the data through a pipelined PL/SQL function for cleansing/transformation.
Comparison for data loading
To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
So, when you create the external table with that degree, the database divides the file to be read among four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize the same load using SQL*Loader, you would have to manually divide your input file into multiple smaller files.
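The manual division that SQL*Loader forces on you can be sketched in a few lines; the chunk count and row contents here are purely illustrative:

```python
# Divide input lines into N roughly equal chunks so that N separate
# SQL*Loader sessions could load them in parallel. An external table
# with a parallel degree of 4 gets the same effect for free.
def split_for_parallel_load(lines, n_chunks=4):
    chunks = [[] for _ in range(n_chunks)]
    for idx, line in enumerate(lines):
        chunks[idx % n_chunks].append(line)  # round-robin assignment
    return chunks

lines = [f"{i},row{i}" for i in range(10)]
chunks = split_for_parallel_load(lines, 4)
print([len(c) for c in chunks])  # -> [3, 3, 2, 2]
```

Each chunk would then be written to its own file and fed to its own sqlldr invocation.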
Conclusion:
SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle tables using DB links. Please let me know your views on this.
-
Short Dump Error While Data load
Hi Experts,
Data load to an ODS in production server has been failed because of Short dump error. The error message shows " OBJECTS_OBJREF_NOT_ASSIGNED ".
Request you to please help me out with a solution ..
Regards,
Vijay
Hi Vijay,
Follow the steps below (they may help you):
Go to the Monitor screen > Status tab > using the wizard or the menu path Environment -> Short dump -> In the warehouse.
Select the error and double-click it.
Analyse the error from the message.
1.-->Go to Monitor
-->Transactional RFC
-->In the Warehouse
-->Execute
-->EDIT
-->Execute LUW
Refresh the Transactional RFC screen and go to the Data Target(s) which have failed and see the status of the Request-->Now it should be green.
2. In some cases the above process will not work (where the bad requests still exist after the above procedure).
In that case we need to go to the data targets, delete the bad requests and do the update again.
Regards,
BH -
Data Load to BI (7.0 SP 9) from R3(ECC 6.0 SP-Basis 9)
Dear All,
We have a new instance of a development BW system, version 7.0, with R/3 upgraded to ECC 6.0. We connected the source system. When we extract the data through a DTP, the data load is successful but with 0 records.
This is the case with all the extractors.
The database is Oracle 10.2.
Observations for this:
0) Source system connection check OK.
1) When I test the same extractor in RSA3, I can fetch some data there.
2) I could transfer the global settings.
3) I could not see any IDocs generated in BW or received in R/3.
4) No background job is generated in R/3 in SM37.
5) I could extract the data from another source system (SEM) instance based on 3.5 technology.
As progress on this issue, I could load the data successfully with the 3.x methodology (using update rules), but not with the BI 7.0 methodology (using transformations).
As a standard, the client requires the 7.0 methodology, so I still need to find a solution for this.
I have no clue what is going wrong. Please help me solve this issue.
Thanks in Advance,
PV
Message was edited by:
USERPV
I am not sure if you have followed all the necessary steps to do a data load to the InfoCube. I also wish I had more information about your system and the error message you are getting. A data load can fail for a variety of reasons, depending on the BW version, system settings and the procedure you followed. Please use the data load monitor (transaction RSMO), identify the error message and take the necessary action.
If this is useful, reward points are appreciated. -
How we can automate the data loading from BI-BPC
Dear Guru's
Thanks for watching this thread. My question is:
How can we load data from BI 7.0 to BPC? My environment is SAP BI 7.0, and BPC is the 7.5 MS version on SQL Server 2008.
How can we automate the data loading from BI to BPC MS version? Is a manual flat file load mandatory in the MS version?
Thanks in Advance,
Srinivasan.
Here are some options:
1) Use standard packages and schedule them:
a) Open Hub the master data into a flat file / the BPC application server, and schedule the package "Import Master Data from a Data File" and other relevant packages.
2) Use custom tasks in custom packages (SSIS):
Procedure
From the Microsoft SQL Server Business Intelligence Developer Studio, open the Microsoft SSIS folder.
Create a new package, or select an existing package to modify.
Choose Task -> Register Custom Task.
In the Task Location field, browse for the target .dll file.
Note: by default, the .dll files are stored in BPC/Websrvr/bin.
Enter a task description, select an appropriate icon, then click OK.
Drag the icon to the designer window. Enter data as required.
Save the package.