CALL_FUNCTION_CONFLICT_TYPE Standard Data loading
Hi,
I am facing a data loading problem using Business content on CPS_DATE infocube (0PS_DAT_MLS datasource).
The R/3 extraction runs without any error, but the problem occurs in the update rules while updating the milestone date. Please find below the log from ST22.
The really weird thing is that the process works perfectly in the development environment but not in the integration one (the patch levels are exactly the same: BW 3.5 Patch #16).
I apologise for the long message below... this is a part of the system log.
For information, ROUTINE_0004 is a standard one.
Thanks a lot in advance!
Cheers.
CALL_FUNCTION_CONFLICT_TYPE
Except. CX_SY_DYN_CALL_ILLEGAL_TYPE
Symptoms: Type conflict when calling a function module
Causes: Error in ABAP application program.
The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
This is probably due to an error in the ABAP program.
A function module was called incorrectly.
Error analysis
An exception occurred, which is dealt with in more detail below. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004" "(FORM)".
Since the caller of the procedure could not have expected this exception
to occur, the running program was terminated. The reason for the exception is:
The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
The field "RESULT" specified here is a different field type.
How to correct the error.
You may be able to find an interim solution to the problem in the SAP Notes system. If you have access to the Notes system yourself, use the following search criteria:
"CALL_FUNCTION_CONFLICT_TYPE" "CX_SY_DYN_CALL_ILLEGAL_TYPE"
"GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
"ROUTINE_0004"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->Save->Local File (unconverted)".
2. A suitable printout of the system log. To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select the function "System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs, supply the source code.
To do this, select the Editor function "Further Utilities-> Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
The exception must either be prevented, caught within the procedure
"ROUTINE_0004"
"(FORM)", or declared in the procedure's RAISING clause.
To prevent the exception, note the following:
Environment system SAP Release.............. "640"
Operating system......... "SunOS"
Release.................. "5.9"
Hardware type............ "sun4u"
Character length......... 8 Bits
Pointer length........... 64 Bits
Work process number...... 2
Short dump setting....... "full"
Database type............ "ORACLE"
Database name............ "BWI"
Database owner........... "SAPTB1"
Character set............ "fr"
SAP kernel............... "640"
Created on............... "Jan 15 2006 21:42:36"
Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
Database version......... "OCI_920 "
Patch level.............. "109"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
SAP Release.............. "640"
The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in
"ROUTINE_0004".
The main program was "RSMO1_RSM2 ".
The termination occurred in line 702 of the source code of the (Include)
program "GP420EQ35FHFOCVEBCR6RWPVQBR"
of the source code of program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was neither handled locally nor declared in the RAISING clause of the procedure.
The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR ". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR ".
672 'ROUTINE_0003' g_s_is-recno
673 rs_c_false rs_c_false g_s_is-recno
674 changing c_abort.
675 catch cx_foev_error_in_function.
676 perform error_message using 'RSAU' 'E' '510'
677 'ROUTINE_0003' g_s_is-recno
678 rs_c_false rs_c_false g_s_is-recno
679 changing c_abort.
680 endtry.
681 endform.
682 ************************************************************************
683 * routine no.: 0004
684 ************************************************************************
685 form routine_0004
686 changing
687 result type g_s_hashed_cube-FISCPER3
688 returncode like sy-subrc
689 c_t_idocstate type rsarr_t_idocstate
690 c_subrc like sy-subrc
691 c_abort like sy-subrc. "#EC *
692 data:
693 l_t_rsmondata like rsmonview occurs 0 with header line. "#EC *
694
695 try.
696 * init variables
697 move-corresponding g_s_is to comm_structure.
698
699 * fill the internal table "MONITOR", to make monitor entries
700
701 * result value of the routine
>>>> CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
703 EXPORTING
704 I_TIMNM_FROM = '0CALDAY'
705 I_TIMNM_TO = '0FISCPER'
706 I_TIMVL = COMM_STRUCTURE-CALDAY
707 I_FISCVARNT = gd_fiscvarnt
708 IMPORTING
709 E_FISCPER = RESULT.
710 * if the returncode is not equal zero, the result will not be updated
711 RETURNCODE = 0.
712 * if abort is not equal zero, the update process will be canceled
713 ABORT = 0.
714
715 catch cx_sy_conversion_error
716 cx_sy_arithmetic_error.
717 perform error_message using 'RSAU' 'E' '507'
718 'ROUTINE_0004' g_s_is-recno
719 rs_c_false rs_c_false g_s_is-recno
720 changing c_abort.
721 catch cx_foev_error_in_function.
System field contents
Name Val.
SY-SUBRC 0
SY-INDEX 2
SY-TABIX 0
SY-DBCNT 0
SY-FDPOS 65
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY 0400
SY-UCOMM OK
SY-TITLE Moniteur - Atelier d'administration
SY-MSGTY E
SY-MSGID RSAU
SY-MSGNO 583
SY-MSGV1 BATVC 0000000000
SY-MSGV2 0PROJECT
SY-MSGV3
SY-MSGV4
Selected variables
Nº 23 Tpe FORM
Name ROUTINE_0004
GD_FISCVARNT
22
00 RS_C_INFO I
4
9
COMM_STRUCTURE-CALDAY
20060303
33333333
20060303
SYST-REPID GP420EQ35FHFOCVEBCR6RWPVQBR 4533345334444454445355555452222222222222 704205135686F365232627061220000000000000
RESULT
000
333
00
You have an update routine in which you are calling FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER must have the same type as the variable you pass to it (you can see the data type in the FM definition, transaction SE37). You should do something like the following:
DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
EXPORTING
I_TIMNM_FROM = '0CALDAY'
I_TIMNM_TO = '0FISCPER'
I_TIMVL = COMM_STRUCTURE-CALDAY
I_FISCVARNT = gd_fiscvarnt
IMPORTING
E_FISCPER = var.
result = var.
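Putting the suggestion together, the corrected body of the call inside ROUTINE_0004 might look like the sketch below. This is an assumption-laden sketch, not verified standard code: E_FISCPER is assumed to be typed like 0FISCPER (NUMC 7, format YYYYPPP — verify the exact type in SE37), and since RESULT in this routine is typed FISCPER3 (posting period, NUMC 3), only the period part of the returned value is assigned.

```abap
* Sketch only - verify the exact type of E_FISCPER in SE37 before use.
* l_fiscper is assumed to be NUMC 7 (like 0FISCPER, format YYYYPPP).
DATA: l_fiscper(7) TYPE n.

CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
  EXPORTING
    i_timnm_from = '0CALDAY'
    i_timnm_to   = '0FISCPER'
    i_timvl      = comm_structure-calday
    i_fiscvarnt  = gd_fiscvarnt
  IMPORTING
    e_fiscper    = l_fiscper.    " correctly typed variable instead of RESULT

* RESULT is NUMC 3 (FISCPER3); take the period part PPP of YYYYPPP.
result = l_fiscper+4(3).
```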
Similar Messages
-
Standard Data Load SQL is blank
Good day guys,
I'm trying to perform a data load in IES. I noticed that the "Standard Data Load SQL" field in the SQL Override window is blank. May I know what causes this and how I can fix it?
Thanks in advance.
Edited by: 26FEB1986 on Dec 14, 2010 9:15 PM
-
Missing Standard Dimension Column for data load (MSSQL to Essbase Data)
This is a similar error to one posted by Sravan -- however I'm sure I have all dimensions covered -- going from MS SQL to SunOpsys staging to Essbase. It is telling me a standard dimension is missing, however I have all of them accounted for:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last): File "<string>", line 23, in ? com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
I'm using multiple time period inputs -- BegBalance,Jul,Aug,Sep,Oct,Nov,Dec,Jan,Feb,Mar,Apr,May,Jun (target has all of those in place of Time Periods)
I'm using hard coded input mapping for Metric, Scenario, Version, HSP_Rates and Currencies. -> 'Amount', 'Actual', 'Final', 'HSP_InputValue','Local' respectively.
The only thing I can think of is that since I'm loading to each of the months in the Time Periods dimension (the reversal was set up to accommodate that)... and now it's somehow still looking for that? Time Periods as a dimension does not show up in the reversal -- only the individual months named above.
Any ideas on this one??
John -- I extracted the data to a file and created a data load rule in Essbase to load the data. All dimensions are present and accounted for (five header items, as above) and everything loads fine.
So not sure what else is wrong -- still getting the missing dimension error.
Any other thoughts?? Here's the entire error message. Thanks for all your help on this.
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
at org.python.pycode._pyx8.f$0(<string>:23)
at org.python.pycode._pyx8.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java)
at org.python.core.PyCode.call(PyCode.java)
at org.python.core.Py.runCode(Py.java)
at org.python.core.Py.exec(Py.java)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.k(e.java)
at com.sunopsis.dwg.cmd.g.A(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.validateColumns(Unknown Source)
... 32 more
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Missing standard dimension column for data load
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.k.a(k.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.k(e.java)
at com.sunopsis.dwg.cmd.g.A(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
Suggest good strategy for data load through standard datasources
Hi BW Gurus,
We currently are using standard purchasing-related datasources. We foresee new reports coming in later based on the standard datasources.
Can you please suggest a good general strategy for bringing in R/3 data? Our concerns are with the data loads [initializations etc.], as some of the standard datasources are already in production.
Please advise.
Hi
Go through these weblogs from Roberto Negro - they may help you.
/people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
/people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
/people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
Regards,
Rajesh. -
Data load problem - BW and Source System on the same AS
Hi experts,
I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
BW is working on client 001 while SRM is on client 100 and I want to load data from the SRM into BW.
I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added a SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the data sources from this source system, and everything worked fine.
Now I want to load data from SRM (client 100) into BW (client 001) using standard data sources and extractors. To do this, I've created an InfoPackage on one standard data source (with data, checked through RSA3 on the client 100 source system). I've started the data load process, but the monitor says that no IDocs arrived from the source system and keeps the status yellow forever.
Additional information:
<u><b>BW Monitor Status:</b></u>
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System Response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
<b><u>BW Monitor Details:</u></b>
0 from 0 records
but there are 2 records in RSA3 for this data source
Overall status: Missing messages or warnings
- Requests (messages): Everything OK
o Data request arranged
o Confirmed with: OK
- Extraction (messages): Missing messages
o Missing message: Request received
o Missing message: Number of sent records
o Missing message: Selection completed
- Transfer (IDocs and TRFC): Missing messages or warnings
o Request IDoc: sent, not arrived ; Data passed to port OK
- Processing (data packet): No data
<b><u>Transactional RFC (sm58):</u></b>
Function Module: IDOC_INBOUND_ASYNCHRONOUS
Target System: SRMCLNT100
Date Time: 08.03.2006 14:55:56
Status text: No service for system SAPSRM, client 001 in Integration Directory
Transaction ID: C8C415C718DC440F1AAC064E
Host: srm
Program: SAPMSSY1
Client: 001
Rpts: 0000
<b><u>System Log (sm21):</u></b>
14:55:56 DIA 000 100 BWREMOTE D0 1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
Documentation for system log message D0 1 :
The transaction has been terminated. This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction. The actual reason for the termination is indicated by the T100 message and the parameters.
Additional documentation for message IDOC_ADAPTER 601 No service for system &1, client &2 in Integration Directory No documentation exists for message ID601
<b><u>RFC Destinations (sm59):</u></b>
Both RFC destinations look fine, with connection and authorization tests successful.
<b><u>RFC Users (su01):</u></b>
BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
Someone could help ?
Thanks,
Guilherme
Guilherme,
I didn't see any reason why it's not bringing the data. Are you doing a full extraction or a delta? If a delta extraction, please check whether the extractor is delta-enabled. Sometimes this may cause problems.
Also check this weblog on data Load errors basic checks. it may help
/people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
Thanks
Sat -
What is a Standard Data Source for table T006A?
Dear Experts,
As a part of one requirement I need to pull the data from Table T006A which is available in BW as well.
I did many searches on the forums before posting this, but didn't get a clear idea.
I am aware of how units are updated within BW, but I need to pull the texts for all units within BW.
Do we have any standard Data Source which pulls the data from T006A table?
Thanks in Advance,
Nilesh
Dear Raf Boudewijns,
Requirement is to load the Unit texts(available in table T006A) into one custom InfoObject.
I know this table is already pulled within BW and frequently updated, but I didn't find any standard DataSource which fetches the data from the T006A table.
I can create a new generic DataSource within BW itself based on table T006A, but I would like to use a standard DataSource if one is available; otherwise I will have to create a new generic DataSource.
Thanks,
Nilesh -
Inventory snapshot scenario - Data load frequency?
Hi,
I have gone through "How to" document for inventory snapshot extraction.
Here are few questions for which I could not find answers in the document -
1. Process 1 loads initial stock using BX data source into ODS.
2. Then Process 2 - I assume there are two steps in this -
a) Init using BF/UM Data source - If this is done, historical movements get added to initial stock in this ODS, which yields wrong results. So, is mapping to ODS required while doing init from BF/UM data sources? Init(Process 3) adds stock into snapshot cube for all the months from date of movement to current(system) date.
b) Delta using BF/UM Data source - Adding delta to snapshot ODS makes sense. Is it also required to load this parallelly to snapshot cube(as mentioned in process 3)?
No intention to confuse anybody... I just wanted to know which of the following data load scenarios yields perfectly fine results? The document is not self-explanatory on this topic -
I assume that, 0IC_C03 is being updated in parallel.
1. Initial stock load using BX datasource into ODS. Then Initialize BF/UM data source into snapshot cube ONLY(Does this actually store historical snapshot?). Then do delta from BF/UM to ODS ONLY. Then follow rest of the steps.
2. Initial stock load using BX datasource into ODS. Then Initialize BF/UM data source into snapshot cube and ODS. Then do delta from BF/UM to snapshot cube AND ODS. Then follow rest of the steps.
3. Initial stock load using BX datasource into ODS. Initialize BF/UM data source WITHOUT DATA TRANSFER. Then start delta into snapshot cube AND ODS.
Any help on this will be greatly appreciated.
Thanks and Regards,
Anup
Hi,
Ensure that the 3 key figures are included in the communication structure (of course they will get included, as it is a standard datasource). When you create the update rules, these 3 key figures will be set to 'no update', and then you have to populate the 3 key figures using a routine.
Hope this helps.
Thanks, Ramoji. -
Data Load to BI (7.0 SP 9) from R3(ECC 6.0 SP-Basis 9)
Dear All,
We have a new instance of a development BW system with version 7.0, and R/3 upgraded to ECC 6.0. We connected the source system. When we extract the data through a DTP, the data load is successful with 0 records.
This is case with all the extractors.
The data base is on Oracle 10.2
Observations for this:
0) Source system connection check OK.
1) When I test the same extract in RSA3, I can fetch some data there.
2) I could transfer the global settings.
3) I could not see any IDoc generated in BW or received in R/3.
4) No background job is generated in R/3 in SM37.
5) I could extract the data from another source system (SEM) instance based on 3.5 technology.
As progress on this issue, I could load the data successfully by the 3.x methodology (using update rules) but not by the BI 7.0 methodology (using transformations).
As a standard set by the client, we have to use the 7.0 methodology, so I still need to find a solution.
I have no clue what is going wrong or how to solve it. Please help me solve this issue.
Thanks in Advance,
PV
Message was edited by:
USERPV
I am not sure if you have followed all the necessary steps to do a data load to the InfoCube. I also wish I had more information about your system and the error message you are getting. A data load can fail for a variety of reasons -- depending on the BW version, system settings and the procedure you followed. Please use the data load monitor (transaction RSMO), identify the error message and take the necessary action.
-
Problem converting static data load mapping to MOLAP
Hi
as a prototyping exercise I am converting some of our ROLAP dimensions and their corresponding data load mappings (one for static data, i.e. a "-1" id to handle unknowns in the fact data, and one for real data coming from a table) to MOLAP.
The dimension itself converts and deploys correctly and the real data mapping also redeploys and executes correctly.
HOWEVER
my static data mapping will not execute successfully.
The mapping uses constants (ID = -1, NAME 'UNKNOWN' etc), not all attributes are linked (this has been tried). My column WH_ID which was the ROLAP surrogate key gets converted to VARCHAR2 as expected. Mapping does deploy cleanly.
The error I get is below. I have been banging my head on this for a couple of days and have tried searching the Net and Metalink to no avail. I'm hoping someone out there can help.
LOAD_STATIC_D_TRADER_IU
Warning
ORA-20101: 15:48:04 ***Error Occured in BUILD_DRIVER: In __XML_SEQUENTIAL_LOADER: In __XML_LOAD_ATTRS: Error loading attributes for hierarchy, D_TRADER.AW$NONE.HIERARCHY, level D_TRADER.TRADER.LEVEL, mapping group D_TRADER.TRADER.MAPGROUP1.DIMENSIONMAPGROUP. In __XML_LOAD_ATTRS_ITEM: In ___XML_LOAD_TEMPPRG: The SQL IMPORT command cannot convert from the TEXT type to the DECIMAL type.
TRUNCATE_LOAD=false
AW Execution status: Success
15:48:00 Started Build(Refresh) of MARTS Analytic Workspace.
15:48:00 Attached AW MARTS in RW Mode.
15:48:01 Started Loading Dimensions.
15:48:01 Started Loading Dimension Members.
15:48:01 Started Loading Dimension Members for D_TRADER.DIMENSION (1 out of 1 Dimensions).
15:48:03 Finished Loading Members for D_TRADER.DIMENSION. Added: 1. No Longer Present: 885.
15:48:03 Finished Loading Dimension Members.
15:48:03 Started Loading Hierarchies.
15:48:03 Started Loading Hierarchies for D_TRADER.DIMENSION (1 out of 1 Dimensions).
15:48:03 Finished Loading Hierarchies for D_TRADER.DIMENSION. 1 hierarchy(s) STANDARD Processed.
15:48:03 Finished Loading Hierarchies.
15:48:03 Started Loading Attributes.
15:48:03 Started Loading Attributes for D_TRADER.DIMENSION (1 out of 1 Dimensions).
15:48:04 Failed to Build(Refresh) MARTS Analytic Workspace.
15:48:04 ***Error Occured in BUILD_DRIVER: In __XML_SEQUENTIAL_LOADER: In __XML_LOAD_ATTRS: Error loading attributes for hierarchy, D_TRADER.AW$NONE.HIERARCHY, level D_TRADER.TRADER.LEVEL, mapping group D_TRADER.TRADER.MAPGROUP1.DIMENSIONMAPGROUP. In __XML_LOAD_ATTRS_ITEM: In ___XML_LOAD_TEMPPRG: The SQL IMPORT command cannot convert from the TEXT type to the DECIMAL type.
Hi, this looks like a bug in set-based mode when using numeric dimension attributes and loading them from a constant. Row-based mode is OK, which stages the data before loading the AW, but you probably don't want this.
A workaround is to add an expression operator in the map. You will have to add a link from a source table/constant into the expression operator to satisfy the map analyser. But then you can add expressions such as your numeric attributes in the expression operator's output group, define the values for each expression and map these expression outputs (not the numeric constants) into your dimension. Hopefully this makes sense.
Cheers
David -
Master Data load does not extract Hierarchy nodes in BPC Dimension ACCOUNT
Hi Experts,
I am performing a master data load through the standard DM package with the filter selection as:
1. Chart of Accounts
2. Hieararchy selection has 4 hierarchy names
3. Selected Import Text nodes
4. Selected Set Filters by Attribute OR Hierarchies
I have run this DM package for a set of data and selections a week ago and it worked fine.
However, when I run it now, it gives issues:
It extracts any new GL maintained in the BI system, however it does not extract any hierarchy nodes at all! (I have tested this by deleting the hierarchy nodes and trying to run the master data load.)
I am running the DM package in Update mode and have the selection as External.
Any suggestions for checks? Has anyone encountered this issue before?
Regards,
Shweta Salpe
Hi guys,
Thanks.
I found that the issue was with the transformation file where I was maintaining the RATETYPE.
When I removed the mapping of RATETYPE this works fine (it pulls the nodes of the hierarchies).
However, now I do not have RATETYPE populated in the system.
my rate type mapping is:
RATETYPE=*IF(ID(1:1)=*STR(C) then *STR(TOSKIP);ID(1:1)=*STR(H) then *STR(TOSKIP);ID)
and in conversion file i have TOSKIP *skip
I have to skip the ratetypes for the hierarchy nodes and my hierarchy nodes start with C and H.
So now that i have removed the mapping for RATETYPE can anyone suggest me a correct way to achieve this? (Note the above mapping formula was skipping all of the hierarchy nodes starting with C and H)
Regards,
Shweta Salpe -
Statistic on throughput of data loader utility
Hi All
Can you guys share some statistics on the throughput of the data loader utility? For a concrete number, consider 1 million records: how long would it take to import them?
I need these numbers to make a call on using the Web Service or the data loader utility. Any suggestion is appreciated.
Thank you.
It really depends on the object and the amount of data in there (both the number of fields you are mapping and how much data is in the table).
For example…
One of my clients has over 1.2M Accounts. It takes about 3 hours (multi-tenant) to INSERT 28k new customers, but when we were first doing it, it was sub-1-hour. Because the bulk loader is limited on the record count (most objects are limited to 30k records in the input file), you will need to break up your file accordingly.
But strangely, the “Financial Account” object (not normally exposed in the standard CRMOD), we can insert 30k records in about 30 min (and there are over 1M rows in that table). Part of this is probably due to the number of fields on the account and the address itself (remember it is a separate table in the underlying DB, even though it looks like there are two address sets of fields on the account).
The bulk loader and the wizard are roughly the same. However, the command-line approach doesn't allow for simultaneous INSERT/UPDATE (there are little tricks around this; it depends how you might prepare the extract files from your other system... an UPDATE file and an INSERT file, though some systems aren't able to extract this due to the way they are built).
Some objects you should be very careful with because the way the indexes are built. For example, ASSET and CONTACT both will create duplicates even when you have an “External Unique Id”. For those, we use web services. You aren’t limited to a file size there. I think (same client) we have over 800k ASSETS and 1.5M CONTACTS.
The ASSET load (via webservice which does both INSERT and UPDATE) typically can insert about 40k records in about 6 hours.
The CONTACT load (via webservice which does both INSERT and UPDATE) typically can insert about 40k records in about 10 hours.
Your best shot is to do some timings via the import wizard and do a little linear time increase as you increase the data size sitting in the tables.
My company (Hitachi Consulting) can help build these things (both automated bulk loaders and web services) if you are interested due to limited resource bandwidth or other factors. -
GL Account data load - Struggle , Pls help
Hi all ,
I am mapping the standard InfoObject 0GL_ACCOUNT to an R/3 field of type HKONT. When I try to activate the data loaded into the ODS, I get an error saying no SID exists for '0003223311'. When I check the contents of the ODS, I can see the GL account '0003223311'. In the R/3 table (data source), I see the GL account as '0003223311' when I select 'Standard list' in the user parameters. However, when I tick the option 'Check conversion exits', I see the GL account as '3223311'.
Please help me resolve this issue.
Thanks all
When I look at the cost center SID table, I see that the key for the table is a combination of cost center and controlling area. Does that mean that the corresponding SIDs are for the combination value? For example: in the cost center table I have a SID of 'xxxx' for the cost center '9999' and CO area 'abc' combination.
Now I have an ODS which has only the cost center (9999) filled, therefore when I try to activate the data in the ODS it tells me 'SID not found for cost center 9999'. Is that because the key to the SID table is a combination of cost center and CO area? Do I need to have both populated in my ODS?
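On the original SID problem: HKONT uses the ALPHA conversion exit, so the internal (master data) format is zero-padded to 10 characters, while the external display is '3223311'. If the ODS receives the value in the external format, the SID lookup fails. A minimal sketch of a transfer/update-rule routine that forces the internal format follows; the source field name `tran_structure-hkont` is a placeholder, not taken from the post:

```abap
* Sketch, assuming the target InfoObject (0GL_ACCOUNT) uses the ALPHA
* conversion exit like HKONT does. CONVERSION_EXIT_ALPHA_INPUT pads
* purely numeric values with leading zeros: '3223311' -> '0003223311'.
DATA: l_gl_account TYPE hkont.

CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = tran_structure-hkont   " placeholder source field name
  IMPORTING
    output = l_gl_account.

result = l_gl_account.
```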
Incremental Data loading in ASO 7.1
HI,
As per the 7.1 essbase dbag
"Data values are cleared each time the outline is changed structurally. Therefore, incremental data loads are supported
only for outlines that do not change (for example, logistics analysis applications)."
That means we can have incremental loading for ASO in 7.1 for an outline which doesn't change structurally. Now, what does it mean for an outline to change structurally? If we add a level 0 member in any dimension, does that mean a structural change to the outline?
It also says that adding Accounts/Time members doesn't clear out the data; only adding/deleting/moving a standard dimension member will clear out the data. I'm totally confused here. Can anyone please explain?
The following actions cause Analytic Services to restructure the outline and clear all data:
● Add, delete, or move a standard dimension member
● Add, delete, or move a standard dimension
● Add, delete, or move an attribute dimension
● Add a formula to a level 0 member
● Delete a formula from a level 0 member
Edited by: user3934567 on Jan 14, 2009 10:47 PM
Adding a Level 0 member is generally, if not always, considered to be a structural change to the outline. I'm not sure if I've tried to add a member to Accounts to see if the data is retained. This may be true because, by definition, the Accounts dimension in an ASO cube is a dynamic (versus stored) hierarchy. And perhaps since the Time dimension in ASO databases in 7.x is the "compression" dimension, there is some sort of special rule about being able to add to it -- although I can't say that I ever need to edit the Time dimension (I have a separate Years dimension). I have been able to modify formulas on ASO outlines without losing the data -- which seems consistent with your bullet points below. I have also been able to move around and change attribute dimension members (which I would guess is generally considered a non-structural change), and change aliases without losing all my data.
In general I just assume that I'm going to lose my ASO data. However, all of my ASO outlines are generated through EIS, and I load to a test server first. If you're in doubt about losing the data, try it in test/dev. And if you don't have test/dev, maybe that should be a priority. :) Hope this helps -- Jason.
Data load through DTP giving Error while calling up FM RSDRI_INFOPROV_READ
Hi All
We are trying to load data into a Cube through a DTP from a DSO. In the transformation, we look up InfoCube data through the SAP standard function module 'RSDRI_INFOPROV_READ'. The problem we are facing is that our loads are failing with the error 'Unknown error in SQL Interface' and a parallel process error.
In the DTP, we have changed the number of parallel processes from 3 (the default) to 1, but the issue with the data loads still exists.
We had a similar flow developed in 3.5 (the BW 3.5 way) where we used the function module 'RSDRI_INFOPROV_READ', and there our data loads run fine.
We suspect a compatibility issue between this FM and BI 7.0 data flows, but we are not sure. If anybody has relevant input on this, or has used this FM in a BI 7.0 flow, please let me know.
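For context, this is a minimal sketch of how RSDRI_INFOPROV_READ is typically called in a packaged loop from a routine. The InfoProvider name 'ZSALES_C01', the characteristic/key figure names, and the flat structure ZSALES_READ_S that the result table is typed against are illustrative assumptions, not from the original post; verify the exact FM interface in SE37 on your release.

```abap
* Hedged sketch of a packaged RSDRI_INFOPROV_READ call.
* ZSALES_C01 / ZSALES_READ_S / 0MATERIAL / 0AMOUNT are assumptions.
DATA: lt_sfc   TYPE rsdri_th_sfc,
      ls_sfc   TYPE rsdri_s_sfc,
      lt_sfk   TYPE rsdri_th_sfk,
      ls_sfk   TYPE rsdri_s_sfk,
      lt_range TYPE rsdri_t_range,
      lt_data  TYPE STANDARD TABLE OF zsales_read_s, " MATERIAL + AMOUNT
      lv_end   TYPE char1,
      lv_first TYPE char1 VALUE 'X'.

ls_sfc-chanm    = '0MATERIAL'.   " characteristic to read
ls_sfc-chaalias = 'MATERIAL'.    " field name in the result structure
INSERT ls_sfc INTO TABLE lt_sfc.

ls_sfk-kyfnm    = '0AMOUNT'.     " key figure to read
ls_sfk-kyfalias = 'AMOUNT'.
ls_sfk-aggr     = 'SUM'.
INSERT ls_sfk INTO TABLE lt_sfk.

WHILE lv_end IS INITIAL.
  CALL FUNCTION 'RSDRI_INFOPROV_READ'
    EXPORTING
      i_infoprov    = 'ZSALES_C01'
      i_th_sfc      = lt_sfc
      i_th_sfk      = lt_sfk
      i_t_range     = lt_range     " empty = no selection restriction
      i_packagesize = 50000
    IMPORTING
      e_t_data      = lt_data
      e_end_of_data = lv_end
    CHANGING
      c_first_call  = lv_first
    EXCEPTIONS
      illegal_input   = 1
      inherited_error = 2
      OTHERS          = 3.
  IF sy-subrc <> 0.
    " errors such as 'Unknown error in SQL Interface' surface here
    EXIT.
  ENDIF.
  " ... process the current package in lt_data ...
ENDWHILE.
```

Since serializing the DTP (parallel processes 3 to 1) did not help in the case above, the root cause is more likely system-side than in this calling pattern itself.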
Thanks in advance.
Kind Regards
Swapnil

Hello Swapnil,
Please check SAP Note 979660, which mentions this issue.
Thanks,
Walter Oliveira.
Data load into SAP ECC from Non SAP system
Hi Experts,
I am very new to BODS, and I want to load historical data from a non-SAP source system into SAP R/3 tables such as VBAK and VBAP using BODS. Can you please provide steps/documents or guidelines on how to achieve this?
Regards,
Monil

Hi,
In order to load into SAP you have the following options:
1. Use IDocs. There are several standard IDocs in ECC for specific objects (MATMAS for materials, DEBMAS for customers, etc.). You can generate and send IDocs as messages to the SAP target using BODS.
2. Use LSMW programs to load into the SAP target. These programs require input files in specific layouts, which can be generated using BODS.
3. Direct input. The direct input method is to write ABAP programs targeting specific tables. This approach is very complex, and hence a lot of thought needs to go into it.
The OSS Notes supplied in previous messages are all excellent guidance to steer you in the right direction on the choice of load, etc.,
However, the data load into SAP needs to be object specific. So merely targeting the sales tables will not help: the sales document data held in the VBAK and VBAP tables you mentioned relates to articles, and these tables hold sales document data for articles that have already been created. So if you specifically want to target these tables, you may need to prepare an LSMW program for the purpose.
To answer your question on whether it is possible to load objects like materials, customers, vendors, etc. using BODS: yes, you can.
Below is a standard list of IDocs that you can use for this purpose to load into SAP ECC system from a non SAP system.
Customer Master - DEBMAS
Article Master - ARTMAS
Material Master - MATMAS
Vendor Master - CREMAS
Purchase Info Records (PIR) - INFREC
The list is endless...
In order to achieve this, you will need the functional design consultants to provide ETL mapping from the legacy data to the IDoc target schema and fields (ideally with the technical table names and fields too). You should then prepare the data, putting it through the standard check-table validations for each object along with any business-specific conversion rules and validations. Having prepared this data, you can either generate flat-file output for loading into SAP using LSMW programs, or generate IDoc messages to the target SAP system.
If you are going to post IDocs directly into the SAP target using BODS, you will need to create a partner profile for BODS to send IDocs, and define the IDocs you need as inbound IDocs. There are a few more settings, such as RFC connectivity and authorizations, required for BODS to successfully send IDocs into the SAP target.
Do let me know if you need more info on any specific queries or issues you may encounter.
kind regards
Raghu