Troubleshooting 9.3.1 Data Load
I am having a problem with a data load into Essbase 9.3.1 from a flat, pipe-delimited text file, loading via a load rule. I can see an explicit record in the file but the results on that record are not showing up in the database.
* I made a one-off file containing only the record in question; the data loads properly and is reflected in the database. The record itself seems to parse properly for load.
* I have searched the entire big file (230 MB) for the same member combination, but come up with only this one record, so it does not appear to be a "last value in wins" issue.
* Most other data (610k+ rows) appears to load properly, so the fields are, in general, being parsed correctly by the load rule. Additionally, the months of a given item are on separate rows, and other rows for the same item are loading properly and are reflected in the database. Other items also load properly into the months this record targets, so it is not a missing-metadata issue.
* The load is 100% successful according to the non-existent error file. Also, loading the file interactively results in the file showing up under "loaded successfully" (no errors).
NOTE:
The file's last column contains item descriptions, which may include special characters such as periods and quotes. The load rule moves the description field earlier in the column order, but in the file itself it is last.
QUESTION:
Is it possible that a special character (a quote?) in a preceding record is causing the field parsing to include the CR/LF, and therefore the next record, in a single record? I keep thinking that if the record seems fine alone, but is not fine where it sits among other records, the problem may have to do with preceding or subsequent records.
THOUGHTS??
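The suspicion above can be reproduced outside Essbase. Here is a minimal Python sketch (using the generic csv module, not Essbase's parser, so it only illustrates the mechanism, and the record contents are invented) of how an unbalanced quote makes a quote-aware parser swallow the CR/LF and the following record:

```python
import csv
import io

# Three pipe-delimited records; the second has an unterminated quote in
# its description field, so the parser keeps reading past the line break.
raw = (
    'P100|Jan|100|"Widget, large"\n'
    'P101|Jan|200|"Broken desc\n'      # opening quote never closed
    'P102|Jan|300|Next item\n'         # absorbed into the previous record
)

rows = list(csv.reader(io.StringIO(raw), delimiter="|", quotechar='"'))
print(len(rows))  # 2 records parsed, not 3: the third line vanished
```

Whether the Essbase load-rule parser behaves exactly this way depends on its quoting settings; the sketch only shows why a stray quote is a plausible culprit for a silently disappearing record.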
Thanks Glenn. I was so busy looking for explicit members that I neglected to think through implicit members. I had assumed that implied members don't work if you have a rules file that parses out columns, and that a missing member would simply error out a record instead of using the last known value. In fact, I thought that (last known value) only worked if you didn't use a load rule.
I would prefer some switch in Essbase that requires keys in all fields in a load rule or allows last known value.
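The implicit-member behavior that bit me can be sketched as follows. This is a hypothetical Python model of "last known value", not Essbase code, and the field names are invented:

```python
# Hypothetical model of Essbase's implicit ("last known value") members:
# a blank key field inherits the member from the previous record.
def fill_implicit_members(records):
    last_seen = {}
    filled = []
    for record in records:
        row = {}
        for field, value in record.items():
            if value:
                last_seen[field] = value       # explicit member: remember it
            row[field] = last_seen.get(field)  # blank: reuse last known value
        filled.append(row)
    return filled

rows = [
    {"Item": "A100", "Month": "Jan", "Data": "10"},
    {"Item": "",     "Month": "Feb", "Data": "20"},  # Item implied as A100
]
print(fill_implicit_members(rows)[1]["Item"])  # A100
```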
Similar Messages
-
Troubleshooting data load procedure
Hi folks,
I am trying to load data into a model built in cube builder. My data load procedure, which is for a single variable, is failing. Here are details:
Dim structure as follows:
DIM CHANNEL:
SALES_CHANNEL (Output)
SUB_CHANNEL (Input)
DIM BUSINESS_UNIT:
BUS_UNIT (Output)
BDM (Input)
Data is in an MS Access table called FACT_NETSALES, with columns SALES_CHANNEL, SUB_CHANNEL, BUS_UNIT, BDM, and time ranging from Oct 2009 to November 2010.
Data sample:
SALES_CHANNEL SUB_CHANNEL BUS_UNIT BDM Oct-09 Nov-09 Dec-09
CH1 CH1SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH2 CH2SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH2 CH2SUB2 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH2 CH2SUB3 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH3 CH3SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH4 CH4SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH5 CH5SUB1 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH5 CH5SUB2 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH5 CH5SUB3 BUS1 BUS1BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS2 BUS2BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS3 BUS3BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS3 BUS3BDM2 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS4 BUS4BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS5 BUS5BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS6 BUS6BDM1 999999.99 999999.99 999999.99
CH6 CH6SUB1 BUS7 BUS7BDM1 999999.99 999999.99 999999.99
Here's the procedure:
CLEAR STATUS
USE INITIAL RETAIN
SET Control DB_Name CLIENTTEST
USE &DB_Name UPDATE
CHE UPD
SET Control App_Periodicity MONTHLY
SET Control App_Period October 2009 - November 2010
SET DATE MDY
SET &App_Periodicity
SET PERIOD &App_Period
SET Control App_FACT_Table FACT_NETSALES
SET Control DW_Link LNK_CLIENT
SELECT VAR KPI1_ACT
SELECT CHANNEL
SELECT BUSINESS_UNIT
ACROSS TIME DOWN CHANNEL, BUSINESS_UNIT, VAR
ACCESS LSLINK
CONNECT &DW_Link
BEGIN
SELECT
SALES_CHANNEL,
SUB_CHANNEL,
BUS_UNIT,
BDM,
DATE
FROM &App_FACT_Table
END
Peek only 10
LSS CREATE CHANNEL = SALES_CHANNEL
READ
END
The error I get is:
[Microsoft][ODBC Microsoft Access Driver] Too few parameters. Expected 1.
SQLSTATE: 07001
SQL System code: -3010
What's wrong with the procedure?
Thanks in advance for your help!
I figured it out. I was selecting two levels of a single dimension in the SQL statement.
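For future readers: the Access ODBC driver raises "Too few parameters. Expected N" when the SQL references N identifiers it cannot resolve against the table; each unknown name is treated as a query parameter. A hedged Python sketch of that diagnostic (the schema here is invented for illustration):

```python
# The Access ODBC driver treats every column name it cannot resolve as a
# query parameter, so "Expected 1" means exactly one identifier is unknown.
def unknown_identifiers(select_columns, table_columns):
    return [c for c in select_columns if c not in table_columns]

# Hypothetical schema where SALES_CHANNEL is a model level, not a column.
table_columns = {"SUB_CHANNEL", "BUS_UNIT", "BDM", "DATE"}
select_columns = ["SALES_CHANNEL", "SUB_CHANNEL", "BUS_UNIT", "BDM", "DATE"]

print(unknown_identifiers(select_columns, table_columns))  # ['SALES_CHANNEL']
```

Checking the SELECT list against the actual table columns before running the procedure would have pointed straight at the extra dimension level.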
-
Data load problem - BW and Source System on the same AS
Hi experts,
I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
BW is working on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the data sources from this source system, and everything worked fine.
Now I want to load data from SRM (client 100) into BW (client 001) using standard data sources and extractors. To do this, I've created an InfoPackage on one standard metadata data source (which has data, checked through RSA3 on the client 100 source system). I've started the data load process, but the monitor says that no IDocs arrived from the source system and keeps the status yellow forever.
Additional information:
<u><b>BW Monitor Status:</b></u>
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System Response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
No Idocs arrived from the source system.
<b><u>BW Monitor Details:</u></b>
0 from 0 records
but there are 2 records on RSA3 for this data source
Overall status: Missing messages or warnings
- Requests (messages): Everything OK
o Data request arranged
o Confirmed with: OK
- Extraction (messages): Missing messages
o Missing message: Request received
o Missing message: Number of sent records
o Missing message: Selection completed
- Transfer (IDocs and TRFC): Missing messages or warnings
o Request IDoc: sent, not arrived ; Data passed to port OK
- Processing (data packet): No data
<b><u>Transactional RFC (sm58):</u></b>
Function Module: IDOC_INBOUND_ASYNCHRONOUS
Target System: SRMCLNT100
Date Time: 08.03.2006 14:55:56
Status text: No service for system SAPSRM, client 001 in Integration Directory
Transaction ID: C8C415C718DC440F1AAC064E
Host: srm
Program: SAPMSSY1
Client: 001
Rpts: 0000
<b><u>System Log (sm21):</u></b>
14:55:56 DIA 000 100 BWREMOTE D0 1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
Documentation for system log message D0 1 :
The transaction has been terminated. This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction. The actual reason for the termination is indicated by the T100 message and the parameters.
Additional documentation for message IDOC_ADAPTER 601: "No service for system &1, client &2 in Integration Directory". No documentation exists for message 601.
<b><u>RFC Destinations (sm59):</u></b>
Both RFC destinations look fine, with connection and authorization tests successful.
<b><u>RFC Users (su01):</u></b>
BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
Source System: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
Could someone help?
Thanks,
Guilherme
Guilherme,
I don't see any reason why it's not bringing the data in. Are you doing a full extraction or a delta? If delta, please check whether the extractor is delta-enabled; sometimes this causes problems.
Also check this weblog on basic checks for data load errors; it may help:
/people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
Thanks
Sat -
BI 7.0 data load issue: InfoPackage can only load data to PSA?
BI 7.0 backend extraction gurus,
We created a generic datasource on R3 and replicated it to our BI system, created an InfoSource, the Transformation from the datasource to the InfoSource, an ODS, the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" is also created under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to conduct the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked and all the others, like 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS can't be selected as a target! There are also some new columns on this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under the column 'DTP(S) are active and load to this target' there is an inactive icon, which is weird since we have already activated the Data Transfer Process! Anyway, we started the data load in the InfoPackage, and the monitor shows the records are brought in, but since 'Only PSA' is checked with all the others dimmed, no data goes to the ODS! Why in BI 7.0 can 'Only PSA' be checked with all the others dimmed?
Many new features in BI 7.0! Anyone's idea/experience on how to load data in BI 7.0 is greatly appreciated!
You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load of the PSA.
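That full-versus-delta behavior of a DTP reading from the PSA can be sketched like this (a simplified Python model; the request contents are invented):

```python
# The PSA holds data request by request; a full DTP reads every request,
# while a delta DTP reads only the requests not yet transferred.
psa = [["rec1", "rec2"], ["rec3"]]   # two requests sitting in the PSA
already_transferred = 1              # the first request was loaded earlier

full_load = [rec for request in psa for rec in request]
delta_load = [rec for request in psa[already_transferred:] for rec in request]

print(full_load, delta_load)  # ['rec1', 'rec2', 'rec3'] ['rec3']
```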
Go through these links for lucid explanations:
Infopackage -
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Creating DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
<b>Pre-requisite-</b>
You have used transformations to define the data flow between the source and target object.
Creating transformations-
http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
Hope it Helps
Chetan
@CP.. -
Issue:
I have SAP BW system and SAP HANA System
SAP BW to SAP HANA connecting through a DB Connection (named HANA)
Whenever I create an Open Hub destination of type DB Table using the DB connection, the table is created at the HANA schema level ( L_F50800_D ).
I executed the Open Hub service without checking the 'Deleting Data from Table' option.
Data loaded 16 records from BW to HANA, matching the source.
The second time I executed it from BW to HANA, 32 records came (it appends).
Then I executed the Open Hub service with the 'Deleting Data from Table' option checked.
Now I am getting a short dump: DBIF_RSQL_TABLE_KNOWN.
Loading from SAP BW to SAP BW works fine.
Does this option work through a DB connection or not?
Please see the attachment along with this discussion and help me resolve this.
From
Santhosh Kumar
Hi Ramanjaneyulu,
First of all, thanks for the reply.
The issue is at the Open Hub level (definition level: DESTINATION tab and FIELD DEFINITION). There is a check box there that I have already selected; that is exactly my issue: even though it is selected, the deletion is not performed at the target level.
SAP BW to SAP HANA via DB connection:
1. First run from BW: suppose 16 records. DTP executed, loaded to HANA: 16, the same.
2. Second run from BW: the HANA side appended, so 16 + 16 = 32.
3. So I selected the check box at the Open Hub level, 'Deleting data from table'.
4. Now executing the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
Now please tell me how to resolve this. Does the 'Deleting data from table' option apply to HANA?
Thanks
Santhosh Kumar -
Hi,
I have a question regarding data loads. We have a process chain which includes 3 ODSs and a cube.
Basically, ODS A gets loaded from R/3, and then from ODS A it loads into two other ODSs (ODS B, ODS C) and CUBE A.
When I went to the monitor screen of this load (ODS A -> ODS B, ODS C, CUBE A), the total time shows as 24 minutes.
We have some other steps in the process chain: ODS B -> ODS C, ODS C -> CUBE 1.
When I go to the monitor screen of these data loads, the total time shows as 40 minutes.
I am surprised, because the total run time for the chain itself is 40 minutes, and the chain includes data extraction from R/3, ODS activations, indexes, and so on.
Can anybody throw some light on this?
Thank you all
Edited by: amrutha pal on Sep 30, 2008 4:23 PM
Hi All,
I am not asking which steps need to be included in which chain.
My question is: when you look at the process chain run time, it says the total time is 40 minutes, and when you go to RSMO to check the time taken for the data load from the ODS to the 3 other data targets, it also shows 40 minutes.
The process chain also includes ODS activation, building indexes, and extracting data from R/3.
So what are the times we see when we click on a step in the process chain and display its messages, and what is the time we see in RSMO?
Let's take a example:
In Process chain A- there is step LOAD DATA- from ODS A----> ODS B,ODS C,Cube A.
When I right click on the display messages for successful load it shows all the messages like
Job started.....
Job ended.....
The total time here shows 15 minutes.
When I go to RSMO for the same step, it shows 30 minutes.
I am confused...
Please help me??? -
Master data load failed: error "Update mode R is not supported by the extraction API"
Hello Experts,
I use to load master data for 0Customer_Attr though daily process chain, it was running successfully.
For last 2 days master data loading for 0Customer_Attr got failed and it gives following error message:
"Update mode R is not supported by the extraction API"
Can anyone tell me what is that error for? how to resolve this issue?
Regards,
Nirav
Hi,
The update mode R error comes up in the following case:
You are running a delta (for master data) which fails due to some error. To resolve that error, you set the load to red and try to repeat the load.
This time the load fails with update mode R, as a repeat delta is not supported.
So now the only thing you can do is re-init the delta (as described in the posts above) and then proceed. The earlier problem has nothing to do with update mode R.
For example, say your first delta failed with a replication issue. Replicating and repeating alone will not solve the update mode R error; you will have to do both: replicate the data source and re-init the delta.
One more thing I would like to add:
If the delta failed with an error the first time (not update mode R), then you have to do an init with data transfer.
If it failed without picking any records, then do an init without data transfer.
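The recovery steps above can be condensed into a small sketch (a simplification for clarity; the real recovery happens via InfoPackage settings, not code):

```python
# Condensed repair decision for a failed master-data delta load.
def repair_action(failed_with_update_mode_r, records_were_picked):
    if failed_with_update_mode_r:
        # A repeat delta is unsupported: replicate and re-init the delta.
        return "replicate datasource and re-init delta"
    if records_were_picked:
        return "init with data transfer"
    return "init without data transfer"

print(repair_action(True, False))   # replicate datasource and re-init delta
```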
Hope this helps
Regards
Shilpa
Edited by: Shilpa Vinayak on Oct 14, 2008 12:48 PM -
CALL_FUNCTION_CONFLICT_TYPE Standard Data loading
Hi,
I am facing a data loading problem using Business content on CPS_DATE infocube (0PS_DAT_MLS datasource).
The R/3 extraction processes without any error, but the problem occurs in the update rules while updating the milestone date. Please find hereunder the log from the ST22.
The really weird thing is that the process works perfectly in the development environment but not in the integration one (the patch levels are exactly the same: BW 3.5 Patch #16).
I apologise for the long message below... this is a part of the system log.
For information the routine_0004 is a standard one.
Thanks a lot in advance!
Cheers.
CALL_FUNCTION_CONFLICT_TYPE
Except. CX_SY_DYN_CALL_ILLEGAL_TYPE
Symptoms. Type conflict when calling a function module
Causes Error in ABAP application program.
The current ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" had to be terminated because one of the statements could not be executed.
This is probably due to an error in the ABAP program.
A function module was called incorrectly.
Error analysis
An exception occurred. This exception is dealt with in more detail below
. The exception, which is assigned to the class 'CX_SY_DYN_CALL_ILLEGAL_TYPE', was neither caught nor passed along using a RAISING clause, in the procedure "ROUTINE_0004"
"(FORM)" .
Since the caller of the procedure could not have expected this exception
to occur, the running program was terminated. The reason for the exception is:
The call to the function module "RS_BCT_TIMCONV_PS_CONV" is incorrect:
The function module interface allows you to specify only fields of a particular type under "E_FISCPER".
The field "RESULT" specified here is a different field type.
How to correct the error.
You may be able to find an interim solution to the problem in the SAP note system. If you have access to the note system yourself, use the following search criteria:
"CALL_FUNCTION_CONFLICT_TYPE" "CX_SY_DYN_CALL_ILLEGAL_TYPE"
"GP420EQ35FHFOCVEBCR6RWPVQBR" or "GP420EQ35FHFOCVEBCR6RWPVQBR"
"ROUTINE_0004"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)". 2. A suitable printout of the system log To obtain this, call the system log through transaction SM21. Limit the time interval to 10 minutes before and 5 minutes after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs, supply the source code.
To do this, select the Editor function "Further Utilities-> Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
The exception must either be prevented, caught within the procedure
"ROUTINE_0004"
"(FORM)", or declared in the procedure's RAISING clause.
To prevent the exception, note the following:
Environment system SAP Release.............. "640"
Operating system......... "SunOS" Release.................. "5.9"
Hardware type............ "sun4u"
Character length......... 8 Bits
Pointer length........... 64 Bits
Work process number...... 2
Short dump setting....... "full"
Database type............ "ORACLE"
Database name............ "BWI"
Database owner........... "SAPTB1"
Character set............ "fr"
SAP kernel............... "640"
Created on............... "Jan 15 2006 21:42:36"
Created in............... "SunOS 5.8 Generic_108528-16 sun4u"
Database version......... "OCI_920 "
Patch level.............. "109"
Patch text............... " "
Supported environment....
Database................. "ORACLE 9.2.0.., ORACLE 10.1.0.., ORACLE 10.2.0.."
SAP database version..... "640"
Operating system......... "SunOS 5.8, SunOS 5.9, SunOS 5.10"
SAP Release.............. "640"
The termination occurred in the ABAP program "GP420EQ35FHFOCVEBCR6RWPVQBR" in
"ROUTINE_0004".
The main program was "RSMO1_RSM2 ".
The termination occurred in line 702 of the source code of the (Include)
program "GP420EQ35FHFOCVEBCR6RWPVQBR"
of the source code of program "GP420EQ35FHFOCVEBCR6RWPVQBR" (when calling the editor 7020).
Processing was terminated because the exception "CX_SY_DYN_CALL_ILLEGAL_TYPE" occurred in the procedure "ROUTINE_0004" "(FORM)" but was not handled locally, not declared in the RAISING clause of the procedure.
The procedure is in the program "GP420EQ35FHFOCVEBCR6RWPVQBR ". Its source code starts in line 685 of the (Include) program "GP420EQ35FHFOCVEBCR6RWPVQBR ".
672 'ROUTINE_0003' g_s_is-recno
673 rs_c_false rs_c_false g_s_is-recno
674 changing c_abort.
675 catch cx_foev_error_in_function.
676 perform error_message using 'RSAU' 'E' '510'
677 'ROUTINE_0003' g_s_is-recno
678 rs_c_false rs_c_false g_s_is-recno
679 changing c_abort.
680 endtry.
681 endform.
682 ************************************************************************
683 * routine no.: 0004
684 ************************************************************************
685 form routine_0004
686 changing
687 result type g_s_hashed_cube-FISCPER3
688 returncode like sy-subrc
689 c_t_idocstate type rsarr_t_idocstate
690 c_subrc like sy-subrc
691 c_abort like sy-subrc. "#EC *
692 data:
693 l_t_rsmondata like rsmonview occurs 0 with header line. "#EC *
694
695 try.
696 * init variables
697 move-corresponding g_s_is to comm_structure.
698
699 * fill the internal table "MONITOR", to make monitor entries
700
701 * result value of the routine
>>>> CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
703 EXPORTING
704 I_TIMNM_FROM = '0CALDAY'
705 I_TIMNM_TO = '0FISCPER'
706 I_TIMVL = COMM_STRUCTURE-CALDAY
707 I_FISCVARNT = gd_fiscvarnt
708 IMPORTING
709 E_FISCPER = RESULT.
710 * if the returncode is not equal zero, the result will not be updated
711 RETURNCODE = 0.
712 * if abort is not equal zero, the update process will be canceled
713 ABORT = 0.
714
715 catch cx_sy_conversion_error
716 cx_sy_arithmetic_error.
717 perform error_message using 'RSAU' 'E' '507'
718 'ROUTINE_0004' g_s_is-recno
719 rs_c_false rs_c_false g_s_is-recno
720 changing c_abort.
721 catch cx_foev_error_in_function.
System zones content
Name Val.
SY-SUBRC 0
SY-INDEX 2
SY-TABIX 0
SY-DBCNT 0
SY-FDPOS 65
SY-LSIND 0
SY-PAGNO 0
SY-LINNO 1
SY-COLNO 1
SY-PFKEY 0400
SY-UCOMM OK
SY-TITLE Moniteur - Atelier d'administration
SY-MSGTY E
SY-MSGID RSAU
SY-MSGNO 583
SY-MSGV1 BATVC 0000000000
SY-MSGV2 0PROJECT
SY-MSGV3
SY-MSGV4
Selected variables
No. 23 Type FORM
Name ROUTINE_0004
GD_FISCVARNT
22
00 RS_C_INFO I
4
9
COMM_STRUCTURE-CALDAY
20060303
33333333
20060303
SYST-REPID GP420EQ35FHFOCVEBCR6RWPVQBR 4533345334444454445355555452222222222222 704205135686F365232627061220000000000000
RESULT
000
333
00
You have an update routine in which you are calling the FM 'RS_BCT_TIMCONV_PS_CONV'. The parameter E_FISCPER must have the same type as the variable you pass (you can see the data type in the FM definition, transaction SE37). You should do something like the following:
DATA: var TYPE <the same type as E_FISCPER in the FM definition>.
CALL FUNCTION 'RS_BCT_TIMCONV_PS_CONV'
EXPORTING
I_TIMNM_FROM = '0CALDAY'
I_TIMNM_TO = '0FISCPER'
I_TIMVL = COMM_STRUCTURE-CALDAY
I_FISCVARNT = gd_fiscvarnt
IMPORTING
E_FISCPER = var.
result = var.
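The fix pattern, restated as a hedged Python sketch: receive the FM's output into a variable of the type the interface declares, then derive your differently typed routine result from it. (RS_BCT_TIMCONV_PS_CONV is ABAP; the conversion logic below is a simplified stand-in, and the period layout is an assumption. Note the dump shows RESULT typed as FISCPER3, a 3-character period, while E_FISCPER is the full fiscal period.)

```python
def rs_bct_timconv_ps_conv(i_timvl: str) -> str:
    """Stand-in for the FM: convert 0CALDAY (YYYYMMDD) to 0FISCPER (YYYYPPP)."""
    year, month = i_timvl[:4], int(i_timvl[4:6])
    return f"{year}{month:03d}"

# Correctly typed intermediate first, then derive the 3-char FISCPER3 result.
e_fiscper = rs_bct_timconv_ps_conv("20060303")
result = e_fiscper[-3:]
print(e_fiscper, result)  # 2006003 003
```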
-
Data load stuck from DSO to Master data Infoobject
Hello Experts,
We have this issue where data load is stuck between a DSO and master data infoobject
Data uploads from DSO( std) to master data infoobject.
This Infoobject has display and nav attributes in it which are mapped from DSO to Infoobject.
Now we have added a new infoobject as attribute to the master data infoobject and made it as NAV attri.
Now when we are doing full load via DTP the load is stuck and is not processing.
Earlier it took only 5 mns of time to complete the full load.
Please advise what could be the reason and cause behind this.
Regards,
Santhosh
Hello guys,
Thanks for the quick response.
But nothing is proceeding further; the request is still running.
Earlier this same data loaded in 5 minutes.
Please find the screenshot.
Master data for the infoobjects is loaded as well.
In SM50 I can see the process sitting at the P table of the infoobject.
Please advise.
Please find the detials
Updating attributes for InfoObject YCVGUID
Start of Master Data Update
Check Duplicate Key Values
Check Data Values
Process time dependent attributes- green.
No Message: Process Time-Dependent Attributes- yellow
No Message: Generates Navigation Data- yellow
No Message: Update Master Data Attributes - yellow
No Message: End of Master Data Update - yellow
and nothing is going further in Sm37
Thanks,
Santhosh. -
How to create a report in bex based on last data loaded in cube?
I have to create a query with a predefined filter based on the "latest SAP date", i.e. the user only wants to see the very latest situation from the last load. The report should show only the latest inventory stock situation from the last load. As I'm new to BEx, I am not able to find a way to achieve this. Is there any time characteristic which holds the last update date of a cube? Please help and suggest how to achieve this.
Thanks in advance.
Hi Rajesh,
Thanks for your suggestion.
My requirement is a little different. I built the query on a multiprovider, and I want to see the latest record in the report based only on the latest date (not the system date) when data was last loaded to the cube. This date (when the cube was last loaded with data) is not populated from any data source. I guess I have to add the "0TCT_VC11" cube to my multiprovider to fetch the date when my cube was last loaded with data. Please correct me if I'm wrong.
Thanx in advance. -
Data load from DSO to cube fails
Hi Gurus,
The data loads failed last night, and when I dig inside the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
The DSO has been loaded without errors from ECC.
The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'."
I looked in the PSA; it has about 50,000 records, all data packages have a green light, and all amounts have 0CURRENCY assigned.
I went into the DTP and looked at the error stack; it had nothing in it. Then I changed the error handling option from 'no update, no reporting' to 'valid records update, no reporting (request red)' and executed; the error stack then showed 101 records.
The ZARAMT field has 0CURRENCY blank for all these records.
I tried to assign USD to them, the changes were saved, and I tried to execute again, but then the message says that the request ID should be repaired or deleted before the execution. I tried to repair it; it says it cannot be repaired, so I deleted it and executed. It fails, and the error stack still shows 101 records. When I look at the records, the changes I made no longer exist.
If I delete the request ID before changing the records and then try to save the changes, they don't get saved.
What should I do to resolve the issue.
thanks
Prasad
Hi Prasad,
The error stack is request-specific. Once you delete the request from the target, the data in the error stack is also deleted.
What you should actually do in this case is:
1) Change the error handling option to 'valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
2) Correct the erroneous records in the error stack.
3) In the DTP, on the Update tab, you will find the option "Error DTP". If it has not been created yet, you will see the option "Create Error DTP"; click there and execute the error DTP. The error DTP will fetch the records from the error stack and create a new request in the target.
4) Then manually change the status of the original request to green.
But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check in the source system and fix it for a permanent solution.
Regards,
Debjani...... -
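The four steps above amount to a park-correct-reload pattern. A minimal Python sketch of the idea (the record layout and currency values are invented):

```python
# Error-stack pattern: invalid records are parked, corrected, then loaded
# by a second run (the "error DTP"), while valid records load normally.
records = [
    {"key": 1, "amount": 100, "currency": "USD"},
    {"key": 2, "amount": 200, "currency": ""},    # blank 0CURRENCY: invalid
]

valid = [r for r in records if r["currency"]]
error_stack = [r for r in records if not r["currency"]]  # step 1: park

for r in error_stack:         # step 2: correct inside the stack
    r["currency"] = "USD"

target = valid + error_stack  # step 3: the error DTP loads the corrections
print(len(target))            # 2
```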
Hi,
I have a question,
In the book TBW10 I read about the data load from DSO to InfoCube:
"We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
My question is: the cube already has the value 10, so if we are sending 10, -10, and 30 (delta), shouldn't the total be 40 instead of 30?
Can someone please explain?
Thanks
No, it will not be 40; it will be 30 only.
Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value in the after image, 30, will be added.
So it will be: 10 - 10 + 30 = 30.
Thank-You.
Regards,
Vinod -
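The arithmetic can be written out explicitly: the cube receives only the new delta (-10 and +30), because the original +10 is already in the cube from the first load.

```python
cube = 0
first_delta = [10]          # initial after-image sent to the cube
cube += sum(first_delta)    # cube now holds 10

second_delta = [-10, 30]    # before image reverses 10, after image adds 30
cube += sum(second_delta)   # 10 - 10 + 30

print(cube)  # 30, not 40
```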
Data load from DSO to Cube in BI7?
Hi All,
We just migrated a dataflow from 3.5 to 7 in development and moved it to production. So until now in production, the data loads happened using InfoPackages:
a. Infopackage1 from datasource to ODS and
b. Infopackage2 from ODS to the CUBE.
Now after we transported the migrated dataflow to production, to load the same infoproviders I use
1. Infopackage to load PSA.
2. DTP1 to load from PSA to DSO.
3. DTP2 to load from DSO to CUBE.
Steps 1 and 2 work fine, but when I run DTP2 it gets terminated. However, when I try step b (above), it loads the CUBE fine using the InfoPackage. So I am unable to understand why the DTP failed and why the InfoPackage load is successful. In order to use the DTP, do we need to do any cleanup when using it for the first time? Please let me know if you have any suggestions.
Please note that the DSO already has data loaded using an InfoPackage. (Is this causing the problem?)
Thanks,
Sirish
Hi Naveen,
Thanks for the reply. The creation of a DTP is not possible without a transformation, and the transformation has been moved to production successfully.
Hello friends
I am facing a problem with a data load. I modified a cube by adding a few characteristics. The characteristics were first added to the communication structure and the transfer rules. Then I reactivated the update routine. Finally, I deleted any previous data load request for the cube and did a full load. However, I wasn't able to find any data for the newly added fields in the cube.
Did I miss something. Any help will be appreciated in this regard.
Thanks
Rishi
How did an ODS come into the picture? This was not mentioned in your previous post. Are you loading from ODS to cube and having problems?
It looks like you are not using a DTP. In that case, check the change log for the newly added fields, use the data flow ODS > PSA > Cube, and check the PSA to see whether those fields are present. If yes, check the update rules in debugging mode to see whether those values get deleted.
Hope it Helps
Chetan
@CP.. -
Hi guys...
Suppose I have two DataSources that are mapped to an InfoSource, and this InfoSource is mapped to one DSO (all objects up to the DSO are emulated from 3.x to 7.x). When I load data, I assume that I have to use two InfoPackages, and I get data into the DSO in two requests. I have a few questions about this, assuming I have only these two requests in my DSO:
1. When I tried to create a query directly on the DSO in Query Designer, I could not find the infoobject 0REQUESTID. What can I do if I want to see data request by request rather than all together?
2. Suppose the DSO gets data like below:
Fields in DSO: X1, X2, Y1, Y2, Y3 [X1, X2 are characteristics and also keys; Y1, Y2, Y3 are key figures]
Data fed by DataSource 1: X1 X2 Y1
a b 10
Data fed by DataSource 2: X1 X2 Y2 Y3
a b 20 30
So when I load data, I will load it in two requests, and these are the only two requests in my DSO. How will the data look in the DSO? Does it get stored in two separate rows or a single row? How is it shown in a query result?
If the keys are not matched, how will the data be shown for key figures that are not loaded by that request?
3. I know that in the DSO we have two options, Overwrite/Addition. How will the data load behave in the following situation:
DataSource 1 feeds like this in Request 1:
X1 X2 Y1
a b 10
DataSource 2 feeds like this in Request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
How will the result be shown with our two options, Addition and Overwrite? Will Request 2 overwrite or add up the data in Y1?
Thanks.Hi guys...
Suppose I have two Datasources that are mapped to a infosource and this infosource is mapped to one dso(all objects until DSO are emulated from 3.x to 7.x)...when I load data,I assume that I have to use two infopackages and I get data into DSO in two requests.I have few questions about this,assuming I have only these two requests in my DSO:
1.When I tried to create a query directly on DSO in query designer... I couldnot find the infoobject 0REQUESTID in query designer...then how can I do if I want to see data request by request rather than all together?
Request-ID is only a part of the new data table - after activation of your data your request will get lost. If you want to see whats happening, load you data request by request and activate your data after each request
2.Suppose the DSO gets data like below:
Fields in DSO:X1,X2,Y1,Y2,Y3 X1,X2 are characteristics and also keys,Y1,Y2,Y3 are keyfigures
Data feeded by Datasource 1 : X1 X2 Y1
a b 10
Data feeded by Datasource 2 : X1 X2 Y2 Y3
a b 20 30
so when I load data,I will load data in two requests and these are the only two requests I have in my DSO....then how will data look in DSO.....does it gets stored in two seperate rows or single row?how is it shown in a query result?
If the keys are equal, you will have only one dataset in your DSO
If the keys are not matched,how will the data be shown for keyfigures that are not loaded by that request?
Then you will have two datasets in your DSO
3.I know that in DSO,We have two options:Overwrite/Addition....how will be the data loading be in following situation:
Datasource 1 feeds like this in Request 1:
X1 X2 Y1
a b 10
Datasource 2 feeds like this in Request 2:
X1 X2 Y1 Y2 Y3
a b 30 40 50
how will the result be shown in our two options Addition and Overwrite?will request 2 overwrite or add up data in Y1?
If you choose overwrite, you will get 30 - if you choose addition, you will get 40
Thanks.
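For question 3, the two update modes can be sketched for the repeated key (X1=a, X2=b); the None entries stand for key figures not delivered by the first request:

```python
existing = {"Y1": 10, "Y2": None, "Y3": None}   # after Request 1
incoming = {"Y1": 30, "Y2": 40, "Y3": 50}       # Request 2, same key

# Overwrite: the new request's values replace the stored ones.
overwrite = dict(incoming)

# Addition: key figures are summed (missing values count as 0).
addition = {k: (existing[k] or 0) + incoming[k] for k in incoming}

print(overwrite["Y1"], addition["Y1"])  # 30 40
```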