Error when loading data from DSO to Cube
Hi,
After upgrading to 2004s we get the following error message when trying to load data from a DSO to a Cube:
Unexpected error: RSDRC_CONVERT_RANGES_TO_SELDR
Has anyone experienced a similar problem, or does anyone have guidelines on how to resolve it?
Kind regards,
Fredrik
Hi Martin,
Thanks!
Problem solved
//Fredrik
Similar Messages
-
Error while loading data from DSO to Cube
Dear Experts,
I have changed the update routines of the update rule for the DSO and of the update rule between the DSO and the cube. When I upload data into the DSO, it works fine. But when uploading from the DSO to the cube, it gives an error.
Error Message :
DataSource 8ZDSO_ABB does not have the same status as the source system in the Business Information Warehouse.
The time stamp in the source system is 30.09.2010 14:05:17.
The time stamp in the BW system is 09.05.2008 16:41:10.
I executed the program RS_TRANSTRU_ACTIVATE_ALL, but it still gives the same error.
Regards,
Anand Mehrotra.
Dear Experts,
Thanks for the help. Problem solved.
Regards,
Anand Mehrotra. -
Dump when loading Data from DSO to Cube
Hi,
I get a short dump DBIF_REPO_PART_NOT_FOUND when trying to load data from a DSO to a cube. From DSO to DSO it works fine.
Any ideas?
thx!
Dominik
The dump occurs in the last step of the last data package in the DTP monitor.
I checked the OSS note, but I don't know how to check the kernel version.
Now I used another DSO for testing and got another dump with a different name.
After that I logged in again and it works with my test DSO data...
But it is still not working with the original one...
@Kazmi: What details do you mean exactly? -
Error in loading data from DSO to Cube
Hi
Daily process chains are executed and the data is loaded from DSO to several cubes.
On 7.11.2008, there was an error in one of the cubes. The short dump showed the following message:
DBIF_REPO_PART_NOT_FOUND. I looked into note 854261 and had a chat with our Basis guy; he said our kernel and patch levels are higher than those mentioned in the note.
The strange thing is that from 8.11.2008 until today the loads have been running fine; the problem occurred only on that one date.
So now I have several requests in the cube with loads of data but "WITHOUT" reporting, because of the error on 7.11.2008.
Can anyone help in solving the issue? I want to re-execute the load and see whether the problem occurs again.
Regards
Annie
Hi,
Check the reconstruction tab. If the request is green, you could try deleting and reconstructing the incorrect request.
Regards. -
DTP load data from DSO to CUBE can't debug
Hi, all experts.
I ran into a problem yesterday.
When I load data from the DataSource to the DSO and debug the start routine, it works. But when I load data from the DSO to the cube and debug the routine, I get an error like this:
inconsistent input parameter (parameter: <unknown>, value <unknown>)
message no: RS_EXCEPTION 101.
I don't know why. Please help me, thank you very much.
Hi,
An alternative way to do this is to put a small piece of code with an infinite loop in your start routine. When you start the load, go to SM50 and debug the process that is executing the load; from there you can exit the infinite loop and carry on with the debugging. Please let me know if you require further information.
Thanks,
Arminder -
Unable to load data from DSO to Cube
Good morning all,
I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube has "Delta Update". First the DSO was loaded with 138,300 records successfully. Then I activated the DSO. Finally, when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this situation?
Thank you so much!
Hi BI User,
For a delta upload into a data target, there must be an initialization request in the data target with the same selection criteria.
So first do the initialization, and then perform the delta upload into the cube.
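The init/delta mechanics described above can be illustrated with a small toy model (not SAP code, just a sketch of why a delta request extracts 0 records when no initialization request exists):

```python
class DeltaSource:
    """Toy model of init/delta extraction against a change log.

    Illustrative only -- it mimics why a delta load picks up nothing
    when the target has no initialization request.
    """

    def __init__(self):
        self.changelog = []   # all changes ever posted to the source
        self.pointer = None   # None means: no init request exists yet

    def post(self, records):
        """New/changed records arriving in the source (e.g. DSO activation)."""
        self.changelog.extend(records)

    def init_load(self):
        """Initialization: full extraction that also sets the delta pointer."""
        self.pointer = len(self.changelog)
        return list(self.changelog)

    def delta_load(self):
        """Delta: only changes since the pointer; nothing without an init."""
        if self.pointer is None:
            return []
        new = self.changelog[self.pointer:]
        self.pointer = len(self.changelog)
        return new
```

Calling `delta_load()` before `init_load()` returns an empty list, mirroring the 0-record load described above; after an init, subsequent deltas pick up only the new changes.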
Regards,
Subhashini. -
ERROR : "Exceptions in Substep: Rules" while loading data from DSO to CUBE
Hi All,
I have a scenario like this.
I was loading data from the ODS to the cube daily. Everything was working fine until 2 days ago. I added some characteristic fields to my cube, while I didn't make any change to my key figures. When I restarted the reload of data from the ODS to the cubes, I got the error "Exceptions in Substep: Rules" after some packages of data had loaded successfully.
I deleted the added fields from the InfoCube and then tried to reload the data again, but I am facing the same error: "Exceptions in Substep: Rules".
Can any one tell me how to come out from this situation ?
Regards,
Komik Shah
Dear,
Check some related SDN posts:
zDSo Loading Problem
Re: Regarding Data load in cube
Re: Regarding Data load in cube
Re: Error : Exceptions in Substep: Rules
Regards,
Syed Hussain. -
Job cancelled While loading data from DSO to CUBE using DTP
Hi All,
While I am loading data from the DSO to the cube, the load job is getting cancelled.
In the job overview I got the following messages:
SYST: Date 00/01/0000 not expected.
Job cancelled after system exception ERROR_MESSAGE
What can be the reason for this error? I have successfully loaded data into 2 layers of DSOs before loading the data into the cube.
Hi,
Are you loading a flat file to the DSO?
Then check the data in the date field in the PSA and replace it with the correct date.
I think you are using a write-optimized DSO, which is similar to the PSA: it takes all the data from the PSA into the DSO.
So clear the data in the PSA, then load to the DSO and then to the cube.
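As an illustration of the kind of date check involved (a hedged sketch with hypothetical field and fallback names, not SAP code), one could scrub invalid dates such as 00/01/0000 from extracted rows before loading:

```python
from datetime import datetime

def valid_calday(value):
    """True if value parses as a YYYYMMDD date (rejects 00000000 and the like)."""
    try:
        datetime.strptime(value, "%Y%m%d")
        return True
    except (ValueError, TypeError):
        return False

def scrub_dates(rows, field, fallback):
    """Replace invalid dates in extracted rows with a fallback value."""
    for row in rows:
        if not valid_calday(row.get(field, "")):
            row[field] = fallback
    return rows
```

For example, a row carrying "00000000" in the (hypothetical) CALDAY field would be rewritten with the fallback date before the load is triggered.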
If you don't need a date field in the cube, remove its mapping in the cube transformation, activate the transformation, activate the DTP, and trigger it; it will work. -
Error while loading data from ODS to CUBE.
Hi friends,
When I am loading data from the ODS to the cube with the help of a data mart, I am getting an error in the QA system; in the DM system everything went well. If I look at the detail tab in the monitor under Processing, it shows the following:
Transfer Rules: Missing Message.
Update PSA: Missing Message.
Processing End: Missing Message.
I have checked the coding in the update rules; everything is OK.
Any inputs, please?
hari
Message was edited by:
hari reddy
This might mean that the IDoc flow is not defined properly in your QA system for the myself source system.
Regards,
Vitaliy -
Error while loading data from DS into cube
Hello All
I am getting the below error while loading data from Data source to the cube.
Record 1: Time conversion from 0CALDAY to 0FISCPER (fiscal year) failed with value 20070331
What am I supposed to do?
I need your input in this regard.
Regards
Rohit
Hi
Simply map 0CALDAY to 0FISCPER (you have already done that). You might have forgotten to map the fiscal year variant in the update rules. If it comes from the source, just map it; if it doesn't come from the source, make it a constant in the update rules and give it a value.
If your business year runs April to March, make it 'V3'.
If your business year runs January to December, make it 'K4'.
Activate your update rules, delete the old data, and upload again.
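The V3/K4 variant logic above can be sketched as follows (a simplified illustration; a real conversion reads the fiscal year variant configuration from the system rather than hard-coding it):

```python
def calday_to_fiscper(calday, variant):
    """Convert a 0CALDAY value (YYYYMMDD) to a 0FISCPER value (YYYYPPP)."""
    year, month = int(calday[:4]), int(calday[4:6])
    if variant == "K4":                      # calendar year, Jan-Dec
        fiscal_year, period = year, month
    elif variant == "V3":                    # fiscal year Apr-Mar
        if month >= 4:
            fiscal_year, period = year, month - 3
        else:                                # Jan-Mar belong to the prior fiscal year
            fiscal_year, period = year - 1, month + 9
    else:
        raise ValueError("unsupported fiscal year variant: %s" % variant)
    return "%04d%03d" % (fiscal_year, period)
```

For the value 20070331 from the error above, V3 yields period 012 of fiscal year 2006, while K4 yields period 003 of 2007; without a variant, the conversion has no way to decide and fails.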
Hope it helps
Thanks
Teja -
Error while loading data from DSO to Infocube
Hi all,
I'm loading data from flat file to the cube via a DSO. I get an error in the DTP.
The data loads to the DSO, i'm able to see the contents in the active data table.
Transformation from DSO to Cube is activated without any errors. Its a one to one mapping. No routines used.
When I execute the DTP (I use the full load option, as it is a one-time load), I get the following errors in the DTP, all of which are red:
Process Request
Data Package 1: Error during processing
Extract from DSO XXXXX
Processing Terminated
The long text of the error gives the message: Data package 1: Processed with errors RSBK257
Thanks
Hi,
You can check the below forums.
DTP with error RSBK257
- Jaimin -
While loading data from DSO to CUBE (which extraction option?)
hi friends,
My question is: when we load data from one data target to another, we have four options in the DTP screen for extracting the data: 1. active data table (with archive), 2. active data table (without archive), 3. change log table, and a fourth I don't remember. So when we update data from a DSO to a cube, which option should we choose, the 2nd or the 3rd?
<removed_by_moderator>
Edited by: Julius Bussche on Jan 4, 2012 9:26 AM
- If you want to do a full load, always select the 2nd option, i.e. Active Data Table (without archive).
- If you want to do a delta load, you can select either 2 or 3:
(i) Active Data Table (without archive) - the first load will be full from the active table, and after that it will be delta from the change log.
(ii) Change log table - you will have to do a full load using a full DTP, and then delta can be done using this DTP (but you will need to do an init as well using this DTP).
- If you have loaded using a full DTP and now just want to run delta, select the 3rd option and do an init before running the delta.
I hope it helps.
Regards,
Gaurav -
Semantic Partitioning delta issue while loading data from DSO to Cube - BW 7.3
Hi All,
We have created the semantic partitioning with the help of a BAdI. Everything looks good.
The first time, I loaded the data with a full update.
The second time, I initialized the delta and pulled the delta from the DSO to the cube. The DSO is standard, whereas the cube is partitioned via semantic partitioning. What I can see is that only the records updated in the latest delta are shown in the report; all the rest are ignored by the system.
I tried compression, but it still did not work.
Has anyone faced this kind of issue?
Thanks
Yaseen & Soujanya,
It is very hard to guess the problem with the amount of information you have provided.
- What do you mean by "the cube is not being loaded"? Are no records extracted from the DSO? Is the load completing with 0 records?
- How is data loaded from the DSO to the InfoCube? Full? Are you first loading to the PSA or not?
- Is there already data in the InfoCube?
- Is there change log data for the DSO, or did someone delete all the PSA data?
Since there are so many possible reasons for the behavior you are witnessing, your best option is to approach your resident expert.
Good luck.
Sudhi Karkada
-
ODI Error when Loading data from a .csv file to Planning
Hello,
I am trying to load data from a CSV file to Planning using ODI 10.1.3.6, and I am facing this particular error. I am using the Sunopsis Memory Engine as the staging area.
7000 : null : java.sql.SQLException: Invalid COL ALIAS "DEFAULT C12_ALIAS__DEFAULT" for column "ALIAS:"
java.sql.SQLException: Invalid COL ALIAS "DEFAULT C12_ALIAS__DEFAULT" for column "ALIAS:"
at com.sunopsis.jdbc.driver.file.bb.b(bb.java)
at com.sunopsis.jdbc.driver.file.bb.a(bb.java)
at com.sunopsis.jdbc.driver.file.w.b(w.java)
at com.sunopsis.jdbc.driver.file.w.executeQuery(w.java)
at com.sunopsis.sql.SnpsQuery.executeQuery(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.k(e.java)
at com.sunopsis.dwg.cmd.g.A(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
Code from Operator:
select Account C1_ACCOUNT,
Parent C2_PARENT,
Alias: Default C12_ALIAS__DEFAULT,
Data Storage C3_DATA_STORAGE,
Two Pass Calculation C9_TWO_PASS_CALCULATION,
Account Type C6_ACCOUNT_TYPE,
Time Balance C14_TIME_BALANCE,
Data Type C5_DATA_TYPE,
Variance Reporting C10_VARIANCE_REPORTING,
Source Plan Type C13_SOURCE_PLAN_TYPE,
Plan Type (FinStmt) C7_PLAN_TYPE__FINSTMT_,
Aggregation (FinStmt) C8_AGGREGATION__FINSTMT_,
Plan Type (WFP) C15_PLAN_TYPE__WFP_,
Aggregation (WFP) C4_AGGREGATION__WFP_,
Formula C11_FORMULA
from TABLE
/*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=Account.csvSNP$CRLOAD_FILE=Y:/1 Metadata/Account//Account.csvSNP$CRFILE_FORMAT=DSNP$CRFILE_SEP_FIELD=2CSNP$CRFILE_SEP_LINE=0D0ASNP$CRFILE_FIRST_ROW=1SNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=AccountSNP$CRTYPE_NAME=STRINGSNP$CRORDER=1SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=ParentSNP$CRTYPE_NAME=STRINGSNP$CRORDER=2SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Alias: DefaultSNP$CRTYPE_NAME=STRINGSNP$CRORDER=3SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Data StorageSNP$CRTYPE_NAME=STRINGSNP$CRORDER=4SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Two Pass CalculationSNP$CRTYPE_NAME=STRINGSNP$CRORDER=5SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Account TypeSNP$CRTYPE_NAME=STRINGSNP$CRORDER=6SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Time BalanceSNP$CRTYPE_NAME=STRINGSNP$CRORDER=7SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Data TypeSNP$CRTYPE_NAME=STRINGSNP$CRORDER=8SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Variance ReportingSNP$CRTYPE_NAME=STRINGSNP$CRORDER=9SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Source Plan TypeSNP$CRTYPE_NAME=STRINGSNP$CRORDER=10SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Plan Type (FinStmt)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=11SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Aggregation (FinStmt)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=12SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Plan Type (WFP)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=13SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Aggregation 
(WFP)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=14SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=FormulaSNP$CRTYPE_NAME=STRINGSNP$CRORDER=15SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CR$$SNPS_END_KEY*/
insert into "C$_0Account"
C1_ACCOUNT,
C2_PARENT,
C12_ALIAS__DEFAULT,
C3_DATA_STORAGE,
C9_TWO_PASS_CALCULATION,
C6_ACCOUNT_TYPE,
C14_TIME_BALANCE,
C5_DATA_TYPE,
C10_VARIANCE_REPORTING,
C13_SOURCE_PLAN_TYPE,
C7_PLAN_TYPE__FINSTMT_,
C8_AGGREGATION__FINSTMT_,
C15_PLAN_TYPE__WFP_,
C4_AGGREGATION__WFP_,
C11_FORMULA
values
:C1_ACCOUNT,
:C2_PARENT,
:C12_ALIAS__DEFAULT,
:C3_DATA_STORAGE,
:C9_TWO_PASS_CALCULATION,
:C6_ACCOUNT_TYPE,
:C14_TIME_BALANCE,
:C5_DATA_TYPE,
:C10_VARIANCE_REPORTING,
:C13_SOURCE_PLAN_TYPE,
:C7_PLAN_TYPE__FINSTMT_,
:C8_AGGREGATION__FINSTMT_,
:C15_PLAN_TYPE__WFP_,
:C4_AGGREGATION__WFP_,
:C11_FORMULA
Thanks in advance!
Right-clicking "data" on the model tab, can you see the data?
In your code this is written:
P$CRLOAD_FILE=Y:/1 Metadata/Account//Account.csv
Is the double slash before the file name correct? -
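Besides the path, the error message points at the header "Alias: Default", whose colon survives into the generated column alias. A hedged sketch (not ODI's actual alias algorithm, just the general idea) of sanitizing CSV headers into safe SQL aliases before handing the file to a driver:

```python
import re

def sanitize_header(name):
    """Reduce a CSV header such as 'Alias: Default' to a safe SQL alias.

    Collapses every run of non-alphanumeric characters to one underscore,
    trims leading/trailing underscores, and upper-cases the result.
    """
    safe = re.sub(r"[^0-9A-Za-z]+", "_", name).strip("_")
    return safe.upper()
```

Renaming the column in the source file (or in the datastore definition) along these lines avoids feeding the colon into the generated SELECT.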
Dtp error when loading data from one cube to another cube?
Hi, experts,
It seems strange that when I tick some characteristics as navigational attributes in the cube, an error occurs during the execution of the DTP: error while updating to target Z* (cube name).
Once I turn the flag off, no error appears. Could anyone give me a clue?
Thanks in advance!
Hi,
When you make changes in the cube, you need to make the corresponding changes in the transformation and activate the DTP. The navigational attributes you checked in the cube will appear in your transformation as new fields, for which you need to create a mapping; then activate the transformation, activate your DTP, and load the data. This should resolve your issue.
Regards