Data Load with warnings, but dataload.err empty
Hi guys,
loading data through the EAS console into an Essbase database gives the message below:
Database has been cleared before loading.
dataload.err is EMPTY.
Any ideas??
Some data is loaded, but how can I find out where it stopped, or whether all my data was loaded?
Message:
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
Transaction [ 0x190004( 0x4fd89600.0x64578 ) ] aborted due to status [1003050].
There were errors, look in H:\My Documents\temp\dataload.err
Database import completed ['AXN_Sim'.'BewMod']
Output columns prepared: [0]
Thanks Bernd
Hi,
There is not much to tell from the message alone whether the data loaded completely; you will have to verify that yourself.
Since the load was aborted with a warning, it could be a network issue.
Try loading a small file and check whether you still face the problem, or post your findings on that.
Thanks,
CM
Similar Messages
-
PL/SQL will compile with warnings but will not run
I have a PL/SQL package that compiles with warnings but no errors, yet when I try to run it I get an ORA-06508 error telling me there is an error somewhere in the package, and I can't find it.
I have two user schemas that use 2 slightly different versions of the package. I had to make slight code changes to one of the versions and when I made those changes it stopped running. The second version compiles and runs correctly.
After going through a line by line comparison the first copy was still not running. I copied the second version of the code into the broken schema and commented out the additional lines of code that are not needed in this version.
When I tried to compile and run this version it still fails with the same error.
I am using Oracle XE and the databases are small.
I can send on the code if necessary.
Can anyone point me in the right direction?
Thanks
Susan

I tried doing what you suggested, but there are no errors, as it compiles correctly.
It is only when I run the package through the debugger that I get the error. I have posted it below.
Connecting to the database Hess S3.
Executing PL/SQL: ALTER SESSION SET PLSQL_DEBUG=TRUE
Executing PL/SQL: CALL DBMS_DEBUG_JDWP.CONNECT_TCP( '127.0.0.1', '2086' )
Debugger accepted connection from database on port 2086.
Processing 59 classes that have already been prepared...
Finished processing prepared classes.
Exception breakpoint occurred at line 10 of BnmSkOTQ3jz5I52ZOxC4QNw.pls.
$Oracle.EXCEPTION_ORA_6508:
ORA-04063: package body "SHIPPING.LIFTINGSCHEDULE" has errors
ORA-06508: PL/SQL: could not find program unit being called: "SHIPPING.LIFTINGSCHEDULE"
ORA-06512: at line 10
Process exited.
Disconnecting from the database Hess S3.
Debugger disconnected from database.
Regards
Susan -
EDT Cat. 15 Data loaded to SAP but failed when updating
Hi,
I have an structure to load data into BP using KCLJ with Cat 15.
Here it is.
AKTYP
TYPE
PARTNER
ROLE1
KUNNR
KUNNR_EXT
BU_GROUP
FIBUKRS
CHIND_ADDR
NAME_CO
NAME_ORG1
NAME_ORG2
STREET
STR_SUPPL1
STR_SUPPL2
STR_SUPPL3
LOCATION
HOUSE_NUM1
POST_CODE1
CITY1
COUNTRY
REGION
LANGU
CHIND_TEL
TEL_NUMBER
TEL_EXTENS
We had the data loaded into SAP with roles 000000 and TR0100, but when we tried to update the address through role TR0100, we got a dump and the update failed. However, the update works when we use role 000000.
Anyone would have a clue of this?
Thanks

I am new to EDT too. I know Cat 15, which can load BP; SAP has more categories that can load more using KCLJ. Here's my experience.
Under Tcode SIMGH, find IMG structure External Data Transfer for SAP Banking. Under that structure, you can find Display Required and Optional Entry Fields for SEM Banking. When you enter 15 in the Category box, you'll see a whole list of fields about BP. You can use these fields to create your own structure.
After you have created your structure, use Define Sender Structure under External Data Transfer for SAP Banking to define it. After that, you are done; you can try using KCLJ to load your BP.
If you still run into issues, it is most likely the configuration.
Enjoy. -
Data loaded into ODS, but no activation possible
Hello Experts,
I've loaded data into an ODS, but activation is not possible!
The monitor just says something like "not activated yet".
Usually it only takes a minute to activate, so I'm confused.
any suggestions?
Thanx
Axel

Hi,
In SM37, give Job Name as BI_ODSA* and User as * and check whether there is any job with Active status. If so, we can say the ODS activation is still in process.
With rgds,
Anil Kumar Sharma .P -
Transction data loading with master data as key?
Expert's,
I am trying to load transaction data with master data as a key. ZEMPID is my master data, with attributes ZEMPNAM and ZEMPDES. My transaction data contains ZEMPID, ZEMPSAL and ZEMPAGE. I could create the master data and load it successfully. But when I create the InfoSource for the transaction data, I expect to get all the attributes I assigned to ZEMPID, and I am not getting them. Also, while creating the cube, if I select the InfoSource, I do not get my characteristics (master data + attributes) in the characteristics tab. Please, can anyone help me?

Hello Vasu
You are talking about transactional data and saying you have not created the cube/ODS yet.
Let me put the things again
1. When you load master data, you create an InfoSource (direct or flexible). All the attributes automatically come into the InfoSource. Once you define the application component on the master data tab in InfoObject maintenance, it is ready for data to be loaded. Search for the application component under InfoSources and you will see the flow.
2. When you load data into an InfoCube, only transactional data goes into it; master data stays in the master data tables. In the InfoSource you define which fields need to be present in the cube, so you can include only those objects that are present in the cube dimensions, plus attributes defined as navigational in the cube.
Master data attributes are never pushed automatically; only compounding objects are pushed automatically.
If you want to load attributes of ZEMPID, make them navigational in the cube or add them to one of the cube's dimensions.
Hope I could make myself clear.
Thanks
Tripple k -
Material data load with two unit (0MAT_UNIT) failed?
Hi,
I am loading the 0MAT_UNIT data from R/3 QAS to BW QA, and I am getting the error below.
0MAT_UNIT : Data record 34 ('000000000000010249 ') : Duplicate data record RSDMD 191
There are two records for the same material number with two different units (EA and PAK). The data loaded to BW dev successfully even with duplicate material records with two units, but it failed in BW QA.
Any setting I am missing?
Please help.
Thanks,
Steve

If you look at the definition of the P table (/BI0/PMAT_UNIT), you will find that both material and unit are keys. So the system recognizes a record as duplicate only when a given material with a given unit exists twice in the data. Based on what you are saying, this is not the case: you have two separate units for a given material. Can you please check whether there is another record for the same material in a different data packet?
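As a quick sanity check on the file itself, one can count duplicates on the full P-table key (material + unit) versus on material alone. A minimal sketch in Python; the column names are assumptions for illustration, not the actual transfer structure:

```python
from collections import Counter

def find_duplicates(records, key_fields):
    """Return the key tuples that occur more than once in `records`."""
    counts = Counter(tuple(r[f] for f in key_fields) for r in records)
    return [key for key, n in counts.items() if n > 1]

rows = [
    {"MATERIAL": "000000000000010249", "UNIT": "EA"},
    {"MATERIAL": "000000000000010249", "UNIT": "PAK"},
]

# No duplicate on the composite key (material, unit)...
print(find_duplicates(rows, ["MATERIAL", "UNIT"]))  # []
# ...but the material alone repeats, which is expected and allowed.
print(find_duplicates(rows, ["MATERIAL"]))
```

If the composite-key check comes back empty for the whole file, the RSDMD 191 error more likely stems from the same material/unit pair arriving split across two data packets, as suggested above.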
-
Data Load with HAL Problem with Decimal-delimiter/Language Setting
Hi,
we use HAL to load data into Essbase. The data uses '.' as the decimal point, as in
1252.25. When I upload this data with HAL, the number becomes 125225.00.
The locale of my OS is German.
The locale I specified for the file in HAL is English.
If I change the locale of my OS to English, the problem disappears, but that's annoying.
Has anybody else such a problem ?
Is There a Solution for this ?
Thanks.
Kevin

Reading over John's blog, we created a rule file.
But it looks like the load process using the rule file is hanging. Are we missing anything? We are using a comma separator and set the number of lines to skip to zero. -
Data loading with 1:N & M:N relationship
Hi,
Which data (master or transaction) can we load with a 1:N relationship and with an M:N relationship?
Can anyone give me an idea?
Thx

In case of master data, the characteristic InfoObjects with master data attributes take care of the 1:N relation. For example, for material master data, the material number is the key based on which the data is loaded for each new material.
In case of transaction data with a 1:N relation, you can use a DSO in which the primary keys become the key fields of the DSO, so that repeating combinations are overwritten.
You can use an InfoSet when an inner join (intersection) or left outer join between master data/DSO objects is required. The InfoCube can be used for an M:N relation. For a union operation between two InfoProviders, a MultiProvider may be used.
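The DSO overwrite behaviour for the 1:N relation can be illustrated with a small sketch (the field names are made up for the example): later records with the same key simply replace earlier ones during activation.

```python
def dso_activate(records, key_fields):
    """Simulate DSO overwrite: the last record per key wins."""
    active = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        active[key] = rec  # repeating key combinations are overwritten
    return list(active.values())

loads = [
    {"EMPID": "100", "SALARY": 1000},
    {"EMPID": "100", "SALARY": 1200},  # same key: overwrites the first
    {"EMPID": "200", "SALARY": 900},
]
print(dso_activate(loads, ["EMPID"]))  # two active records remain
```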
Hope this helps you out.
Regards,
Rinal -
Data loading with routine displays zero for key fig values
Hi all,
my source field is Amount and my target field is Research Amount.
If I restrict Amount (the source field) by cost element (coaeom7) and value type (010), it equals the target field Research Amount.
for this my code is
IF COMM_STRUCTURE-VTYPE = '010' AND
   COMM_STRUCTURE-COSTELMNT = 'COAEOM12'.
  RESULT = COMM_STRUCTURE-AMOUNT.
ENDIF.
but when I load the data, it displays only zeros.
Pleas suggest
Regards,
Raj.

Hi Raj,
Do you need cost element values other than 'COAEOM12' in the target?
Are you writing this routine in a start routine or an end routine? If it is BI 7
and you are loading data from a source cube/DSO to the target structure, you need to write a start routine in the transformation. Before that, map the Amount, Cost Element and Value Type fields from the source to Research Amount in the target.
Then you need to write code along these lines:
If source_package cost element = 'COAEOM12' and source_package value type = '010',
then research amount = amount.
End if.
I hope this will solve your problem. -
Error in Data loading with ODS
Hi Guru's
I'm loading data from a flat file into the ODS.
The key figure Net Mass Volume, with data type quantity, shows the value multiplied by a thousand, but in the PSA it shows the correct value from the source system.
For example, the value for this key figure is 48.984 in the flat file, meaning 48 point 984,
but it shows up in the ODS as 48,984 (comma), so somewhere it seems to be multiplied by 1000, as if applied at column level.
There are no transfer rules (it is a direct mapping) and no routines.
Please let me know where the problem is.
Points will be assigned.

Hi Sanjay,
It is not multiplied by 1000. It is the settings: the decimal separator used here is ',' instead of '.'.
Steps to change the settings:
Go to the menu System -> User Profile -> Own Data. In the user profile maintenance screen, go to the Defaults tab, where you will find a dropdown for decimal notation.
From there you can change the setting. You can also use transaction code SU01.
After changing the settings, log off and log in again for the changes to take effect.
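As a quick illustration of how the same string is read under the two decimal notations (a sketch, not SAP's actual conversion routine):

```python
def parse_number(text, decimal_sep):
    """Interpret `text` using the given decimal separator;
    the other symbol is treated as a thousands separator."""
    if decimal_sep == ",":
        text = text.replace(".", "").replace(",", ".")
    else:
        text = text.replace(",", "")
    return float(text)

# With '.' as decimal separator, the flat-file value is read as intended:
print(parse_number("48.984", "."))  # 48.984
# With ',' as decimal separator, the '.' becomes a thousands separator:
print(parse_number("48.984", ","))  # 48984.0
```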
Hope this helps you.
Regards,
Yokesh. -
Data-load with Negative values
Hi All,
5 people viewed my query, but nobody answered.
Is it too simple a question to post?
If not, please respond... or please ignore it.
Please suggest me on this.
In my app, the data file contains positive values.
Only for a few members do I have to load the data as negative values.
Example: the Count Type Code is D for Debit, C for Credit.
If D, load the Contract & Member counts as negative numbers.
If C, load them as positive.
I think I need to set this up in the rule file.
Please suggest how to write the load rule for this.
Thanks in advance.
Bunny
Edited by: Bunny on Feb 11, 2009 3:55 AM

Hi Bunny,
I don't know the setting off the top of my head at the rule-file end, but I have one solution (you might not like it).
1. Load the complete data with negative values, then use an IF...ELSE condition and include the @ABS function for the required members.
2. I mean, apply the condition and use the @ABS function, which will make negative numbers positive.
Check out other posts by other experts too.
If it's a development setup and you can change the data source, try this.
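If the data source can be pre-processed before the load, the D/C sign convention can also be applied outside the rule file. A minimal sketch under the assumptions in the question (D = debit loads negative, C = credit loads positive); the function name is made up:

```python
def signed_count(count, type_code):
    """Apply the sign convention: 'D' (debit) -> negative, 'C' (credit) -> positive."""
    magnitude = abs(count)
    return -magnitude if type_code == "D" else magnitude

records = [("D", 5), ("C", 3), ("D", 7)]
print([signed_count(c, t) for t, c in records])  # [-5, 3, -7]
```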
Sandeep Reddy Enti
HCC
http://hyperionconsultancy.com/ -
Hi:
I need to load data to a cube with ABAP code in the InfoPackage.... but I am a novice in ABAP..!
Need your help.
I need to load to the cube only the records that have NETDUEDATE greater than the system date
(SY-DATUM) and drop the records that do not meet the condition.
Not sure what the code/syntax for this condition would be!
Appreciate your help.
Thanks....Pbs

Hi PBS,
You can write ABAP logic for your requirement. In the data selection tab of the InfoPackage, choose 'ABAP routine' as the type for the selection field. You can implement the logic with an IF-ENDIF condition.
Refer to:
ABAP routine in Infopackage
ABAP Routine in InfoPackage.
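The selection logic itself is simple; sketched here in Python for illustration only (in the InfoPackage it would be an ABAP routine; the record layout is an assumption, with NETDUEDATE taken from the question):

```python
from datetime import date

def keep_future_due(records, today):
    """Keep only records whose NETDUEDATE is strictly after `today`."""
    return [r for r in records if r["NETDUEDATE"] > today]

rows = [
    {"DOC": "A", "NETDUEDATE": date(2030, 1, 1)},
    {"DOC": "B", "NETDUEDATE": date(2000, 1, 1)},
]
print(keep_future_due(rows, date(2020, 6, 15)))  # only record "A" survives
```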
Regards,
Satyam -
HFM DATA LOAD WITH ODI HANGS LONG TIME
Hi all,
There's a very strange problem when I load data from MS SQL Server to HFM with ODI. Specifically, there are two interfaces to two applications on the same HFM server, with data volumes of about 1,300,000 and 650,000 records respectively.
The strange thing is that when I execute each interface individually, it sometimes works well. However, when I execute the package containing the two interfaces, the larger one almost always hangs for 10+ hours, whether I use an agent or not.
After some research, it seems the session hangs because it cannot get return info from HFM even though the data load has already completed. I found some similar problems on OTN, like a 64-bit driver/JRE compatibility error or a deadlock on a table, but they differ from this one. So, can anyone help with this? Much appreciated in advance!!!
BTW, ODI and HFM are on the same server, but the ODI repository and the source of the interface are on another MS SQL data server. The versions are as below:
HFM 11.1.1.3.0.956
ODI 11.1.1.6.0
win server 2003 x86
MS SQLServer 2008 R2
win server 2008 x64
Regards,
Steve

Hi SH,
the source is MS SQL Server 2008 R2, the staging area is on the source side, and the target is HFM 11.1.1.3.0.956 based on SQL Server.
The KM is the standard 'IKM SQL to Hyperion Financial Management Data'.
There is no transformation logic, only a filter to select data in the current year.
Besides, I have done some performance tuning as the guide suggests:
REM #
REM # Java virtual machine
REM #
set ODI_JAVA_HOME=D:\oracle\Java\jdk1.6.0_21
REM #
REM # Other Parameters
REM #
set ODI_INIT_HEAP=512m
set ODI_MAX_HEAP=1024m
set ODI_JMX_PROTOCOL=rmi
In Regedit:
EnableServerLocking: 1
MaxDataCacheSizeinMB :1000
MaxNumDataRecordsInRAM: 2100000
MultiServerMaxSyncDelayForApplicationChanges:300
MultiServerMaxSyncDelayForDataChanges:300
After some research, I think the problem is located in the HFM-ODI adapter or on the HFM side (maybe HFM cannot send a completion message back to ODI). Do you have any idea? Thanks in advance -
Data loads at Repository, but error in dashboard, System DSN
Error in Dashboard UI:
ORA-01017: invalid username/password; logon denied at OCI call OCISessionBegin. [nQSError: 17014] Could not connect to Oracle database. (HY000)
While creating the System DSN with Oracle 11g, the error is as follows -
Unable to connect
SQLState=IM003
Specified driver could not be loaded due to system error 127
Any inputs would be appreciated.
Thanks
KSK

hi
1. tnsping the OBAW directly from a cmd prompt or putty
2. set the ODBC.INI tns file on the Linux box
3. set the tnsnames.ora file under the Oracle client on the Windows box
4. try to view the data directly from the repository
5. sometimes it is a network issue, so ask your DBA to bounce the database
6. take the help of your DBA to investigate more from the database side
hope it helps
points please
thanks
rakesh -
Time Data loading with missing format
Hi experts,
after we loaded the time data, we found the following wrongly formatted values:
0CALDAY:
20.080.112 (Error)
2008.01.12 (correct one)
0CALMONTH:
200.801(Error)
2008.01(correct one)
0CALQUARTER
20.081(Error)
20081(correct one)
Could anyone let us know how to correct the wrongly formatted data to the correct format, step by step? Many thanks!

What is the source for this data?
Have it corrected in the source
Or
Correct in PSA and load it from there.
Does it happen only for a few records ?
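If the source cannot be fixed, the values can be repaired in the PSA with a simple rule: strip the stray grouping dots, then re-insert the separators at the right positions. A sketch in Python, assuming the digits themselves are intact as in the examples above:

```python
def fix_calday(value):
    """'20.080.112' -> '2008.01.12' (YYYY.MM.DD)."""
    d = value.replace(".", "")
    return f"{d[0:4]}.{d[4:6]}.{d[6:8]}"

def fix_calmonth(value):
    """'200.801' -> '2008.01' (YYYY.MM)."""
    d = value.replace(".", "")
    return f"{d[0:4]}.{d[4:6]}"

def fix_calquarter(value):
    """'20.081' -> '20081' (YYYYQ)."""
    return value.replace(".", "")

print(fix_calday("20.080.112"))   # 2008.01.12
print(fix_calmonth("200.801"))    # 2008.01
print(fix_calquarter("20.081"))   # 20081
```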