Unable to load data into Essbase cube using Essbase Studio
Hi
We are creating an Essbase cube in Essbase Studio, using flat files as data sources.
We have placed each hierarchy in a separate flat file and created one fact file containing the dimension intersections along with the data.
We are able to create the cube and the hierarchy, but we are not able to load any data.
We are getting the following error:
Failed to deploy Essbase cube.
Caused by: Unable to perform dataload from more than one flat file.
Could anyone please help on this?
Oh this was killing me, so I did this test in 11.1.1.3:
1) Excel 2007 format -- no go, Essbase didn't see it
2) Excel 2003 format, three sheets -- only the first sheet was read into an empty rule
3) Excel 2003 format, one sheet -- the first sheet was read into an empty rule
4) Excel 95 format, one sheet -- the first sheet was read into an empty rule
The lesson?
1) Excel 2007/2010 sheets don't work (no surprise there as the .xlsx format isn't supported).
2) Excel 2003 and lower (hey, if you have Excel 4, I'll bet that works as well) work, but only the first sheet is recognized.
Regards,
Cameron Lackpour
Similar Messages
-
Unable to load data into Planning cube
Hi,
I am trying to load data into a planning cube using a DTP.
But the system throws a message saying that real-time data loading cannot be performed on the planning cube.
What could be the possible reason for this?
Thanks & Regards,
Surjit P
Hi Surjit,
To load data into the cube using a DTP, the cube must be set to loading mode.
The real-time cube behaviour is set to planning mode only when data is input to the cube via the planning layouts or through the file upload program.
You can change the behaviour of the cube in RSA1 -> right-click on the cube -> Change Real-Time Load Behaviour, and select the first option (real-time cube can be loaded with data; planning not allowed).
Best Rgds
Shyam
Edited by: Syam K on Mar 14, 2008 7:57 AM
Edited by: Syam K on Mar 14, 2008 7:59 AM -
Getting an error while loading data into an ASO cube from a flat file.
Hi All,
I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
Does anyone have a solution?
Regards,
VM
Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you using a load rule? Also, which versions of Essbase and ODI are you using?
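For reference, error 1270040 usually means a load buffer was referenced before being initialized. A hedged MaxL sketch of the usual ASO buffer sequence (the application, database, rules file, and file paths below are placeholders, not details from the original post):

```
/* initialize the buffer, load into it, then commit it to the cube */
alter database ASOApp.ASODb initialize load_buffer with buffer_id 1;
import database ASOApp.ASODb data
  from data_file '/data/load.txt'
  using server rules_file 'ldall'
  to load_buffer with buffer_id 1
  on error write to '/data/errors.txt';
import database ASOApp.ASODb data from load_buffer with buffer_id 1;
```

If the load is driven from ODI, the knowledge module normally manages the buffer itself, which is why the version questions matter.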
Cheers
John
http://john-goodwin.blogspot.com/ -
How to load data into html:select using Struts ?
How do I load data into <html:select> using Struts?
I cannot load an array or collection (static or dynamic data) into a drop-down list control with Struts <html:select/>.
please use:
<html:select>
<html:options/>
</html:select>
Please help me, and please give details. Thanks a lot.
Message was edited by:
tranminhman
In order to load a collection or array of data you can use <html:select> with <html:options collection="" name=""/>.
Here the collection attribute refers to the ArrayList or array of data, and name is the name of the form bean.
Hope this helps...
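A minimal sketch of that pattern (the bean and property names here are illustrative assumptions, not from the original post):

```
<%-- assumes "itemList" is a request-scope collection of LabelValueBean
     objects and the form bean exposes a "selectedItem" property --%>
<html:select property="selectedItem">
  <html:options collection="itemList" property="value" labelProperty="label"/>
</html:select>
```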
Chaitanya V -
Unable to load data into any application.database on server.
I have a rather odd problem that's been vexing my for a few days.
I am unable to do a data import into any cubes within Essbase; it's as if the cube is in read-only mode, though everything seems OK. I'm not running any sort of archiving, and to check manually, within essmsh I did an "alter database end archive". I've tried with our existing and unchanged data load script as well as from within EAS (right-click Data Load). If I check the processes running on the server (a Linux server), the ESSSVR process is the top process, eating 100% CPU. I can do outline builds OK.
From within the application log file, the last few entries are:
[Wed Apr 3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021044)
Starting to execute query
[Wed Apr 3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021045)
Finished executing query, and started to fetch records
[Wed Apr 3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021000)
Connection With SQL Database Server is Established
[Wed Apr 3 09:44:58 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1003040)
Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
[Wed Apr 3 09:45:19 2013]Local/OP_ACC/Accounts/svc_biserver/Info(1021047)
Finished fetching data
Nothing has changed on this server for a few weeks so I'm somewhat flummoxed and no other errors show in any other logs ( nohup.out ).
Essbase 11.1.1.2
Red Hat Linux x64 2.6
Yes, I tried rebooting the whole box; the server has 9 GB of free space, and it's the same story when I load just one row via a text file in EAS: it seems to hang, and the ESSSVR process for that cube goes to 100%.
I've tried a few different cubes, and they all have the same problem, so I suspect it's something to do with Essbase itself rather than the specific cube I'm having problems with.
This is our test server I'm experiencing the problem on, and I tried migrating one app from UAT back to test, which still didn't work.
Stumped! -
How to load data into user tables using DIAPIs?
Hi,
I have created an user table using UserTablesMD object.
But I don't know how to load data into this user table. I guess I have to use the UserTable object for that, but I still don't know how to put data in a particular column.
Can somebody please help me with this?
I would appreciate if somebody can share their code in this regard.
Thank you,
Sudha
You can try this code:
Dim lRetCode As Long
Dim userTable As SAPbobsCOM.UserTable
userTable = pCompany.UserTables.Item("My_Table")
'First row in the @My_Table table
userTable.Code = "A1"
userTable.Name = "A.1"
userTable.UserFields.Fields.Item("U_1stF").Value = "First row value"
lRetCode = userTable.Add() '0 means success
'Second row in the @My_Table table
userTable.Code = "A2"
userTable.Name = "A.2"
userTable.UserFields.Fields.Item("U_1stF").Value = "Second row value"
lRetCode = userTable.Add() 'check lRetCode and pCompany.GetLastErrorDescription() on failure
This way I have added 2 lines in my table.
Hope it helps
Trinidad. -
Loading data into HANA DB (using hdbsql client) with Control file.
Hi,
I am working on a Project where the requirement is to load data from a csv file into HANA Database.
I am using HDBSQL, command line client on a windows machine to upload the data into HANA DB on a linux server.
I am able to successfully use the HDBSQL to export the file.
I have the following questions w.r.t to Bulk uploading data from CSV:
Where should the CSV file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where should the control file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where will the error file reside in case of errors?
I am new to this and any help is much appreciated.
Thanks,
Shreesha
Hi Shreesha,
Where should the CSV file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where should the control file reside? Can this be in the windows machine or is it mandatory to have in the dropbox location.
Where will the error file reside in case of errors?
We need to create the DATA, CONTROL and ERROR folders on the Linux server on which HANA is installed (or on a server accessible to the SAP HANA server).
Hence we need to SFTP the file to the HANA server to load it into the table.
Also have a look at a similar requirement I worked on:
SAP HANA: Replicating Data into SAP HANA using HDBSQL
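For reference, the control-file route looks roughly like this; a sketch based on the standard HANA IMPORT control-file format, where the schema, table, and paths are placeholders:

```
IMPORT DATA
INTO TABLE "MYSCHEMA"."MYTABLE"
FROM '/dropbox/data/mydata.csv'
RECORD DELIMITED BY '\n'
FIELD DELIMITED BY ','
OPTIONALLY ENCLOSED BY '"'
ERROR LOG '/dropbox/data/mydata.err'
```

The CSV, the control file, and the error log all live on the HANA host (or on a path the HANA server can reach), which is why the SFTP step is needed.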
Regards,
Krishna Tangudu -
We want to set up a delta refresh from R/3 data that will pull data into two cubes. In one cube I want all the records to be loaded; in the second cube I want to filter the records that are loaded. I can figure out how to load the data into two cubes, but can I place a filter on one of the loads (i.e., load only records where type = 'X')?
Thanks,
Chris
You can do that in the update rules to the second cube, in the start routine:
DELETE DATA_PACKAGE
WHERE type EQ 'X'.
(Please verify the exact syntax; DATA_PACKAGE is an internal table, so DELETE takes no FROM clause.)
Then, with only one InfoPackage, you can load both cubes with different conditions.
Hope it helps.
Regards,
Luis -
Unable to load dimension into Hyperion Planning using ODI?
Hi All
We are trying to load a dimension into Hyperion Planning version 11.1.2 using ODI. We have created the interface and mapped the source CSV file to the target Planning application using the ODI KMs:
LKM file to SQL
IKM SQL to Planning
We get success in ODI's Operator, but the dimension is not updated, nor is new data inserted.
We get this message in the error file:
+
Retail Format,Parent,Alias: Default,Data Storage,Two Pass Calculation,Smart List,Data Type,Plan Type (APlan),Error_Reason
ABC,Total Format,,StoreData,,,,Aplan,Cannot load dimension member, error message is: java.lang.RuntimeException: Fetch of saved member "ABC" failed.
+
We get this message in the log files:
+
2010-07-22 07:04:06,550 INFO [DwgCmdExecutionThread]: Oracle Data Integrator Adapter for Hyperion Planning - Release 9.3.1.1
2010-07-22 07:04:06,550 INFO [DwgCmdExecutionThread]: Connecting to planning application [******] on [********]:[11333] using username [admin].
2010-07-22 07:04:06,597 INFO [DwgCmdExecutionThread]: Successfully connected to the planning application.
2010-07-22 07:04:06,597 INFO [DwgCmdExecutionThread]: The load options for the planning load are
Dimension Name: Retail Format Sort Parent Child : false
Load Order By Input : false
Refresh Database : true
2010-07-22 07:04:06,612 INFO [DwgCmdExecutionThread]: Begining the load process.
2010-07-22 07:04:06,612 DEBUG [DwgCmdExecutionThread]: Number of columns in the source result set does not match the number of planning target columns.
2010-07-22 07:04:06,659 INFO [DwgCmdExecutionThread]: Load type is [Load dimension member].
2010-07-22 07:04:06,675 ERROR [DwgCmdExecutionThread]: Record [[ABC, Total Format, null, null, StoreData, null, null, null, null, null, null, null, null, null, Aplan, null, null, null]] was rejected by the Planning Server.
2010-07-22 07:04:06,675 INFO [DwgCmdExecutionThread]: Planing cube refresh operation initiated.
2010-07-22 07:04:08,425 INFO [DwgCmdExecutionThread]: Planning cube refresh operation completed successfully.
2010-07-22 07:04:08,425 INFO [DwgCmdExecutionThread]: Load process completed.
+
Please help out...
You have posted the log file; can you check the error file as well? It should give more information as to why the records were rejected.
Cheers
John
http://john-goodwin.blogspot.com/ -
Unable to load data into table using SQL*Loader
Hi,
Oracle Version :10.2.0.1
Operating system:windows Xp
I was unable to load the data into a table from a CSV file. Can anyone please help me?
Here is the output of my log file:
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jun 3 12:43:22 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: ach_staging.ctl
Data File: E:\SQL LOADER\ACH_STAGING.csv
Bad File: E:\SQL LOADER\load_bad.bad
Discard File: E:\SQl LOADER\emp.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table ACH_STAGING, loaded from every logical record.
Insert option in effect for this table: INSERT
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
ACH_CODE FIRST * , O(") CHARACTER
LOAN_CODE NEXT * , O(") CHARACTER
LOAN_TYPE NEXT * , O(") CHARACTER
TRAN_ID NEXT * , O(") CHARACTER
BO_CODE NEXT * , O(") CHARACTER
BO_NAME NEXT * , O(") CHARACTER
ST_CODE NEXT * , O(") CHARACTER
ACH_TYPE NEXT * , O(") CHARACTER
ACH_EFFECTIVE_DATE NEXT * , O(") CHARACTER
AMT NEXT * , O(") CHARACTER
CHECK_ACCNT_NO NEXT * , O(") CHARACTER
ABA_CODE NEXT * , O(") CHARACTER
ACH_STATUS NEXT * , O(") CHARACTER
TRAN_STATUS NEXT * , O(") CHARACTER
IS_HOLD NEXT * , O(") CHARACTER
IS_CANCELLED NEXT * , O(") CHARACTER
COMMENTS NEXT * , O(") CHARACTER
UPDATED_BY NEXT * , O(") CHARACTER
DATE_UPDATED NEXT * , O(") CHARACTER
CREATED_BY NEXT * , O(") CHARACTER
DATE_CREATED NEXT * , O(") CHARACTER
LOAN_TRAN_CODE NEXT * , O(") CHARACTER
ACH_AUTH NEXT * , O(") CHARACTER
REBATE_AMT NEXT * , O(") CHARACTER
PROMOTION_AMT NEXT * , O(") CHARACTER
TDC_ACH_NO NEXT * , O(") CHARACTER
INST_NUM NEXT * , O(") CHARACTER
DISABLE_ACH NEXT * , O(") CHARACTER
STMT_NO NEXT * , O(") CHARACTER
NEW_LOAN_TRAN_CODE NEXT * , O(") CHARACTER
REVOKED_BY NEXT * , O(") CHARACTER
REVOKED_DATE NEXT * , O(") CHARACTER
CHECK_STATUS NEXT * , O(") CHARACTER
value used for ROWS parameter changed from 64 to 30
Record 1: Rejected - Error on table ACH_STAGING, column ACH_EFFECTIVE_DATE.
ORA-01830: date format picture ends before converting entire input string
[... the same rejection, ORA-01830 on column ACH_EFFECTIVE_DATE, is repeated for records 2 through 51 ...]
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table ACH_STAGING:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255420 bytes(30 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 60
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Thu Jun 03 12:43:22 2010
Run ended on Thu Jun 03 12:43:23 2010
Elapsed time was: 00:00:00.17
CPU time was: 00:00:00.10
and the data from the CSV file is:
1767641 7537506 ILP ADV 506703 MICHELLE WHITE -40 CRE 07-NOV-08 01.36.04.000000000 PM 650 INP PRO N N 54564 06-NOV-08 06.06.28.000000000 PM 54562 06-NOV-08 01.36.04.000000000 PM 2060997 PPD 0 0 0 N 1 ACH
1767642 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 01-DEC-08 12.00.00.000000000 AM 76.5 INP PRO N N Updated During EOD PAY : ACH 1 28-NOV-08 09.00.17.000000000 PM 54562 06-NOV-08 01.36.04.000000000 PM 2061201 PPD 0 0 1 N 1 ACH
1767643 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 16-DEC-08 12.00.00.000000000 AM 76.5 INP PRO N N Updated During EOD PAY : ACH 1 15-DEC-08 09.00.16.000000000 PM 54562 06-NOV-08 01.36.04.000000000 PM 2061614 PPD 0 0 2 N 1 ACH
1767644 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 02-JAN-09 12.00.00.000000000 AM 76.5 INP PRO N N Updated During EOD PAY : ACH 1 31-DEC-08 09.00.55.000000000 PM 54562 06-NOV-08 01.36.04.000000000 PM 2063375 PPD 0 0 3 N 1 ACH
1767645 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 16-JAN-09 12.00.00.000000000 AM 76.5 INP PRO N N Updated During EOD PAY : ACH 1 15-JAN-09 09.01.10.000000000 PM 54562 06-NOV-08 01.36.04.000000000 PM 2064023 PPD 0 0 4 N 1 ACH
1767646 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 02-FEB-09 12.00.00.000000000 AM 76.5 INP PRO N N Updated During EOD PAY : ACH 1 30-JAN-09 09.00.22.000000000 PM 54562 06-NOV-08 01.36.04.000000000 PM 2064639 PPD 0 0 5 N 1 ACH
1767647 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 17-FEB-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 6 N 1 ACH
1767648 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 02-MAR-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 7 N 1 ACH
1767649 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 16-MAR-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 8 N 1 ACH
1767650 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 01-APR-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 9 N 1 ACH
1767651 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 16-APR-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 10 N 1 ACH
1767652 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 01-MAY-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 11 N 1 ACH
1767653 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 18-MAY-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 12 N 1 ACH
1767654 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 01-JUN-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 13 N 1 ACH
1767655 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 16-JUN-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 14 N 1 ACH
1767656 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 01-JUL-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 15 N 1 ACH
1767657 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 16-JUL-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 16 N 1 ACH
1767658 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 03-AUG-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 17 N 1 ACH
1767659 7537506 ILP PAY 506703 MICHELLE WHITE -40 DEB 17-AUG-09 12.00.00.000000000 AM 76.5 NOP NOP N Y Cancelled during Payment By -> BUY : ACH 54605 13-FEB-09 09.03.23.000000000 AM 54562 06-NOV-08 01.36.04.000000000 PM 1778544 PPD 0 0 18 N 1 ACH
Thanks & Regards,
Poorna Prasad.
Hi,
At last I was able to insert the data into the table, but now I am facing another problem: the CSV file has some null values in the data, and because of that only a few records were inserted and the remaining data was not.
Here is the syntax I am using to insert even when null values are present:
ACH_EFFECTIVE_DATE "to_timestamp(:ACH_EFFECTIVE_DATE,'DD-MON-RR HH.MI.SSXFF AM')" NULLIF ACH_EFFECTIVE_DATE=BLANKS
and the error I am getting is:
E:\SQL LOADER>sqlldr userid=rr/rr control=ach_staging.ctl log=ss1.log
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Jun 4 12:24:32 2010
Copyright (c) 1982, 2005, Oracle. All rights reserved.
SQL*Loader-350: Syntax error at line 16.
Expecting "," or ")", found keyword nullif.
EFFECTIVE_DATE,'DD-MON-RR HH.MI.SSXFF AM')" NULLIF ACH_EFFECTIVE_DATE=
^
Can anyone please help me with the correct syntax I need to use here?
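For reference, SQL*Loader expects the NULLIF clause before the SQL string expression, not after it. A sketch of the corrected field definition (column name taken from the post above; verify against your full control file):

```
ACH_EFFECTIVE_DATE NULLIF ACH_EFFECTIVE_DATE=BLANKS
  "to_timestamp(:ACH_EFFECTIVE_DATE,'DD-MON-RR HH.MI.SSXFF AM')"
```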
Thanks & Regards,
Poorna Prasad. -
Error while loading data into the cube
Hi,
I loaded data into the PSA, and when I load the data into the cube through a data transfer process (DTP), I get an error (red status).
Through "Manage", I can see the request in red. How do I find out the exact error, and what could be the possible reason for it?
Also, can someone explain the data transfer process (not in a process chain)?
Regards,
Sam
Hi Sam,
After you load the data through the DTP (after clicking the Execute button), go to the monitor screen and press the Refresh button; you can find the logs there.
Otherwise, on the request screen, next to the request number, you can see the log icon and click on it.
What a DTP is:
A DTP is used for the data transfer process from the PSA to a data target.
Check this link for DTP:
http://help.sap.com/saphelp_nw04s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
It is used to load data into data targets or InfoProviders such as DSOs and cubes.
In your case, the problem may be with the date formats or special characters in the data; check those.
REGARDS
@JAY -
Loading data into Fact/Cube with surrogate keys from SCD2
We have created 2 dimensions, CUSTOMER & PRODUCT with surrogate keys to reflect SCD Type 2.
We now have the transactional data that we need to load.
The data has a customer id that relates to the natural key of the customer dimension and a product id that relates to the natural key of the product dimension.
Can anyone point us in the direction of some documentation that explains the steps necessary to populate our fact table with the appropriate surrogate keys?
We assume that we need a lookup between the current version of the customer and the incoming transaction data, but we are not sure how to go about this.
Thanks in advance for your help.
Laura
Hi Laura,
There is another way of handling SCD and changing facts: use a different table for the history. Let me explain.
The standard approach has these three steps:
1. Determine if a change has occurred
2. End Date the existing record
3. Insert a new record into the same table with a new Start Date and dummy End Date, using a new surrogate key
The modified approach also has three steps:
1. Determine if a change has occurred
2. Copy the existing record to a history table, setting the appropriate End Date en route
3. Update the existing record with the changed information giving the record a new Start Date, but retaining the original surrogate key
What else do you need to do?
The current table can use the surrogate key as the primary key, with the natural key as a unique key. The history table has the surrogate key and the end date in the primary key, with a unique key on the natural key and the end date.
For end-user queries, which more than 90% of the time go against current data, this method is much faster, because only current records are in the main table and no filters are needed on dates. If a user wants to query history and current data combined, a view which unions the main and historical data can be used.
One more thing to note: if you adopt this approach for your dimension tables, they always keep the same surrogate key for the index. This means that if you follow a strict Kimball approach and make the primary key of the fact table a composite key made up of the foreign keys from each dimension, you NEVER have to rework this primary key. It always points to the correct dimension, thereby eliminating the need for a surrogate key on the fact table!
I am using this technique to great effect in my current contract, and performance is excellent. The splitter at the end of the map splits the data into three sets:
Set one is for an insert into the main table when there is no match on the natural key.
Set two is when there is a match on the natural key and the delta comparison has determined that a change occurred. In this case the current row needs to be copied into history, setting the end date to the system date en route.
Set three is also when there is a match on the natural key and the delta comparison has determined that a change occurred. In this case the main record is simply updated, with the start date being reset to the system date.
By the way, I intend to put a white paper together on this approach if anyone is interested.
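The surrogate-key lookup Laura asks about can be sketched as a join on the natural key plus the SCD2 date range. A minimal runnable illustration using SQLite; all table and column names here are illustrative assumptions, not from the original thread:

```python
import sqlite3

# Hypothetical minimal schema: an SCD2 customer dimension with start/end dates,
# a staging table of incoming transactions keyed by the natural customer id,
# and the fact table we want to populate with surrogate keys.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customer_dim (
    customer_sk INTEGER PRIMARY KEY,  -- surrogate key
    customer_id TEXT,                 -- natural key
    start_date  TEXT,
    end_date    TEXT                  -- '9999-12-31' marks the current row
);
CREATE TABLE staging_txn (customer_id TEXT, txn_date TEXT, amount REAL);
CREATE TABLE fact_sales (customer_sk INTEGER, txn_date TEXT, amount REAL);
""")
# Two versions of the same customer: the surrogate key changes on each change.
cur.executemany("INSERT INTO customer_dim VALUES (?,?,?,?)", [
    (1, "C100", "2020-01-01", "2020-06-30"),
    (2, "C100", "2020-07-01", "9999-12-31"),
])
cur.executemany("INSERT INTO staging_txn VALUES (?,?,?)", [
    ("C100", "2020-03-15", 50.0),  # falls in the first version's window
    ("C100", "2020-08-01", 75.0),  # falls in the current version's window
])
# Resolve the surrogate key by joining on the natural key and the date range.
cur.execute("""
INSERT INTO fact_sales (customer_sk, txn_date, amount)
SELECT d.customer_sk, s.txn_date, s.amount
FROM staging_txn s
JOIN customer_dim d
  ON d.customer_id = s.customer_id
 AND s.txn_date BETWEEN d.start_date AND d.end_date
""")
conn.commit()
print(cur.execute(
    "SELECT customer_sk, amount FROM fact_sales ORDER BY txn_date").fetchall())
# → [(1, 50.0), (2, 75.0)]
```

With the history-table variant described above, the join for current data simplifies to the natural key alone, since the main table holds only current rows.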
Hope this helps
Regards
Michael -
Loading data into multiple tables using SQL*Loader
Hi,
I am using SQL*Loader to load the data from a flat file into the database.
my file structure is as below
====================
101,john,mobile@@fax@@home@@office@@email,1234@@3425@@1232@@2345@@[email protected],1234.40
102,smith,mobile@@fax@@home,1234@@345@@234,123.40
103,adams,fax@@mobile@@office@@others,1234@@1233@@1234@@3456,2345.40
in file first columns are empno,ename,comm_mode(multiple values terminated by '@@'),comm_no_txt(multiple values terminated by '@@'), sal
the comm_mode and comm_no_text needs to be inserted into the separate table (emp_comm) like below
emp
empno ename sal
101 john 1234.40
102 smith 123.40
103 adams 2345.40
emp_comm
empno comm_mode comm_no_text
101 mobile 1234
101 fax 3425
101 home 1232
101 office 2345
101 email [email protected]
102 mobile 1234
102 fax 345
102 home 234
103 fax 1234
This is how I need to insert the data using SQL*Loader.
my table structures
===============
emp
empno number(5)
ename varchar2(15)
sal number(10,2)
emp_comm
empno number(5) reference the empno of the emp table
comm_mode varchar2(10)
Comm_no_text varchar2(35)
Now I want to insert the file data into the specified structures.
Please help me out to achieve this using SQL*Loader.
(We are not using external tables for this.)
Thanks & Regards.
Bala Sake
Edited by: 954925 on Aug 25, 2012 12:24 AM
Please post OS and database details.
You will need to split up the data file in order to load into separate tables. The process is documented here:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#autoId72
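A preprocessing sketch of the split described above, in Python; the field layout is taken from the post, while the function name and output shape are illustrative:

```python
# Split each '@@'-delimited input line into one emp row plus one emp_comm row
# per communication mode, so each table can be loaded with a plain control file.
# Field layout from the post: empno, ename, comm_mode list, comm_no_text list, sal.
def split_rows(lines):
    emp_rows, comm_rows = [], []
    for line in lines:
        empno, ename, modes, numbers, sal = line.strip().split(",")
        emp_rows.append((empno, ename, sal))
        for mode, num in zip(modes.split("@@"), numbers.split("@@")):
            comm_rows.append((empno, mode, num))
    return emp_rows, comm_rows

data = [
    "101,john,mobile@@fax,1234@@3425,1234.40",
    "102,smith,mobile,1234,123.40",
]
emp, comm = split_rows(data)
print(emp)   # [('101', 'john', '1234.40'), ('102', 'smith', '123.40')]
print(comm)  # [('101', 'mobile', '1234'), ('101', 'fax', '3425'), ('102', 'mobile', '1234')]
```

Writing the two result sets out as plain CSVs then lets each control file stay a simple one-table load.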
HTH
Srini -
Hello All,
I have a job to load data from SQL Server to SAP BW. I have followed the steps from the SAP wiki to do this.
1. I have created an RFC connection between the two servers (SAP and the BODS job server).
When I schedule and start the job immediately from SAP BW, I get this error and it aborts the RFC connection:
"Error while executing the following command: -sJO return code:238"
Error message during processing in BI
Diagnosis
An error occurred in BI while processing the data. The error is documented in an error message.
System Response
A caller 01, 02 or equal to or greater than 20 contains an error message.
Further analysis:
The error message(s) was (were) sent by:
Scheduler
Any help would be appreciated......
Thanks
Praveen...
Hi Praveen,
I want to know which version of BODS you are using for the development of your ETL jobs.
If it is BODS 12.2.2.2, then you will get this type of problem frequently, as in BODS 12.2.2.2 only two connections can be created for the RFC between BW and BODS.
So I suggest that if you are using BODS 12.2.2.2, you upgrade to BODS 12.2.0.0 with Service Pack 3 and Fix Pack 3.
As in BODS 12.2.3.3 we have the option of ten connections in parallel at a time, which helps in resolving this issue.
Please let me know your BODS version, and if you have upgraded your BODS to SP3 with FP3, whether your problem is resolved or not.
All the best..!!
Thanks ,
Shandilya Saurav -
Error while loading the data into the cube using DTP.
No SID found for the value 'UL' of characteristic 0BASE_UOM.
Please give me an idea of how to solve this error.
Thanks&Regards
Syam Prasad Dasari
https://forums.sdn.sap.com/click.jspa?searchID=23985990&messageID=6733764
https://forums.sdn.sap.com/click.jspa?searchID=23985990&messageID=5062570