SQL*LOADER PROBLEM IN LOADING DATA TO MULTIPLE TABLES
My problem is that I have to load data from a flat file, which consists of 64 columns and 11,040 records, into 5 different tables. The other thing is that I have to check that only UNIQUE records go to the database, and then I have to generate a primary key for each record that makes it into the database.
So I have written a BEFORE INSERT ... FOR EACH ROW trigger on all 5 tables to check the uniqueness of each arriving record.
Now my problem is that SQL*Loader is loading, for every table, only the number of records received by the table with the fewest unique rows, i.e.,
TABLE        RECORDS (ORIGINALLY)
TIME         11
STORES       184
PROMOTION    20
PRODUCT      60
Now it is loading only 11 records into all the tables.
with regards
vijayankar
The easiest thing is to do data manipulation in the database; that's what SQL is good for.
So load your file into tables without any unique constraints. Then apply the unique constraints using the EXCEPTIONS INTO ... clause. This will populate your exceptions table with the rowids of all the non-unique rows. You can then decide which rows to zap.
If you don't already have an exceptions table you'll need to run utlexcpt.sql.
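For illustration, a minimal sketch of the EXCEPTIONS INTO approach (the staging table STORES_STAGE and its STORE_ID column are assumed names for this example):

```sql
-- The default EXCEPTIONS table, as created by utlexcpt.sql:
CREATE TABLE exceptions (
  row_id     ROWID,
  owner      VARCHAR2(30),
  table_name VARCHAR2(30),
  constraint VARCHAR2(30)
);

-- Adding the constraint over duplicate data fails (ORA-02299), but it
-- logs the rowid of every offending row into the exceptions table:
ALTER TABLE stores_stage
  ADD CONSTRAINT stores_stage_uk UNIQUE (store_id)
  EXCEPTIONS INTO exceptions;

-- Inspect the duplicates and decide which rows to zap:
SELECT s.*
  FROM stores_stage s
 WHERE s.rowid IN (SELECT row_id FROM exceptions);
```

Once the duplicates are deleted, re-run the ALTER TABLE and the constraint will validate cleanly.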
HTH
P.S. This isn't the right forum to be posting SQL*Loader enquiries.
Similar Messages
-
Loading data from multiple tables to multiple sheets of excel using SSIS
I have a requirement in which I want to load data from 13 tables into 13 respective sheets of a single Excel file using SSIS.
Does anyone know the SSIS logic for developing a package for this?

See a similar example here:
http://visakhm.blogspot.in/2013/09/exporting-sqlserver-data-to-multiple.html
In your case you need to use a loop to iterate through the tables.
First, get the list of tables into an object variable created in SSIS using the INFORMATION_SCHEMA.TABLES view.
Then add a For Each Loop based on the ADO.NET variable enumerator to iterate through the tables, and inside the loop follow the method in the above link to create each sheet first and then populate it.
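A sketch of the query behind the first step (SQL Server); its result set is what you would store in the SSIS object variable for the For Each loop to enumerate:

```sql
-- List the user tables to iterate over; filter by schema or name
-- pattern here if only some of the 13 tables should be exported.
SELECT TABLE_SCHEMA, TABLE_NAME
FROM   INFORMATION_SCHEMA.TABLES
WHERE  TABLE_TYPE = 'BASE TABLE'
ORDER  BY TABLE_NAME;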
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Loading data into multiple tables - Bulk collect or regular Fetch
I have a procedure to load data from one source table into eight different destination tables. The 8 tables have some of the columns of the source table with a common key.
I have run into a couple of problems and have a few questions where I would like to seek advice:
1.) Procedure with and without the BULK COLLECT clause took the same time for 100,000 records. I thought I would see improvement in performance when I include BULK COLLECT with LIMIT.
2.) Updating the Load_Flag in source_table happens for only a few records and not all. I had expected all records to be updated.
3.) Are there other suggestions to improve the performance? Could you provide links to other posts or articles on the web that will help me improve the code?
Notes:
1.) 8 Destination tables have at least 2 Million records each, have multiple indexes and are accessed by application in Production
2.) There is an initial load of 1 Million rows with a subsequent daily load of 10,000 rows. Daily load will have updates for existing rows (not shown in code structure below)
The structure of the procedure is as follows
DECLARE
  TYPE dest_type IS TABLE OF source_table%ROWTYPE;
  dest_tab dest_type;
  iCount   NUMBER := 0;
  CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
  OPEN source_cur;
  LOOP
    FETCH source_cur       -- BULK COLLECT
      INTO dest_tab;       -- LIMIT 1000
    EXIT WHEN source_cur%NOTFOUND;
    FOR i IN dest_tab.FIRST .. dest_tab.LAST LOOP
      <Insert into app_tab1 values key, col12, col23, col34 ;>
      <Insert into app_tab2 values key, col15, col29, col31 ;>
      <Insert into app_tab3 values key, col52, col93, col56 ;>
      UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
      iCount := iCount + 1;
      IF iCount = 1000 THEN
        COMMIT;
        iCount := 0;
      END IF;
    END LOOP;
  END LOOP;
  CLOSE source_cur;
  COMMIT;
END;
Edited by: user11368240 on Jul 14, 2009 11:08 AM

Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically, so your code is the same as (untested):
DECLARE
  iCount NUMBER := 0;
  CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
  FOR r IN source_cur
  LOOP
    <Insert into app_tab1 values key, col12, col23, col34 ;>
    <Insert into app_tab2 values key, col15, col29, col31 ;>
    <Insert into app_tab3 values key, col52, col93, col56 ;>
    UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
    iCount := iCount + 1;
    IF iCount = 1000 THEN
      COMMIT;
      iCount := 0;
    END IF;
  END LOOP;
  COMMIT;
END;

However, most of the benefit of bulk fetching would come from using the array with a FORALL expression, which the PL/SQL compiler can't automate for you.
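For illustration, a minimal FORALL sketch — assuming Oracle 11g or later (which allows record-field references inside FORALL); the table and column names are taken from the pseudocode above:

```sql
DECLARE
  TYPE dest_type IS TABLE OF source_table%ROWTYPE;
  dest_tab dest_type;
  CURSOR source_cur IS SELECT * FROM source_table;
BEGIN
  OPEN source_cur;
  LOOP
    -- Fetch 1000 rows at a time into the collection
    FETCH source_cur BULK COLLECT INTO dest_tab LIMIT 1000;
    EXIT WHEN dest_tab.COUNT = 0;
    -- One array insert per destination table instead of row-by-row inserts
    FORALL i IN 1 .. dest_tab.COUNT
      INSERT INTO app_tab1 (key, col12, col23, col34)
      VALUES (dest_tab(i).key, dest_tab(i).col12,
              dest_tab(i).col23, dest_tab(i).col34);
    -- ...repeat a FORALL for app_tab2, app_tab3, etc.
    COMMIT;  -- one commit per 1000-row batch
  END LOOP;
  CLOSE source_cur;
END;
```

Note that WHERE CURRENT OF does not apply to bulk fetches; the load_flag update would become a separate FORALL UPDATE keyed by primary key or rowid.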
If you are fetching 1000 rows at a time, purely from a code simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
However I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that it will roll back all your hard work on failure, why not just commit in your exception handler? -
Loading Data from multiple tables into essbase using ODI
Hi,
We have a scenario where data comes from multiple tables. I would like to know how ODI will load this data for the right combination of the members.

Hi,
I take it each data table has a field which maps to the other table. You can just drag the datastores on to the source interface and create a join between the tables.
Cheers
John
http://john-goodwin.blogspot.com/ -
Loading data into multiple tables from an excel
Can we load data into multiple tables at a time from an Excel file through Utilities? If yes, how? Please help me.
Regards,
Pallavi

I would imagine that the utilities allow you to insert data from a spreadsheet into 1 and only 1 table.
You may have to write your own custom data upload using External Tables and a PL/SQL procedure to insert data from 1 spreadsheet into more than 1 table.
If you need any guidance on doing this let me know and I will happily point you in the right direction.
Regards
Duncan -
Loading data into multiple tables using sqlloader
Hi,
I am using SQL*Loader to load data from a flat file into the database.
My file structure is as below:
====================
101,john,mobile@@fax@@home@@office@@email,1234@@3425@@1232@@2345@@[email protected],1234.40
102,smith,mobile@@fax@@home,1234@@345@@234,123.40
103,adams,fax@@mobile@@office@@others,1234@@1233@@1234@@3456,2345.40
In the file, the columns are empno, ename, comm_mode (multiple values terminated by '@@'), comm_no_txt (multiple values terminated by '@@'), and sal.
The comm_mode and comm_no_text values need to be inserted into a separate table (emp_comm) like below:
emp
empno ename sal
101 john 1234.40
102 smith 123.40
103 adams 2345.40
emp_comm
empno comm_mode comm_no_text
101 mobile 1234
101 fax 3425
101 home 1232
101 office 2345
101 email [email protected]
102 mobile 1234
102 fax 345
102 home 234
103 fax 1234
The data needs to be inserted like this using SQL*Loader.
my table structures
===============
emp
empno number(5)
ename varchar2(15)
sal number(10,2)
emp_comm
empno number(5) references the empno of the emp table
comm_mode varchar2(10)
Comm_no_text varchar2(35)
Now I want to insert the file data into the specified structures.
Please help me out to achieve this using SQL*Loader
(we are not using external tables for this).
Thanks & Regards.
Bala Sake
Edited by: 954925 on Aug 25, 2012 12:24 AM

Please post OS and database details.
You will need to split up the datafile in order to load into separate tables. The process is documented
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#autoId72
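A sketch of what the split-file approach could look like. The filenames and the preprocessing are assumptions for illustration: 1.csv would first be split into emp.csv (empno,ename,sal) and emp_comm.csv (one row per '@@'-separated value: empno,comm_mode,comm_no_text), and each file then gets a simple control file:

```sql
-- emp.ctl
LOAD DATA
INFILE 'emp.csv'
APPEND INTO TABLE emp
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(empno, ename, sal)

-- emp_comm.ctl
LOAD DATA
INFILE 'emp_comm.csv'
APPEND INTO TABLE emp_comm
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(empno, comm_mode, comm_no_text)
```

The splitting itself (unpivoting the '@@' groups into one row per value) has to happen outside SQL*Loader, since sqlldr cannot expand a repeating group within one physical record into multiple rows.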
HTH
Srini -
Loading data from multiple files to multiple tables
How should I approach creating an SSIS package to load data from multiple files into multiple tables? Also, the files will have data which might overlap, so I might have to create a stored procedure for it. E.g. the 1st day's file has data from Aug 1 - Aug 10 and the 2nd day's file might have data from Aug 5 to Aug 15, so I might have to look for the max and min dates and truncate the table within that date range.

That's OK. A ForEachLoop would be able to iterate through the files. You can declare a variable inside the loop to capture the filenames. Choose "fully qualified" as the option in the loop.
Then inside loop
1. Add an Execute SQL Task to delete overlapping data from the table. One question here: where will you get the date from? Does it come inside the filename?
2. Add a Data Flow Task with a file source pointing to the file. For this, add a suitable connection manager (Excel/Flat File etc.) and map the connection string property to the filename variable using expressions.
3. Add an OLEDB Destination pointing to the table. You can use the "table or view name from variable - fast load" option and map it to a variable to make the table name dynamic; just set the corresponding value for the variable to get the correct table name.
Using FDM to load data from oracle table (Integration Import Script)
Hi,
I am using Integration Import Script to load data from oracle table to worktables in FDM.
I am getting the following error while running the script.
Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done
Attaching the full error report
ERROR:
Code............................................. -2147217887
Description...................................... Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.
At line: 22
Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 6260
IDENTIFICATION:
User............................................. ******
Computer Name.................................... *******
App Name......................................... FDMAPP
Client App....................................... WebClient
CONNECTION:
Provider......................................... ORAOLEDB.ORACLE
Data Server......................................
Database Name.................................... DBNAME
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... SCRTEST
Location ID...................................... 750
Location Seg..................................... 4
Category......................................... FDM ACTUAL
Category ID...................................... 13
Period........................................... Jun - 2011
Period ID........................................ 6/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
I am using the following script
Function ImpScrTest(strLoc, lngCatKey, dblPerKey, strWorkTableName)
'Oracle Hyperion FDM Integration Import Script:
'Created By: Dhananjay
'Date Created: 1/17/2012 10:29:53 AM
'Purpose:A test script to import data from Oracle EBS tables
Dim cnSS 'ADODB.Connection
Dim strSQL 'SQL string
Dim rs 'Recordset
Dim rsAppend 'tTB table append rs object
'Initialize objects
Set cnSS = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.Recordset")
Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
'Connect to SQL Server database
cnss.open "Provider=OraOLEDB.Oracle.1;Data Source= +server+;Initial Catalog= +catalog+;User ID= +uid+;Password= +pass+"
'Create query string
strSQL = "Select AMOUNT,DESCRIPTION,ACCOUNT,ENTITY FROM +catalog+.TEST_TMP"
'Get data
rs.Open strSQL, cnSS
'Check for data
If rs.bof And rs.eof Then
RES.PlngActionType = 2
RES.PstrActionValue = "No Records to load!"
Exit Function
End If
'Loop through records and append to tTB table in location’s DB
If Not rs.bof And Not rs.eof Then
Do While Not rs.eof
rsAppend.AddNew
rsAppend.Fields("PartitionKey") = RES.PlngLocKey
rsAppend.Fields("CatKey") = RES.PlngCatKey
rsAppend.Fields("PeriodKey") = RES.PdtePerKey
rsAppend.Fields("DataView") = "YTD"
rsAppend.Fields("CalcAcctType") = 9
rsAppend.Fields("Amount") = rs.fields("Amount").Value
rsAppend.Fields("Desc1") = rs.fields("Description").Value
rsAppend.Fields("Account") = rs.fields("Account").Value
rsAppend.Fields("Entity") = rs.fields("Entity").Value
rsAppend.Update
rs.movenext
Loop
End If
'Records loaded
RES.PlngActionType = 6
RES.PstrActionValue = "Import successful!"
'Assign return value (in VBScript this assignment must match the function name)
ImpScrTest = True
End Function
Please help me on this
Thanks,
Dhananjay
Edited by: DBS on Feb 9, 2012 10:21 PM

Hi,
I found the problem. It was because of the connection string. The format was different for Oracle tables.
PFB the format
*cnss.open"Provider=OraOLEDB.Oracle.1;Data Source= servername:port/SID;Database= DB;User Id=aaaa;Password=aaaa;"*
And thanks *SH* for quick response.
So closing the thread......
Thanks,
Dhananjay -
Adding data to multiple tables using one form in Access 2010?
Hi All,
I have an Access database with two tables and I want to create a single form to enter data into those tables.
How do I add data to multiple tables using one form in Access 2010?
I don't have much knowledge of Access databases.
Please help me
Thanks
Balaji

You really don't enter identical data into 2 tables. You enter data into one single table, and then you have a unique identifier that maps to another table (you have a unique relationship between the two tables).
Maybe you need to read this.
http://office.microsoft.com/en-001/access-help/database-design-basics-HA001224247.aspx
Think about it this way... What if you update data in 2 tables, and then the data in one of those tables changes, but the data in the other table does NOT change? WHOOPS!! Now you've got a BIG problem. For instance, you have a customer
named Bill Gates. In one table you update Bill's address to 1835 73rd Ave NE, Medina, WA 98039 and in the other table you accidentally update Bill's address to 183 73rd Ave NE, Medina, WA 98039. Now you have 2 addresses for Bill. Why would
you want that??? Which is right? No one knows. If you have one address, you just have to update one address, and if there is a mistake, you just have to update one address, but you don't have to waste time trying to figure out which is right
and which is wrong... and then update the one that is wrong.
Post back with specific questions.
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it. -
Generic datasource by function module to fetch data from multiple tables?
I'm writing a function module to fetch prices, for a generic DataSource.
At first, the extraction test is OK. But the InfoPackage never stops when loading data to the PSA in BW.
And I find the example codes:
OPEN CURSOR WITH HOLD S_CURSOR FOR
SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
WHERE CARRID IN L_R_CARRID AND
CONNID IN L_R_CONNID.
ENDIF. "First data package ?
* Fetch records into interface table.
* named E_T_'Name of extract structure'.
FETCH NEXT CURSOR S_CURSOR
APPENDING CORRESPONDING FIELDS
OF TABLE E_T_DATA
PACKAGE SIZE S_S_IF-MAXSIZE.
IF SY-SUBRC <> 0.
CLOSE CURSOR S_CURSOR.
RAISE NO_MORE_DATA.
ENDIF.
S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
ENDIF.
There they use a cursor to fetch data package by package, and raise the exception NO_MORE_DATA to stop the loading process.
Now I fetch data from multiple tables; I don't think I can use a cursor.
So how can I handle this?
Thanks a lot.

Thanks
IF IT_999[] IS INITIAL.
SELECT A~KNUMH A~MATNR A~KSCHL VKORG VTWEG A~DATBI A~DATAB KBETR KMEIN KPEIN C~MTART APPENDING CORRESPONDING FIELDS OF
TABLE TP_DATA
FROM A999 AS A
INNER JOIN KONP AS B
ON A~KNUMH = B~KNUMH
INNER JOIN MARA AS C
ON A~MATNR = C~MATNR
* FOR ALL ENTRIES IN IT_999
WHERE
* A~KNUMH = IT_999-KNUMH AND
( ( A~KSCHL = 'ZPRC' AND VKORG = 'Z000' AND VTWEG = 'Z1' ) OR
( A~KSCHL = 'ZPRD' AND VKORG = 'A000' AND VTWEG = 'Y3' ) ) AND
* A~DATBI >= SY-DATUM AND
LOEVM_KO = ''.
SELECT A~KNUMH A~MATNR A~KSCHL VKORG VTWEG A~DATBI A~DATAB KBETR AS KHETR KMEIN KPEIN C~MTART APPENDING CORRESPONDING FIELDS OF
TABLE TP_DATA
FROM A999 AS A
INNER JOIN KONP AS B
ON A~KNUMH = B~KNUMH
INNER JOIN MARA AS C
ON A~MATNR = C~MATNR
* FOR ALL ENTRIES IN IT_999
WHERE
* A~KNUMH = IT_999-KNUMH AND
A~KSCHL = 'ZPR3' AND A~VKORG = 'I000' AND
* DATBI >= SY-DATUM AND
LOEVM_KO = ''.
ENDIF.
IF IT_997[] IS INITIAL.
SELECT A~KNUMH A~MATNR A~KSCHL VTWEG A~DATBI A~DATAB KBETR AS KHETR KMEIN KPEIN C~MTART APPENDING CORRESPONDING FIELDS OF
TABLE TP_DATA
FROM A997 AS A
INNER JOIN KONP AS B
ON A~KNUMH = B~KNUMH
INNER JOIN MARA AS C
ON A~MATNR = C~MATNR
* FOR ALL ENTRIES IN IT_997
WHERE
* A~KNUMH = IT_997-KNUMH AND
A~KSCHL = 'ZPRA' AND VTWEG = 'Y1' AND
* DATBI >= SY-DATUM AND
LOEVM_KO = ''.
ENDIF.
IF IT_996[] IS INITIAL.
SELECT A~KNUMH A~MATNR A~KSCHL A~DATBI A~DATAB KBETR AS KHETR KMEIN KPEIN C~MTART APPENDING CORRESPONDING FIELDS OF
TABLE TP_DATA
FROM A996 AS A
INNER JOIN KONP AS B
ON A~KNUMH = B~KNUMH
INNER JOIN MARA AS C
ON A~MATNR = C~MATNR
* FOR ALL ENTRIES IN IT_996
WHERE
* A~KNUMH = IT_996-KNUMH AND
A~KSCHL = 'ZPRB' AND
* DATBI >= SY-DATUM AND
LOEVM_KO = ''.
ENDIF.
SELECT MATNR "Material number
MEINH "Alternative unit of measure for stockkeeping
UMREZ "Numerator for conversion to base units of measure
UMREN "Denominator for conversion to base units of measure
FROM MARM
INTO CORRESPONDING FIELDS OF TABLE IT_MARM
FOR ALL ENTRIES IN TP_DATA
WHERE MATNR = TP_DATA-MATNR AND MEINH = TP_DATA-KMEIN.
LOOP AT TP_DATA.
IF TP_DATA-KPEIN NE 0.
TP_DATA-KBETR = TP_DATA-KBETR / TP_DATA-KPEIN.
TP_DATA-KHETR = TP_DATA-KHETR / TP_DATA-KPEIN.
ENDIF.
IF TP_DATA-KSCHL = 'ZPRA'.
* TP_DATA-MEINH = 'ZI'.
* TP_DATA-KSCHL = 'B4'.
IF TP_DATA-KMEIN = 'ZI'.
TP_DATA-KBETR = TP_DATA-KHETR / '1.17'.
ELSE.
READ TABLE IT_MARM INTO WA_MARM1 WITH KEY MATNR = TP_DATA-MATNR MEINH = TP_DATA-KMEIN.
* READ TABLE IT_MARM INTO WA_MARM2 WITH KEY MATNR = TP_DATA-MATNR MEINH = 'CT'.
TP_DATA-KHETR = TP_DATA-KHETR * WA_MARM1-UMREN / WA_MARM1-UMREZ.
* * WA_MARM2-UMREZ / WA_MARM2-UMREN.
TP_DATA-KBETR = TP_DATA-KHETR / '1.17'.
ENDIF.
ELSEIF TP_DATA-KSCHL = 'ZPRB'.
* TP_DATA-KSCHL = 'L0'.
* TP_DATA-MEINH = 'ZI'.
IF TP_DATA-KMEIN = 'ZI'.
TP_DATA-KBETR = TP_DATA-KHETR / '1.17'.
ELSE.
READ TABLE IT_MARM INTO WA_MARM1 WITH KEY MATNR = TP_DATA-MATNR MEINH = TP_DATA-KMEIN.
* READ TABLE IT_MARM INTO WA_MARM2 WITH KEY MATNR = TP_DATA-MATNR MEINH = 'BAG'.
TP_DATA-KHETR = TP_DATA-KHETR * WA_MARM1-UMREN / WA_MARM1-UMREZ.
* * WA_MARM2-UMREZ / WA_MARM2-UMREN.
TP_DATA-KBETR = TP_DATA-KHETR / '1.17'.
ENDIF.
ELSEIF TP_DATA-KSCHL = 'ZPRC' OR TP_DATA-KSCHL = 'ZPRD'.
* TP_DATA-MEINH = 'ZI'.
IF TP_DATA-KMEIN = 'ZI'.
TP_DATA-KHETR = TP_DATA-KBETR * '1.17'.
ELSE.
READ TABLE IT_MARM INTO WA_MARM1 WITH KEY MATNR = TP_DATA-MATNR MEINH = TP_DATA-KMEIN.
* READ TABLE IT_MARM INTO WA_MARM2 WITH KEY MATNR = TP_DATA-MATNR MEINH = 'WZI'.
TP_DATA-KBETR = TP_DATA-KBETR * WA_MARM1-UMREN / WA_MARM1-UMREZ.
* * WA_MARM2-UMREZ / WA_MARM2-UMREN.
TP_DATA-KHETR = TP_DATA-KBETR * '1.17'.
ENDIF.
ELSEIF TP_DATA-KSCHL = 'ZPR3'.
* TP_DATA-KSCHL = 'B2'.
IF TP_DATA-KMEIN = 'ZI'.
TP_DATA-KBETR = TP_DATA-KHETR / '1.17'.
ELSE.
READ TABLE IT_MARM INTO WA_MARM1 WITH KEY MATNR = TP_DATA-MATNR MEINH = TP_DATA-KMEIN.
* READ TABLE IT_MARM INTO WA_MARM2 WITH KEY MATNR = TP_DATA-MATNR MEINH = 'BAG'.
TP_DATA-KHETR = TP_DATA-KHETR * WA_MARM1-UMREN / WA_MARM1-UMREZ.
* * WA_MARM2-UMREZ / WA_MARM2-UMREN.
TP_DATA-KBETR = TP_DATA-KHETR / '1.17'.
ENDIF.
ENDIF.
TP_DATA-MEINH = '01'.
MODIFY TP_DATA.
E_T_DATA-MATNR = TP_DATA-MATNR.
E_T_DATA-KSCHL = TP_DATA-KSCHL.
E_T_DATA-KHETR = TP_DATA-KHETR.
E_T_DATA-KBETR = TP_DATA-KBETR.
E_T_DATA-KMEIN = TP_DATA-KMEIN.
E_T_DATA-DATAB = TP_DATA-DATAB.
E_T_DATA-DATBI = TP_DATA-DATBI.
APPEND E_T_DATA.
CLEAR WA_MARM1.
CLEAR WA_MARM2.
ENDLOOP.
Edited by: Shen Peng on Oct 20, 2010 10:09 AM -
Problem in Loading data for clob column using sql ldr
Hi,
I am having a problem loading data for tables having CLOB columns.
Could anyone help me correct the below control-file script in order to load the data in the format shown?
Any help is really appreciated.
Table Script
Create table samp
(
no number,
col1 clob,
col2 clob
);
Ctrl File
options (skip=1)
load data
infile 'c:\1.csv'
Replace into table samp
fields terminated by ","
trailing nullcols
(
no,
col1 Char(100000000),
col2 Char(100000000) enclosed by '"' and '"'
)
Data File(1.csv)
1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
Error Encountered
ORA-01461: can bind a LONG value only for insert into a LONG column
Table samp

Thanks in advance

I can't reproduce it on my 10.2.0.4.0. CTL file:
load data
INFILE *
Replace into table samp
fields terminated by ","
trailing nullcols
(
no,
col1 Char(100000000),
col2 Char(100000000) enclosed by '"' and '"'
)
BEGINDATA
1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"

Loading:
SQL> Create table samp
2 (
3 no number,
4 col1 clob,
5 col2 clob
6 );
Table created.
SQL> host sqlldr scott/tiger control=c:\temp\samp.ctl log=c:\temp\samp.log
SQL> select * from samp
2 /
NO
COL1
COL2
1
asdf
assasadsdsdsd"sfasdfadf"sdsdsa,ssfsf
2
sfjass
dksadk,kd,ss"dfdfjkdjfdk"sasfjaslaljs
NO
COL1
COL2
SQL>

SY.
Problem with loading data to Essbase
Hi All,
I have a problem with loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
But the problem lies in the fact that after deletion of data from Essbase, when I'm trying to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried to change the mode of loading data from 'overwrite' to 'add to existing values' (in the rule file) but it doesn't help... Any ideas what I can do?

Below are a few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
[2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
[2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
[2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
[2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
[2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
[2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
[2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
[2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
[2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
[2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
[2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory]) -
Insert data into multiple tables
Hi all,
I've a requirement where I need to insert data into multiple tables using a PL/SQL procedure.
The procedure should have two parameters:
1. Table name (parameter1)
2. Data (parameter2)
Based on these two parameters I need to insert data into the table (parameter1) using the data (parameter2).
ex:
Procedure insert_data (p_table IN VARCHAR2
,p_data IN -- what should be the datatype?
IS
l_statement VARCHAR2(2000);
BEGIN
-- insert data into tables
INSERT INTO p_table
values (....);
END insert_data;

Thanks in advance!!

BEDE wrote:
Amen to that!
So, I believe a better approach would be the following...
Suppose you have N datafiles with the same structure, and you wish to insert into the database the data from all those files.
For that, you should have a single table, named, say incoming_file_data, which should be structured more or less like below:
create table incoming_file_data (
 filename varchar2(250) not null -- name of the file inserted from
,file_time timestamp -- timestamp when the data was inserted
,... -- the columns of meaningful data contained in the lines of those files
);

And you will insert the data from all those files into this table, having normally one transaction for each file processed, for otherwise, when shit happens, some file may only get to be partially inserted into the table...
Maybe one good approach would be to dynamically create an external table for the file to be loaded, and then dynamically execute an INSERT ... SELECT into the table I mentioned, so that you will have only one INSERT ... SELECT per file instead of using utl_file... RTM on that.

If the file structures are the same, and it's just the filename that's changing, I would have a single external table definition, and use the alter table ... location ... statement (through execute immediate) to change the filename(s) as appropriate before querying the data. Of course that's not scalable if there are multiple users intending to use this, but generally when we talk about importing multiple files, it's a one-user/one-off/once-a-day type of scenario, so multi-user isn't a consideration. -
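A minimal sketch of the single-external-table approach described above; the external table name EXT_FILE_DATA and the procedure name are assumptions for illustration:

```sql
-- Repoint one external table definition at each new file, then do a
-- single INSERT ... SELECT per file (one transaction per file).
CREATE OR REPLACE PROCEDURE load_file (p_filename IN VARCHAR2) IS
BEGIN
  EXECUTE IMMEDIATE
    'ALTER TABLE ext_file_data LOCATION (''' || p_filename || ''')';
  INSERT INTO incoming_file_data (filename, file_time /*, ... */)
  SELECT p_filename, SYSTIMESTAMP /*, ... */
  FROM   ext_file_data;
  COMMIT;
END load_file;
```

Since the filename is concatenated into DDL, it should come from a trusted source (or be validated) to avoid injection.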
I am loading data into a table I created which includes a column "Description" with a data type VARCHAR2(1000). When I go to load the data which is less than 1000 characters I receive the following error message:
Record 38: Rejected - Error on table SSW_INPUTS, column DESCRIPTION.
Field in data file exceeds maximum length
I have increased the size of the column but that does not seem to fix the error. Does anyone know what this error means? Another thought is that I have created the "Description" column too large... which can't be true, because I would receive an error when creating the table. Plus I already loaded data into a similar table with similar data and had no problems!
Someone please help...
Thank you,
April.

Note that I'm assuming Oracle8(i) behavior. Oracle9 may treat Unicode differently.
Are you inserting Unicode data into the table? Declaring a variable as varchar2(1000) indicates that Oracle should reserve 1000 bytes for data. If you're inserting UTF-8 encoded data, each character may take up to 3 bytes to store. Thus, 334 characters of data could theoretically overflow a varchar2(1000) variable.
Note that UTF-8 is designed so that the most commonly used characters are stored in 1 byte, less commonly used characters are stored in 2 bytes, and the remainder is stored in 3 bytes. On average, this will require less space than the more familiar UCS-2 encoding which stores every character as 2 bytes of data.
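On Oracle9i and later, the byte-vs-character issue above can be avoided by declaring the column with character length semantics; a sketch (the table name here is illustrative, modeled on the SSW_INPUTS table from the question):

```sql
-- With CHAR semantics the column holds up to 1000 characters,
-- regardless of how many bytes each character occupies in UTF-8.
CREATE TABLE ssw_inputs_demo (
  description VARCHAR2(1000 CHAR)
);
```

The default semantics (BYTE vs CHAR) can also be set session- or database-wide via the NLS_LENGTH_SEMANTICS parameter.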
Justin -
Problem in Loading data from R/3 to ODS
When I load data from R/3 tables, I could not get the data into the data target ODS.
What is the problem? Can anyone help with this?

Hi Lakshmanan,
Important point : It is always good to detail your issue to get a prompt response.
There could be lot of issues.
1. Find the DataSource which is feeding this ODS, go to the R/3 system -> RSA3 and check whether any records are populated. If no records are populated, then there is an issue with extracting the records.
2. If records are populated in the first step, then on the BW side check the PSA table; if you find records in the PSA then the issue is on the BW side, and there could be many causes.
Maybe if you can provide more information we will be able to help you...
Regards, Vj