Load Data / Upload Script Problem
Hello:
I hope everyone is having a nice weekend.
I have a very weird situation.
I'm using Internet Explorer and I was loading data with no trouble. I then proceeded to upload a script: I selected the script file, gave it a script name, clicked Upload, and it appeared on the screen. But when I clicked on the script, the script editor screen appeared in bright red, and when I click Run, I get a message at the bottom of the screen (next to my Internet Explorer icon, above my Start button) saying "Error On Page". Yet no specific error appears.
So I switched over to Firefox and was able to run the same script successfully. Now, here's what is weird. I proceeded to upload more data (still in Firefox) and was successful. I went to SQL Workshop, Object Browser, Browse, Tables. A list of tables appears on the left side. I wanted to view the table definition for the table I had just loaded, but when I selected the table, instead of seeing the table definition displayed, the area was blank. I have no table definition.
I'm a bit confused. I'm having problems loading/creating/running scripts in Internet Explorer, which tells me there is an "Error on Page" but doesn't display any errors, while at the same time I'm having problems working with tables in Firefox.
Does anyone have any idea what could be causing this strange problem?
I could probably understand one browser having trouble, but both browsers failing on two different tasks makes no sense to me.
Any help you can give me would be great!
Thank you.
Hello:
Thank you for replying.
Yes, I do have those two lines beneath my Alias statement. However, each line had a typo, so I fixed them and rebooted my machine.
After fixing the typos, I went to Firefox to see if I could view the table definitions, and I can. So the typos did fix the Firefox problem.
I then went to Internet Explorer and tried to click on the SQL script, and the same thing is still happening. When I click the script that I uploaded, a red screen appears, and if I try to click anything, the "Error on Page" message still appears at the bottom.
The script does work in Firefox. I'm not sure why Internet Explorer is not working.
Should I have done something else after modifying my dads file? Is there some sort of command, or do I have to restart a service, in order for the updated dads file to be recognized?
Thanks for your help. I hope we can troubleshoot the IE problem.
Similar Messages
-
How can I load data with scripts on FDM to an HFM target system?
Hi all!
I need help because I can't find a good guide about scripting on FDM. My problem is the following.
I plan to load my data with a data load file in FDM to an HFM target system, but I would also like to load additional data using an event script, i.e. after validate. I would need some way to access the HFM system through FDM scripts; is that possible?
If so, it would be wonderful to get data from HFM for any point of view, reachable from FDM scripts, in order to load or retrieve data.
I've been looking for a good guide about scripting in FDM, but I couldn't find any information about accessing data on the HFM target system. Does one really exist?
Thanks for your help.
Hi,
Take a look at the LOAD Action scripts of your adapter. This might give you an idea.
Theoretically it should be possible to load data in an additional load, but you need to be very careful: you don't want to corrupt any of the log and status information that is stored during the load process. The audit trail is an important feature in many implementations. In this context, it might not be a good idea to improve automation at the risk of your system's compliance.
Regards,
Matt -
Flash-based uploader script problem. V18.0 came with this bug!
I am a web developer. Everything always works fine in Firefox, which is my primary browser, except for one problem that has never been solved: the Flash upload problem. There is an IO Error issue that has never been fixed. But with the 18.0 update, I got a new problem with Uploadify (a Flash-based upload script): there is a session problem. This problem never occurred before 18.0, and my script still works correctly in other web browsers.
I post the PHPSESSID from the Flash plugin, then at the top of the PHP uploader script I call session_id(POST_DATA); session_start(); but the problem is still not solved.
I got the exact same issue... can't upload files with the Flash uploader... it just closes and the files don't get finished uploading.
This kinda sucks, as you can't upload more than one file with the browser uploader.... -
SQL*Loader - problem loading data into database
Hi,
I am facing a problem loading data into an Oracle DB using a ctl file and a data file.
The table I am loading has the following structure:
CREATE TABLE "ENRCO07"."TEST"
(
"NAME" VARCHAR2(50 BYTE),
"MOD_DATE" DATE DEFAULT CURRENT_TIMESTAMP NOT NULL ENABLE
)
My ctl file has the following structure:
OPTIONS (DIRECT=FALSE, ERRORS=1000)
LOAD DATA
APPEND
INTO TABLE TEST
truncate
FIELDS TERMINATED BY ";"
TRAILING NULLCOLS
(NAME,
MOD_DATE DATE 'YYYYMMDDHH24MISS')
I tried a lot of MOD_DATE formats, since it was showing the error that null cannot be inserted, and other errors were also encountered.
My problem statement is :
I want the default date to be current_timestamp, in the format 'YYYYMMDDHH24MISS', when the field is empty; otherwise the date that comes from the data file should be loaded into the Oracle DB.
I can't alter the DDL only ctl can be altered to get to the solution.
I am new to this , kindly help.
Thanks
Abhinav
Thanks for the reply, but the problem remains:
if my data file has records as:
abhi1;20120416151900
abhi2;20120417151700
abhi3;
abhi4;20120416151900
and the ctl as:
OPTIONS (DIRECT=FALSE, ERRORS=1000)
LOAD DATA
APPEND
INTO TABLE TEST
truncate
FIELDS TERMINATED BY ";"
TRAILING NULLCOLS
(USER_NAME CHAR NULLIF (USER_NAME=BLANKS),
MOD_DATE DATE 'YYYYMMDDHH24MISS' NULLIF (MOD_DATE=BLANKS))
The entered data in the db is:
abhi1 16-APR-12
abhi2 17-APR-12
abhi4 16-APR-12
The row with the missing date is not loaded.
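One possible workaround (a sketch only, not tested against this table): SQL*Loader inserts an explicit NULL for the blank field, so the column's DEFAULT clause never fires. Replacing the DATE mask with a SQL expression string in the control file lets the database substitute the current date when the field is empty, without touching the DDL:

```
OPTIONS (DIRECT=FALSE, ERRORS=1000)
LOAD DATA
APPEND
INTO TABLE TEST
FIELDS TERMINATED BY ";"
TRAILING NULLCOLS
(NAME CHAR NULLIF (NAME=BLANKS),
 MOD_DATE "NVL(TO_DATE(:MOD_DATE,'YYYYMMDDHH24MISS'), SYSDATE)")
```

Here NVL, TO_DATE, and SYSDATE are standard Oracle functions; when :MOD_DATE binds NULL, the expression falls back to SYSDATE in the INSERT that SQL*Loader generates. The field name NAME follows the DDL shown above rather than the USER_NAME used in the second ctl.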
Thanks -
Data migration scripts problems
Hello,
I've been using SQL Developer for a couple of months now; I'm working on a migration from a SQL Server database to Oracle 10g, and the migration features of the product have really saved me a lot of time so far.
However, there are a couple of small problems with the generated data migration scripts:
- on the SQL Server side, the script that calls bcp specifies no encoding; as a result, all Unicode information is lost. I think that by default you should at least call bcp with the option that keeps the original encoding.
- on the Oracle side, empty strings ( = CHR(00) ) are converted into a whitespace character ( = ' ') when they shouldn't be. This is all the more annoying to correct manually when you have a lot of tables, since there is one control file per table.
Apart from that, everything's fine :)
Regards,
Isabelle.
You should be aware that Oracle treats empty strings as NULL while SQL Server doesn't. It may be that this is an attempt to avoid problems with "not null" columns. On SQL Server, an empty string is OK in a not null column, but not in Oracle. -
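To make the difference concrete, here is a tiny Python model of the two behaviors (purely illustrative, no database involved; the function names are invented for this sketch):

```python
# Toy model of how each database stores a zero-length string on insert.
def oracle_store(value):
    # Oracle treats a zero-length VARCHAR2 as NULL.
    return None if value == "" else value

def sqlserver_store(value):
    # SQL Server keeps '' and NULL distinct.
    return value

print(oracle_store(""))      # None: a NOT NULL column would reject this
print(sqlserver_store(""))   # '' : accepted by a NOT NULL column
```

This is why the migration scripts may deliberately map empty strings to something non-empty when the target column is NOT NULL.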
Weird problem with loading data from an XML using a for loop
Hi,
I have a strange problem. I have encountered this many times but still don't know the proper workaround for it.
I am trying to load a swf file, a video file, or an image. They can be on the local system or on a remote server. All the entries for the files to be loaded are made in an XML file. I traverse the nodes of the XML using a for loop. On the complete event of the loader info, for example:
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onComplete);
I fill a container with the loaded data.
My problem is that when I use a for loop it doesn't work properly, but if I use a pattern like this:
someFunc()
{
    if (i < arr.length())
    {
        ... do something ...
        loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onComplete);
    }
}
private function onComplete(e:Event):void
{
    ... do something ...
    i++;
}
all files are loaded properly.
I think this could be because the for loop runs very fast while the content takes time to load, which ultimately leads to some weird results.
Please let me know how this can be done correctly using a for loop as well.
You don't want to use a for loop to load several items. The way you almost appear to have it is the proper approach: load a file and use the completion of its loading to trigger loading the next file.
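The chained-completion pattern the reply describes can be sketched in a language-neutral way (Python here; `load_one` and `on_all_done` are invented names for this illustration): the completion callback is what starts the next load, so a fast loop can never outrun slow loading.

```python
def load_all(items, load_one, on_all_done):
    """Load items strictly one after another via completion callbacks."""
    results = []

    def load_next(index):
        if index >= len(items):
            on_all_done(results)  # every load has completed
            return

        def on_complete(data):
            # Mirrors an Event.COMPLETE handler: only now start the next item.
            results.append(data)
            load_next(index + 1)

        load_one(items[index], on_complete)

    load_next(0)

# Usage with a stand-in loader that "completes" immediately:
def fake_loader(item, done):
    done(item.upper())

load_all(["a.swf", "b.jpg"], fake_loader, print)  # prints ['A.SWF', 'B.JPG']
```

In ActionScript the same shape falls out of registering the next load inside the Event.COMPLETE handler, exactly as in the working snippet above.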
-
Hello All,
I am using DataSource 2LIS_11_VAHDR in BI 7.0 to load data to an ODS.
I am using a full upload.
When I schedule the InfoPackage, I don't get any records. The request status is shown as running.
So I checked the job in the source system, but I saw that the job status is Released and not Active.
There were some entries already present in the extraction queue, so I tried scheduling a job to move the entries from the extraction queue, but this job too is shown as Released and not Active.
Please let me know what the problem could be.
Hi,
Check in RSA3; if you are getting records there, then you should be able to get them to the PSA and then to the ODS.
If not, replicate the DataSource, activate the transfer structure, and load again. This will help.
Regards,
Amol -
Using FDM to load data from oracle table (Integration Import Script)
Hi,
I am using an Integration Import Script to load data from an Oracle table into the work tables in FDM.
I am getting the following error while running the script:
Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done
Attaching the full error report:
ERROR:
Code............................................. -2147217887
Description...................................... Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.
At line: 22
Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 6260
IDENTIFICATION:
User............................................. ******
Computer Name.................................... *******
App Name......................................... FDMAPP
Client App....................................... WebClient
CONNECTION:
Provider......................................... ORAOLEDB.ORACLE
Data Server......................................
Database Name.................................... DBNAME
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... SCRTEST
Location ID...................................... 750
Location Seg..................................... 4
Category......................................... FDM ACTUAL
Category ID...................................... 13
Period........................................... Jun - 2011
Period ID........................................ 6/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
I am using the following script
Function ImpScrTest(strLoc, lngCatKey, dblPerKey, strWorkTableName)
'Oracle Hyperion FDM Integration Import Script:
'Created By: Dhananjay
'Date Created: 1/17/2012 10:29:53 AM
'Purpose:A test script to import data from Oracle EBS tables
Dim cnSS 'ADODB.Connection
Dim strSQL 'SQL string
Dim rs 'Recordset
Dim rsAppend 'tTB table append rs object
'Initialize objects
Set cnSS = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.Recordset")
Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
'Connect to SQL Server database
cnss.open "Provider=OraOLEDB.Oracle.1;Data Source= +server+;Initial Catalog= +catalog+;User ID= +uid+;Password= +pass+"
'Create query string
strSQL = "Select AMOUNT,DESCRIPTION,ACCOUNT,ENTITY FROM +catalog+.TEST_TMP"
'Get data
rs.Open strSQL, cnSS
'Check for data
If rs.bof And rs.eof Then
RES.PlngActionType = 2
RES.PstrActionValue = "No Records to load!"
Exit Function
End If
'Loop through records and append to tTB table in location’s DB
If Not rs.bof And Not rs.eof Then
Do While Not rs.eof
rsAppend.AddNew
rsAppend.Fields("PartitionKey") = RES.PlngLocKey
rsAppend.Fields("CatKey") = RES.PlngCatKey
rsAppend.Fields("PeriodKey") = RES.PdtePerKey
rsAppend.Fields("DataView") = "YTD"
rsAppend.Fields("CalcAcctType") = 9
rsAppend.Fields("Amount") = rs.fields("Amount").Value
rsAppend.Fields("Desc1") = rs.fields("Description").Value
rsAppend.Fields("Account") = rs.fields("Account").Value
rsAppend.Fields("Entity") = rs.fields("Entity").Value
rsAppend.Update
rs.movenext
Loop
End If
'Records loaded
RES.PlngActionType = 6
RES.PstrActionValue = "Import successful!"
'Assign Return value
SQLIntegration = True
End Function
Please help me on this
Thanks,
Dhananjay
Edited by: DBS on Feb 9, 2012 10:21 PM
Hi,
I found the problem. It was the connection string. The format is different for Oracle tables.
PFB the format
*cnss.open"Provider=OraOLEDB.Oracle.1;Data Source= servername:port/SID;Database= DB;User Id=aaaa;Password=aaaa;"*
And thanks *SH* for quick response.
So closing the thread......
Thanks,
Dhananjay -
Problem in Loading data for clob column using sql ldr
Hi,
I am having a problem loading data for tables that have CLOB columns.
Could anyone help me correct the control file script below in order to load data in the format shown?
Any help really appreciated.
Table Script
Create table samp
(
no number,
col1 clob,
col2 clob
)
Ctrl File
options (skip=1)
load data
infile 'c:\1.csv'
Replace into table samp
fields terminated by ","
trailing nullcols
(no,
col1 Char(100000000),
col2 Char(100000000) enclosed by '"' and '"')
Data File(1.csv)
1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
Error Encountered
ORA-01461: can bind a LONG value only for insert into a LONG column
Table samp
Thanks in advance
I can't reproduce it on my 10.2.0.4.0. CTL file:
load data
INFILE *
Replace into table samp
fields terminated by ","
trailing nullcols
(no,
col1 Char(100000000),
col2 Char(100000000) enclosed by '"' and '"')
BEGINDATA
1,asdf,"assasadsdsdsd""sfasdfadf""sdsdsa,ssfsf"
2,sfjass,"dksadk,kd,ss""dfdfjkdjfdk""sasfjaslaljs"
Loading:
SQL> Create table samp
2 (
3 no number,
4 col1 clob,
5 col2 clob
6 );
Table created.
SQL> host sqlldr scott/tiger control=c:\temp\samp.ctl log=c:\temp\samp.log
SQL> select * from samp
2 /
NO
COL1
COL2
1
asdf
assasadsdsdsd"sfasdfadf"sdsdsa,ssfsf
2
sfjass
dksadk,kd,ss"dfdfjkdjfdk"sasfjaslaljs
NO
COL1
COL2
SQL>
SY. -
Data upload problem in delta update from 1st ODS to 2nd ODS
Dear Friends,
I am loading data from one ODS to another. The update mode was full upload. Some time back, an error occurred in activation of the first ODS. The error was: "Full updates already available in ODS, cannot update init./delta." So currently daily records are pulled but not added, i.e. transferred records = 4000 but added records = 0.
When I looked for a solution on SDN, I found that running program RSSM_SET_REPAIR_FULL_FLAG for the 2nd ODS will reset all full uploads to Repair Full Requests, which I have already done for the 2nd ODS. Then initialize once and pull delta.
But the problem is that I cannot set the update mode to delta, as I am pulling some 80,000 records into the 2nd ODS from the 1st ODS, which has around 14 lakh (1.4 million) records, daily, based on some data-selection filters in the InfoPackage. But I do not see any parameters for data selection in delta mode.
Please suggest.
Regards,
Amit Srivastava
Dear Sirs,
Due to this activation error in the 2nd ODS, the daily data upload is failing in the 1st ODS.
To correct this, I converted all full upload requests in the 2nd ODS to Repair Full Requests.
But now, when I scheduled the InfoPackage today with full upload, data was again transferred but not added.
I know I cannot have init./delta, so what can possibly be done in this scenario? Please help.
Regards,
Amit Srivastava -
Problem with loading data to Essbase
Hi All,
I have a problem loading data into Essbase. I've prepared a MaxL script to load the data, calling a rule file. The source table is located in an Oracle RDBMS. The script works correctly, i.e. it generally loads data into Essbase.
But the problem is that after deleting data from Essbase, when I try to load it again from the source table I get the message: WARNING - 1003035 - No data values modified by load of this data file - although there is no data in Essbase... I've also tried changing the load mode from 'overwrite' to 'add to existing values' (in the rule file), but it doesn't help... Any ideas what I can do?
Below are a few lines from EPM_ORACLE_INSTANCE/diagnostics/logs/essbase/dataload_ODL.err:
[2013-09-24T12:01:40.480-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received Validate Login Session request
[2013-09-24T12:01:40.482-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received client request: Get App and Database Status (from user [admin@Native Directory])
[2013-09-24T12:01:54.488-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: MaxL: Execute (from user [admin@Native Directory])
[2013-09-24T12:01:54.492-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Received client request: MaxL: Describe (from user [admin@Native Directory])
[2013-09-24T12:01:54.492-10:01] [ESSBASE0] [MLEXEC-2] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1115777344] Output columns described
[2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Define (from user [admin@Native Directory])
[2013-09-24T12:01:54.494-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Received client request: MaxL: Fetch (from user [admin@Native Directory])
[2013-09-24T12:01:54.494-10:01] [ESSBASE0] [MLEXEC-3] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1102616896] Record(s) fetched
[2013-09-24T12:01:54.496-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1116830016] Received client request: MaxL: Fetch (from user [admin@Native Directory])
[2013-09-24T12:01:54.498-10:01] [ESSBASE0] [AGENT-1160] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1114724672] Received Validate Login Session request
[2013-09-24T12:01:54.499-10:01] [ESSBASE0] [AGENT-1001] [NOTIFICATION] [16][] [ecid:1379989900303,0] [tid:1101564224] Received client request: Get Application State (from user [admin@Native Directory]) -
Problems Generating Data Move Scripts in v1.5.1
Hi, I'm having problems when trying to generate data move scripts in SQL Developer 1.5.1 to carry out an off-line data load. I'm carrying out a migration from Sybase to Oracle, and the database I'm working on has over 400 tables in it. I have successfully captured and migrated all the tables into the respective models and have generated and created the DDL for the converted model. However, when I request the data move scripts to be generated, I only get ctl files created for the first 49 tables. Also, there is no oracle_ctl.sh script created, and no post_load.sql script is produced, only a pre_load.sql script.
I've got 3 databases to migrate, and on the 2nd database I only get the data move scripts created for the first 86 tables, out of over 250 tables.
It appears to have worked better for the 3rd database, which is much, much smaller than the first two, having only 59 tables in it. This time all the files were produced as expected. However, it's really the first two, larger databases that are my priority to get migrated.
I've tried changing the preferences within Migration/Generation Options from 'One Single File' to 'A File per Object', but it makes no difference. I would prefer everything in one file but can work around that.
Ideally, I'd like to generate all the ctl files for a database in one go so that I can group-edit them, and I would prefer the tool to create the oracle_ctl.sh script that calls all the ctl scripts for me rather than having to hand-build it. I'm puzzled as to why the tool only creates ctl files for some of the tables contained in a converted model. It looks like it is not completing the job in these cases, as it also doesn't create all the scripts it is supposed to create. It doesn't give any error messages, and at completion the screen looks no different from when it works successfully, as in the case of the very small database.
Anybody had this problem or can suggest how to fix it ?
Thanks all.
Send me your phone number to [email protected]
We'll help sort this out.
Barry -
PROBLEM LOADING DATA FROM A TEXT FILE.
Hi,
I'm having a problem loading my csv file into the database. I'm using Oracle Database 10g for Linux. I'm on p. 228 of the book. This is what my csv file looks like:
db_name db_version host_id
db10 9.2.0.7 1
db11 10.2.0.1 1
db12 10.2.0.1 1
db13 9.2.0.7 1
db14 10.2.0.1 1
db15 9.2.0.7 1
I loaded this data into an existing table called DATABASES, loaded from tab-delimited data. The file character set is Unicode UTF-8. Then I browsed to the name of the csv file to be uploaded. It looked like this:
File Name F23757437/db2.csv Reupload File
Separator
Optionally Enclosed By
First row contains column names.
File Character Set
I clicked Next, and this is what it looked like:
Schema: HANDSONXE06
Table Name: DATABASES
Define Column Mapping
Column Names %
Format
Upload yes
Row 1 "db10" "9.2.0.7" 1
Row 2 "db11" "10.2.0.1" 1
Row 3 "db12" "10.2.0.1" 1
Row 4 "db13" "9.2.0.7" 1
Row 5 "db14" "10.2.0.1" 1
Row 6 "db15" "9.2.0.7" 1
I clicked Load, and this was the result:
* There are NOT NULL columns in HANDSONXE06.DATABASES. Select to upload the data without an error.
Schema
Down
Table Name
Down
File Details
Down
Column Mapping
Load Data
Schema: HANDSONXE06
Table Name: DATABASES
Define Column Mapping
Column Names COLUMN_NAMES
Format FORMAT
Upload UPLOAD
Row 1 "db10" "9.2.0.7" 1
Row 2 "db11" "10.2.0.1" 1
Row 3 "db12" "10.2.0.1" 1
Row 4 "db13" "9.2.0.7" 1
Row 5 "db14" "10.2.0.1" 1
Row 6 "db15" "9.2.0.7" 1
I was really wondering what was wrong. An error message said there are NOT NULL columns in HANDSONXE06.DATABASES. I didn't know how to fix it. What do I need to change to load the data without an error? It really confused me a lot; how come I have an error? Please help me; I need an answer to my problem. I cannot go forward because of this.
Thanks,
JOCELYN
I'm not certain of the utility you are using to load the data; however, I completed the following test using SQL*Loader to insert the data into my table. Your process should work similarly if the trigger and sequence are created for the table you are loading.
SQL> create table load_tbl
2 (db_id number(3) not null,
3 db_name varchar2(100) not null,
4 db_version varchar2(25),
5 host_id number(3) not null)
6 /
Table created.
SQL> desc load_tbl
Name Null? Type
DB_ID NOT NULL NUMBER(3)
DB_NAME NOT NULL VARCHAR2(100)
DB_VERSION VARCHAR2(25)
HOST_ID NOT NULL NUMBER(3)
SQL> create sequence db_id_seq;
Sequence created.
SQL> create or replace trigger db_id_trig
2 before insert on load_tbl
3 for each row
4 when (new.db_id is null)
5 begin
6 select db_id_seq.nextval into :new.db_id from dual;
7 end;
8 /
Trigger created.
The contents of the data file, control file and log file are below for the load into load_tbl.
C:\>sqlldr userid=username/password@db control=db_id_load.ctl log=db_id_load.log
SQL*Loader: Release 9.2.0.6.0 - Production on Thu Jan 18 17:21:47 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 6
C:\>
SQL> select * from load_tbl
2 /
DB_ID DB_NAME DB_VERSION HOST_ID
1 db10 9.2.0.7 1
2 db11 10.2.0.1 1
3 db12 10.2.0.1 1
4 db13 9.2.0.7 1
5 db14 10.2.0.1 1
6 db15 9.2.0.7 1
6 rows selected.
SQL>
Data File:
"db10" "9.2.0.7" 1
"db11" "10.2.0.1" 1
"db12" "10.2.0.1" 1
"db13" "9.2.0.7" 1
"db14" "10.2.0.1" 1
"db15" "9.2.0.7" 1
Control File:
LOAD DATA
INFILE "C:\db_id_load.dat"
APPEND INTO TABLE load_tbl
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(db_name CHAR,
db_version CHAR,
host_id "TO_NUMBER(:host_id,'99999999999')")
Log File:
SQL*Loader: Release 9.2.0.6.0 - Production on Thu Jan 18 17:21:47 2007
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: db_id_load.ctl
Data File: C:\db_id_load.dat
Bad File: db_id_load.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table LOAD_TBL, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
DB_NAME FIRST * WHT O(") CHARACTER
DB_VERSION NEXT * WHT O(") CHARACTER
HOST_ID NEXT * WHT O(") CHARACTER
SQL string for column : "TO_NUMBER(:host_id,'99999999999')"
Table LOAD_TBL:
6 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 49536 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 6
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Thu Jan 18 17:21:47 2007
Run ended on Thu Jan 18 17:21:47 2007
Elapsed time was: 00:00:00.39
CPU time was: 00:00:00.13 -
Shell Script Programming -- Loading data into table
Hello Gurus
I am using Oracle's SQL*Loader utility to load data into a table. Lately, I ran into an unusual scenario wherein I need to process the data file before loading it into the table, and that is where I need help from you guys.
Consider the following data lines:
"Emp", DOB, Gender, Subject
"1",01/01/1980,"M","Physics:01/05/2010"
"2",01/01/1981,"M","Chemistry:02/05/2010|Maths:02/06/2011"
"3",01/01/1982,"M","Maths:03/05/2010|Physics:06/07/2010|Chemistry:08/09/2011"
"4",01/01/1983,"M","Biology:09/09/2010|English:10/10/2010"
Employee 1 will get loaded as a single record in the table. But I need to put the Subject value into two separate fields in the table, i.e. Physics in one column and the date 01/05/2010 in a separate column.
Here is where the big problem starts.
Employee 2 should get loaded as 2 records into the table. The first record should have Chemistry as the subject with the date 02/05/2010, and the next record should have all other fields the same except that the subject should be Maths with the date 02/06/2011. The subjects are separated by a pipe "|" in the data file.
Similarly, Employee 3 should get loaded as 3 records: one as Maths, the second as Physics, and the third as Chemistry, along with their respective dates.
I hope I have made my problem clear to everyone.
I am looking to do something in shell scripting such that, before finally running the SQL*Loader script, the above 4 employees have their records repeated as many times as their subjects change.
In summary 2 problems are described above.
1. To load subject and date into 2 separate fields in Oracle table at the time of load.
2. If multiple subjects exist, then the record is to be loaded as many times as the employee's subjects change.
Any help would be much appreciated.
Thanks.
Here are some comments. Perl can be a little cryptic, but once you get used to it, it can be pretty powerful.
#!/usr/bin/perl -w
my $line_count = 0;
open FILE, "test_file" or die $!;
# Read each line from the file.
while (my $line = <FILE>) {
    # Print the header if it is the first line.
    if ($line_count == 0) {
        chomp($line);
        print $line . ", Date\n";
        ++$line_count;
        next;
    }
    # Get all the columns (as separated by ',') into an array.
    my @columns = split(',', $line);
    # Remove the newline from the fourth column.
    chomp($columns[3]);
    # Read the fields (separated by pipe) from the fourth column into an array.
    my @subject_and_date = split('\|', $columns[3]);
    # Loop for each subject and date.
    foreach my $sub_and_date (@subject_and_date) {
        # Print value of Emp, DOB, and Gender first.
        print $columns[0] . "," . $columns[1] . "," . $columns[2] . ",";
        # Remove all double quotes from the subject and date string.
        $sub_and_date =~ s/"//g;
        # Replace ':' with '","'
        $sub_and_date =~ s/:/","/;
        print '"' . $sub_and_date . '"' . "\n";
    }
    ++$line_count;
}
close FILE;
Problem in Loading Data from SQL Server 2000 to Oracle 10g
Hi All,
I am a university student using ODI for my final project on real-time data warehousing. I am trying to load data from MS SQL Server 2000 into an Oracle 10g target table. Everything goes fine until I execute the interface for the initial load. When I choose the CKM Oracle (Create unique index on the I$ table) KM, the following step fails:
21 - Integration - Prj_Dim_RegionInterface - Create Unique Index on flow table
Where Prj_Dim_Region is the name of my target table in Oracle.
The error message is:
955 : 42000 : java.sql.SQLException: ORA-00955: name is already used by an existing object
java.sql.SQLException: ORA-00955: name is already used by an existing object
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
I am using a surrogate key column in my target table along with the natural key. The natural key is populated by the primary key of my source table, but for the surrogate key, I have created a sequence in the Oracle schema where the target table exists, and have used the following code for the mapping:
<%=snpRef.getObjectName( "L" , "SQ_PRJ_DIM_REGION" , "D" )%>.nextval
I have chosen to execute this code on target.
Among my attempts to solve this problem was to set the Create Index option of the CKM Oracle (Create Index for the I$ Table) to No so that it won't create any index on the flow table. I also tried using the simple CKM Oracle KM. Both solutions allowed the interface to execute successfully without any errors, but the data was not loaded into the target table.
When I right-click on the Prj_Dim_Region datastore and choose Data, it shows empty. Pressing the SQL button in this datastore shows a "New Query" dialog box where I see this query:
select * from NOVELTYFURNITUREDW.PRJ_DIM_REGION
But when I press OK to run it, I get this error message:
java.lang.IllegalArgumentException: Row index out of range
at javax.swing.JTable.boundRow(Unknown Source)
at javax.swing.JTable.setRowSelectionInterval(Unknown Source)
at com.borland.dbswing.JdbTable.accessChange(JdbTable.java:2959)
at com.borland.dx.dataset.AccessEvent.dispatch(Unknown Source)
at com.borland.jb.util.EventMulticaster.dispatch(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.open(Unknown Source)
at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.executeQuery(Unknown Source)
at com.sunopsis.graphical.frame.a.cg.actionPerformed(cg.java)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$ForwardActionEvents.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
at java.awt.Component.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
I do not understand what the problem is and have wasted days trying to figure it out. Any help will be highly appreciated, as the deadline for this project is very close.
Thank you so much in advance.
Neel

Hi Cezar,
Can you please help me with this scenario?
I have one Oracle data model with 19 source tables and one SQL Server data model with 10 target tables. I have created 10 interfaces, each using JournalizedDataOnly on one of the tables in the interface; e.g. in the interface for the DimCustomer target table I have two source tables, Customer and Address, but the journalizing filter appears only on the Customer table, and this option is disabled automatically for Address.
Now I want to create a package using OdiWaitForLogData event detection. Is it possible to put all 10 of these interfaces in just one package to populate the target tables? It works fine when I have only one interface and use the name of one table in that interface for the Table Name parameter of OdiWaitForLogData, but when I try a comma-separated list of table names [Customer, Address], this error happens:
java.sql.SQLException: ORA-00942: table or view does not exist
If I instead use the method <%=odiRef.getObjectName("L","model_code","logical_schema","D")%>, I get this error:
"-CDC_SET_NAME=Exception getObjectName("L", "model_code", "logical_schema", "D") : SnpLSchema.getLSchemaByName : SnpLschema does not exist"
Please let me know how to make this work.
Do I need to create separate data models, each including only the tables that appear in its corresponding interface and package? Or do I need to create multiple packages, each with only one journalized interface, to populate a single target table?
Thank you for your time in advance.
Regards,
Neel