Deploy SQL*Loader mapping
Hi,
I have a problem with OWB when I try to start a mapping (SQL*Loader) after deploying it. When I start the execution of the mapping, the control file for SQL*Loader is created not on Unix (where I have defined the location of both the input file and the control file) but on the PC with my OWB client. For this reason the execution fails, because it doesn't find the input file (which I have put on the Unix machine).
Is there a configuration for the client or for the server?
Thanks
I start the Control Center from my PC, but with the credentials of the Unix server (host, port, username, password and service name).
Is that correct, or must I start the Control Center from the Unix machine?
Thanks for the reply.
Similar Messages
-
Execute SQL*Loader mapping
Hi all,
I'm trying to execute a deployed OWB SQL*Loader mapping, using the oem_exec_template.sql script. I've got the following error:
Stage 1: Decoding Parameters
| location_name=ORA_LOC_DWH
| task_type=SQLLoader
| task_name=MAP_SA_AGGK_FEVO
Stage 2: Opening Task
declare
ERROR at line 1:
ORA-20001: Task not found - Please check the Task Type, Name and Location are
correct.
ORA-06512: at line 268
I can execute the mapping from the OWB client, and I also have no problems executing a PL/SQL mapping via that script.
Did anybody use this script for a SQL*Loader mapping before?
Regards, Uwe

Hi Jean-Pierre,
the names of the location and the mapping should be OK. Only the mapping's STEP_TYPE seems to be different (UPPERCASE) from the one used inside your script.
OMB+> OMBRETRIEVE ORACLE_MODULE 'ORA_DWH_SA' GET REF LOCATION
ORA_LOC_DWH
OMB+> OMBCC 'ORA_DWH_SA'
Context changed.
OMB+> OMBLIST MAPPINGS
MAP_SA_AGGK_FEVO MAP_SA_AGGK_KK_KONTO MAP_SA_AGGK_KK_KUNDE MAP_SA_BCV_YT
OMB+> OMBRETRIEVE MAPPING 'MAP_SA_AGGK_FEVO' GET PROPERTIES (STEP_TYPE)
SQLLOADER
The mapping is deployed; otherwise I couldn't execute it from the OWB client.
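For reference, a sketch of the invocation (the parameter order and the "," placeholders for the system and custom parameters are my assumption from the OWB documentation, so verify against the header comments of oem_exec_template.sql itself; the user, password and runtime repository owner are placeholders). Note the task type spelled exactly like the STEP_TYPE shown above, i.e. SQLLOADER in uppercase:

```
sqlplus rt_user/rt_password@service @oem_exec_template.sql OWB_RT_OWNER ORA_LOC_DWH SQLLOADER MAP_SA_AGGK_FEVO "," ","
```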
Regards Uwe -
How do I incorporate the 'WHEN' operator in a SQL*Loader mapping?
Environment:
OWB 10g
Repository: 9.2.0.4
Target: 9.2.0.4
O/S: AIX 5.2
I have the need to incorporate the WHEN clause in a flat file mapping to eliminate some unneeded rows in the flat file.
I can't seem to find the configuration option or property setting or whatever it takes to get that done.
I thought it would be part of the FILTR operator but I kept getting an 'Invalid expression' error message.
In this case I want to ignore any rows that have a 'M', 'D' or 'S' in the first column.
Many thanks for all your help.
Gary

Jean-Pierre,
Thanks very much for your quick response.
One last point, and we should drop this in favor of other more urgent issues we both have to deal with.
The data is coming from a comma delimited (CSV) file using a comma ',' and optional quotes (") as field separators.
The first column of 'good' data is read as an INTEGER EXTERNAL because of course it is a number. However, the 'bad' rows I want to eliminate have character text in them where I would normally find numbers and they all start with 'D','M' or 'S'.
I don't have an actual column in the data file or resulting table definition that represents the first character that I'm trying to test on. Hence my use of (1) to represent the 1st character of data on the line, regardless of whether it's numeric or character.
As I stated, the syntax works fine in SQL*Loader when I type the WHEN clause in manually.
I guess if there were a way to define a pseudo column using the POSITION notation, with everything else using the variable-length delimited notation, I could test on that pseudo column. I don't want the pseudo column to appear in my resulting table, so that seems to be an issue. Enough.
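For the record, a control-file sketch of the WHEN test on the first character (table and column names here are hypothetical; SQL*Loader's WHEN clause only supports =, <> and AND):

```
LOAD DATA
INFILE 'input.csv'
INTO TABLE target_table
WHEN (1) <> 'M' AND (1) <> 'D' AND (1) <> 'S'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(col1 INTEGER EXTERNAL,
 col2,
 col3)
```

Rows failing the WHEN test go to the discard file, not the bad file.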
Since I've worked around it using external tables for this issue I'm not going to spend any more time on it today.
As usual, many thanks for your help.
Have a great day! I'll be back soon with another issue :-)
Gary -
SQL*Loader: map multiple files to multiple tables
Can a single control file map multiple files to multiple different tables? If so, what does the syntax look like? I've tried variations of the following, but haven't hit the jackpot yet.
Also, I understand that a direct load will automatically turn off most constraint checking. I'd like to turn this back on when I'm done loading all tables. How/when do I do that? I can find multiple references to 'REENABLE DISABLED CONSTRAINTS', but I don't know where to say that.
TIA.
LOAD DATA
INFILE 'first.csv'
TRUNCATE
INTO TABLE first_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(a,b,c)
INFILE 'second.csv'
TRUNCATE
INTO TABLE second_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(x,y,z,xx,yy,zz)
etc.

Here is what you want:
http://www.psoug.org/reference/sqlloader.html
LOAD DATA
INFILE 'c:\temp\demo09a.dat'
INFILE 'c:\temp\demo09b.dat'
APPEND
INTO TABLE denver_prj
WHEN projno = '101' (
projno position(1:3) CHAR,
empno position(4:8) INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
INTO TABLE orlando_prj
WHEN projno = '202' (
projno position(1:3) CHAR,
empno position(4:8) INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
INTO TABLE misc_prj
WHEN projno != '101' AND projno != '202' (
projno position(1:3) CHAR,
empno position(4:8) INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
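On the constraint question: with direct path loads, the REENABLE clause belongs inside each INTO TABLE clause of the control file. A sketch reusing the first table from the original post (the EXCEPTIONS table is optional, and the exact placement should be checked against the SQL*Loader control-file reference):

```
INTO TABLE first_table
REENABLE DISABLED_CONSTRAINTS EXCEPTIONS exceptions_tab
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(a,b,c)
```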
Thanks
Aravindh -
Default extension of source file in SQL*Loader map
Hi
I created a map having a flat file as source. The file name is X. When I ran the map, there was an error message saying 'Cannot find file X.dat'.
Is the '.dat' a default extension when no extension is defined?
Thank you
Sozy

> Is the '.dat' a default extension when no extension is defined?
Yes. -
SQL*Loader completed with ORA error
Hi all,
I'm trying to load a flat file via a SQL*Loader mapping. Even though it produces rejected rows, the mapping completes successfully!? How could I configure a SQL*Loader mapping to return an error after a data error occurs?
Regards, Uwe

Uwe,
I just performed some tests, and the following is the case:
- The maximum number of errors specifies how many errors you allow. If that number is reached, then no more records will be loaded.
- The deployment manager always shows finished successfully (this is not good, so I filed bug 3569480 for that).
- The RAB shows the correct status.
Does that confirm your experiences?
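As a side note, if the goal is to make SQL*Loader itself stop and return a non-zero exit code on the first data error, the error limit can be set to zero; in OWB this corresponds to the maximum-number-of-errors configuration mentioned above (user and file names below are placeholders):

```
sqlldr userid=scott/tiger control=map.ctl data=input.dat errors=0
```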
Thanks,
Mark. -
OWB 10gR2 : How to configure ctl and log locations for Sql*Loader mappings?
Hi all,
I'm using OWB 10gR2 to load data in tables with Sql*Loader mappings.
In my project I have a datafile module and an Oracle module.
When creating a SQL*Loader mapping in the Oracle module, there are two properties of the mapping that I want to modify. The first is Control File Location and the second is Log File Location. The values of those properties are equal to the data file module location. When trying to change those values, I can only choose "Use module configuration location".
Does anybody know how to configure those properties with locations other than the one of the flat file module?
What I want to do is to store the data file in one directory, and control file and log file in other directories.
Thank you for your help.
Bernard

Hi,
You're right, my problem is that the dropdown only shows the location associated with the flat file location, even though I have other file locations created in the design repository.
The good news is that I have found the solution to the problem:
1) Edit the file module and, in the "Data locations" tab, add the locations you want to use for the control file and log file.
2) Open the configuration window of the mapping; the dropdowns for the properties Control File Location and Log File Location then show the new locations.
I have tested my mapping after changing those properties and it's working.
Bernard -
OWB 9.0.4 :SQL*Loader: Operator POSTMAPPING does not support
Hi,
While trying to populate an Analytical Workspace using WB_LOAD_OLAP_CUBE, I got the following validation error:
The analysis of the mapping is not successful under all supported languages and operating modes. Detail is as follows:
SQL*Loader: Operator POSTMAPPING does not support SQL*Loader generation.
ABAP: Operator AWPARAMS does not support ABAP generation.
I don't know what that means. Your help will be appreciated. Do I need to apply some post-9.2.0.3 patch? If yes, please let me know the patch number, if available.
FYI: I am using Oracle9i with 9.2.0.3 patch
Thanks
Panneer

Panneer,
Does the regular process load from a flat file into a table? This would be implemented as a SQL*Loader mapping... in which case a PL/SQL call cannot be implemented.
What you could do:
- Use an external table to read from the flat file.
- Use the transformation in a process flow. I.e. you first execute the SQL loader mapping and then execute the transformation.
Mark. -
How do I map one field to another in the control file via SQL*Loader
Can someone please reply back to this question
Hi,
I have a flat file (student.dat, delimiter %~|) that I load using a control file (student.ctl) through SQL*Loader. Here are the details.
student.dat
student_id, student_firstname, gender, student_lastName, student_newId
101%~|abc%~|F %~|xyz%~|110%~|
Corresponding table
Student (
Student_ID,
Student_FN,
Gender,
Student_LN
)
Question:
How do I map the student_newId field to the student_id field in the STUDENT DB table, so that the new id is inserted in the student_id column? How do I specify the mapping in the control file? I don't want to create a new column in the student table. Please let me know the best way to do this.
Can someone please reply back to this question.
My approach:
In the control file I will specify the below. Is this the best approach? Do we have any other way?
STUDENT_ID "(:STUDENT_NEWID)",
STUDENT_FN,
GENDER,
STUDENT_LNAME,
STUDENT_NEWID BOUNDFILLER
Thanks
Sunil
Edited by: 993112 on Mar 13, 2013 12:28 AM
Edited by: 993112 on Mar 13, 2013 12:30 AM
Edited by: 993112 on Mar 13, 2013 12:31 AM
Edited by: 993112 on Mar 18, 2013 2:52 AM

OK, ok...
Here is the sample data:
101%~|abc%~|F %~|xyz%~|110%~|
102%~|def%~|M %~|pqr%~|120%~|
103%~|ghi%~|M %~|stu%~|130%~|
104%~|jkl%~|F %~|vwx%~|140%~|
105%~|mno%~|F %~|yza%~|150%~|

Here is the control file:
LOAD DATA
INFILE student.dat
TRUNCATE INTO TABLE STUDENT
FIELDS TERMINATED BY '%~|' TRAILING NULLCOLS
(
student_old FILLER
, student_fn
, gender
, student_ln
, student_id
)

And here is the execution:
SQL> CREATE TABLE student
2 (
3 student_id NUMBER
4 , student_fn VARCHAR2 (10)
5 , gender VARCHAR2 (2)
6 , student_ln VARCHAR2 (10)
7 );
Table created.
SQL>
SQL> !sqlldr / control=student.ctl
SQL*Loader: Release 11.2.0.3.0 - Production on Tue Mar 19 14:37:31 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 5
SQL> select * from student;
STUDENT_ID STUDENT_FN GENDER STUDENT_LN
110 abc F xyz
120 def M pqr
130 ghi M stu
140 jkl F vwx
150 mno F yza
SQL>
:p -
Change the mapping generation code from SQL*Loader to PL/SQL
I want to use a mapping with a flat file operator to generate PL/SQL code, even though such a mapping generates SQL*Loader code by default.
I tried to change the Language generation property of the mapping, but an API8548 error message is shown. The solution suggested by OWB is to change the language generation code in the property inspector of the mapping.
I can't use external table because I have to work with a remote machine.
What do I have to do to change the generated code from SQL*Loader to PL/SQL?

How about breaking this out into two mappings? In the first mapping, map a flat file operator to a table using SQL*Loader code. Then define a second mapping using the table as a source, which therefore generates PL/SQL. Then use a process flow to launch the second map after completion of the first.
-
SQL Loader error number 1 in OWB 9.0.4
I'm trying to execute a flat-file to database table mapping (using SQL*Loader), having set up my locations & connectors. When I execute the mapping, I get the following message:
Starting Execution COM_COM_MT_STG
Starting Task COM_COM_MT_STG
RPE-1013-SQL_LOADER_ERROR SQL Loader reported error condition, number 1.
Completing Task COM_COM_MT_STG
Completing Execution COM_COM_MT_STG
What does that mean and how do I fix it?
Also, is there any detailed documentation on building such a mapping, that can take you through all the steps, from creating the flat file module to deployment?
Thanks,
Carey

Carey,
Use the Runtime Audit Browser to access the error log. It will show you what went wrong. You may have to re-register your flat file location with a folder separator at the end... This is a bug.
Thanks,
Mark. -
We have an external procedure running fine on 8.1.7 on VMS. After compiling and linking successfully under 10.1.0, I get ORA-06521 PL/SQL: Error mapping function and ORA-06522: ERROR - vms_dlsym for file x, where x is the filename of the linked executable. Another external procedure that does not connect to the 10.1.0 database runs fine. What could be causing this error in Server 10.1.0 on VMS?
Thanks,
Dave

Here is the code to create the function:
CREATE OR REPLACE FUNCTION f1
(h_file_name IN VARCHAR2)
RETURN BINARY_INTEGER
IS EXTERNAL
LIBRARY l1
NAME "f1"
LANGUAGE C
WITH CONTEXT
PARAMETERS
(CONTEXT,
h_file_name string);
Here is the beginning of the Pro*C:
int f1(epctx, h_file_name)
OCIExtProcContext *epctx;
char h_file_name[70];
char h_line_txt [251];
int lineno;
FILE *fptr;
/* register the connection context ... */
EXEC SQL REGISTER CONNECT USING :epctx;
The function loads a flat file into the database. It is probably not related, but we are also unable to SQLPLUS / or SQLLDR / into the database from an OS-authenticated account (we get ORA-12547: TNS:lost contact). Thanks for taking the time to look at this. There aren't many people trying this on VMS, I'd bet. -
Different log file name in the Control file of SQL Loader
Dear all,
I get every day 3 log files with ftp from a Solaris Server to a Windows 2000 Server machine. In this Windows machine, we have an Oracle Database 9.2. These log files are in the following format: in<date>.log i.e. in20070429.log.
I would like to load this log file's data to an Oracle table every day and I would like to use SQL Loader for this job.
The problem is that the log file name is different every day.
How can I give this variable log file name in the Control file, which is used for the SQL Loader?
file.ctl
LOAD DATA
INFILE 'D:\gbal\in<date>.log'
APPEND INTO TABLE CHAT_SL
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(SL1 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL2 char,
SL3 DATE "Mon DD, YYYY HH:MI:SS FF3AM",
SL4 char,
SL5 char,
SL6 char,
SL7 char,
SL8 char,
SL9 char,
SL10 char,
SL11 char,
SL12 char,
SL13 char,
SL14 char,
SL15 char)
Do you have any better idea about this issue?
I thought of renaming the log file to an instant name, such as in.log, but how can I distinguish the desired log file, from the other two?
Thank you very much in advance.
Giorgos Baliotis

I don't have a direct solution for your problem.
However, if you invoke SQL*Loader from an Oracle stored procedure, it is possible to set the control/log file dynamically.
# Grant privileges to the user to execute command prompt statements
BEGIN
dbms_java.grant_permission('bc4186ol','java.io.FilePermission','C:\windows\system32\cmd.exe','execute');
END;
* Procedure to execute operating system commands using PL/SQL (Oracle script making use of Java packages)
CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "Host" AS
import java.io.*;
public class Host {
    public static void executeCommand(String command) {
        try {
            String[] finalCommand = new String[4];
            finalCommand[0] = "C:\\windows\\system32\\cmd.exe";
            finalCommand[1] = "/y";
            finalCommand[2] = "/c";
            finalCommand[3] = command;
            final Process pr = Runtime.getRuntime().exec(finalCommand);
            // Drain standard output in its own thread so the process cannot block.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_in = new BufferedReader(new InputStreamReader(pr.getInputStream()));
                        String buff = null;
                        while ((buff = br_in.readLine()) != null) {
                            System.out.println("Process out :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process output.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
            // Drain standard error the same way.
            new Thread(new Runnable() {
                public void run() {
                    try {
                        BufferedReader br_err = new BufferedReader(new InputStreamReader(pr.getErrorStream()));
                        String buff = null;
                        while ((buff = br_err.readLine()) != null) {
                            System.out.println("Process err :" + buff);
                            try { Thread.sleep(100); } catch (Exception e) {}
                        }
                    } catch (IOException ioe) {
                        System.out.println("Exception caught printing process error.");
                        ioe.printStackTrace();
                    }
                }
            }).start();
        } catch (Exception ex) {
            System.out.println(ex.getLocalizedMessage());
        }
    }

    public static boolean isWindows() {
        return System.getProperty("os.name").toLowerCase().indexOf("windows") != -1;
    }
}
* Oracle wrapper to call the above procedure
CREATE OR REPLACE PROCEDURE Host_Command (p_command IN VARCHAR2)
AS LANGUAGE JAVA
NAME 'Host.executeCommand (java.lang.String)';
* Now invoke the procedure with an operating system command (execute SQL*Loader)
* The execution of the script ensures the Prod mapping data file is loaded into the PROD_5005_710_MAP table
* Change the control/log/discard/bad files as appropriate
BEGIN
Host_Command (p_command => 'sqlldr system/tiburon@orcl control=C:\anupama\emp_join'||1||'.ctl log=C:\anupama\ond_lists.log');
END;

Does that help you?
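A lighter-weight alternative (a sketch, assuming a Windows batch file and that %date% ends in the four-digit year; the substring offsets are locale-dependent and must be adjusted): keep a fixed INFILE in the control file and override it per run with the data= command-line option:

```
@echo off
rem Build in<yyyymmdd>.log from today's date (offsets depend on regional settings).
set today=%date:~-4%%date:~3,2%%date:~0,2%
sqlldr userid=scott/tiger@orcl control=file.ctl data=D:\gbal\in%today%.log
```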
Regards,
Bhagat -
SQL*Loader does not recognise the \ in directory
Hi there,
We have version OWB 9.2.0.2.8 and I am trying to run SQL*Loader. I have tried to use an external table and also got "cannot find file" which I suspect could be the same problem. I then tried to load the data with SQL*Loader and saw that the directory specification of the source data file has "mysteriously" lost the \'s.
Below is a copy of the .log file with data file specified incorrectly. It should be u:\bi\data\ocean_shipment.csv. Any ideas?
Data File: u:biocean_shipment.csv
Bad File: /opt/oracle/product/OWB/9.2.0/owb/temp/u:biocean_shipment.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 200 rows, maximum of 50000 bytes
Continuation: none specified
Path used: Conventional
Table "DWHSTG"."S_SHIPMENT_TYPES", loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
"SHIPMENT_CD" 1 * , O(") CHARACTER
"SHIPMENT_NAME" NEXT * , O(") CHARACTER
"LOAD_DATE" SYSDATE
SQL*Loader-500: Unable to open file (u:biocean_shipment.csv)
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.

Jean-Pierre,
That was the first thing I checked - I even put it in (with \'s in the configuration parameters of the mapping). Also unregistered and re-registered the location, ensured I put the slashes in, but to no avail. But I do believe there might be an outstanding bug for locations with directory separators - I will check on Metalink and let you know. -
SQL*Loader job exits unexpectedly and causes table to be locked with NOWAIT
I have a weekly report job where I have to load about 48 logs with about 750k rows of data in each log. To facilitate this, we have been using a Java job that runs SQL*Loader as an external Process (using ProcessBuilder), one log after the other. Recently, however, this process has been terminating abnormally during the load, which causes a lock on the table and basically brings the process to a halt until we can open a ticket with the DB team to kill the hung session. Is there maybe a better way to handle this upload process than using SQL*Loader, or is there some change I could make in either the control file or the command line to stop it from dying a horrible death?
At the start of the process, I truncate the table that I'm loading to and then run this command line with the following control file:
COMMAND LINE:
C:\Oracle\ora92\BIN\SQLLDR.EXE userid=ID/PASS@DB_ID load=10000000 rows=100000 DIRECT=TRUE SKIP_INDEX_MAINTENANCE=TRUE control=ControlFile.ctl data=logfile.log
CONTROL FILE:
UNRECOVERABLE
Load DATA
INFILE *
Append
PRESERVE BLANKS
INTO TABLE MY_REPORT_TABLE
FIELDS TERMINATED BY ","
(
filler_field1 FILLER char(16),
filler_field2 FILLER char(16),
time TIMESTAMP 'MMDDYYYY-HH24MISSFF3' ENCLOSED BY '"',
partne ENCLOSED BY '"',
trans ENCLOSED BY '"',
vendor ENCLOSED BY '"' "SUBSTR(:vendor, 1, 1)",
filler_field4 FILLER ENCLOSED BY '"',
cache_hit_count,
cache_get_count,
wiz_trans_count,
wiz_req_size,
wiz_res_size,
wiz_trans_time,
dc_trans_time,
hostname ENCLOSED BY '"',
trans_list CHAR(2048) ENCLOSED BY '"' "SUBSTR(:trans_list, 1, 256)",
timeouts,
success ENCLOSED BY '"'
)
Once all of the logs have finished loading, I rebuild the indexes on the table and then start the report process. It seems like it's just dying on random logs now; re-running the process, it fails at a different point each time.
EDIT: The reasons for the UNRECOVERABLE and SKIP_INDEX_MAINTENANCE are to speed the load up. As it is, it still can take 7-12 minutes for each log to load, it's even worse without those on. Overall it's taking about 18 hours for this process to run from start to finish.
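One detail worth checking (an assumption from the options shown, not something the post's logs confirm): with DIRECT=TRUE and SKIP_INDEX_MAINTENANCE=TRUE, every load leaves the table's indexes in an UNUSABLE state, so any session touching them before the rebuild can hang or fail. A sketch of the post-load check and rebuild (the index name is hypothetical):

```
-- Which indexes were left unusable by the direct load?
SELECT index_name, status FROM user_indexes WHERE table_name = 'MY_REPORT_TABLE';

-- Rebuild each one before the report runs.
ALTER INDEX my_report_table_idx1 REBUILD NOLOGGING;
```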
Edited by: user6676140 on Jul 7, 2011 11:37 AM

Please note that my post stated: "I have opened a ticket with Oracle support. After 6 days I have not had the help that I need."
I also agree that applying the latest PSU is a Best Practice, which Oracle defines as "a cumulative collection of high impact, low risk, and proven fixes for a specific product or component".
With that statement I feel there should not be the drastic issues that we have seen. Our policy is to always apply PSUs, no matter what the product or component, without issue.
Except for now. We did our research, and only open an Oracle ticket when we need expert help. That has not been forthcoming from them, but we are still working the ticket.
Hence, I opened this forum because many times I have found help here, where others have faced the same issue and now have an insight. When having a serious problem I like to use all of my resources, this forum being one of those.
To restate the question:
(1) 97% of our databases reside on RAC. From the Search List for Databases, we do not see the columns Sessions:CPU, Sessions: I/O, Sessions: Other, Instance CPU%, and are told this is working as designed because you must monitor the instance, not the database, with RAC.
(a) After applying PSU2 the Oracle Load Map no longer showed any databases.
All of this in (1) is making the tool less useful for monitoring at the database level, which we do most of the time.
(2) Within a few days of applying PSU2, we couldn't log into EM and got the error "Authentication failed. If problem persists, contact your system administrator."
(b) searching through the emoms.trc files we found the errors listed above posting frantically.
After rolling back PSU we are back in business.
However, there is still the need to remain current with the components of EM.
I am looking for suggestions, insights, and experience. While I appreciate Akanksha answering so quickly, a recommendation to open an SR is not what I need.
Sherrie