How to load Arabic data from a flat file using SQL*Loader?
Hi All,
We need to load Arabic data from an XLS file into an Oracle database. Could you please provide a good note/steps to achieve this?
Below are the database parameters used
NLS_CHARACTERSET AR8ISO8859P6
NLS_LANGUAGE AMERICAN
DB version: 10g Release 2
OS: RHEL 5
Thanks in advance,
Satish
Try saving your XLS file in CSV format, and either set NLS_LANG to the right value or use the SQL*Loader control file parameter CHARACTERSET.
See http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_control_file.htm#i1005287
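For illustration, a minimal control-file sketch of the CHARACTERSET approach (the file, table, and column names here are assumptions, not from the original post):

```sql
-- load_arabic.ctl : hypothetical control file
LOAD DATA
CHARACTERSET AR8ISO8859P6              -- must match the encoding the CSV was actually saved in
INFILE 'arabic_data.csv'
INTO TABLE emp_ar
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(emp_id, emp_name_ar, dept_ar)
```

Note that CHARACTERSET describes the datafile, not the database: if Excel saved the CSV as UTF-8, the clause should name that encoding (e.g. AL32UTF8) and the database conversion happens automatically.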
Similar Messages
-
Not loading from flat file using SQL*Loader
Hi,
I am trying to load from an Excel file.
First I converted the Excel file into a CSV file and saved it as a .dat file.
In the Excel file one column is salary and the data looks like $100,000.
While converting from XLS to CSV, the salary is changed to "$100,000 " (with quotes and a space after the amount);
a space is put after the last digit.
In the SQL*Loader control file I had given:
salary "to_number('L999,999')"
My problem is the space after the salary in the .dat file ---> "$100,000 ".
What changes do I have to make to the TO_NUMBER function in the control file?
Please guide me.
Thanks & Regards
Salih KM
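For reference, one common way to handle both the enclosing quotes and the trailing space (the table name, field list, and format mask below are assumptions based on the sample data, not the accepted answer):

```sql
-- hypothetical control-file fragment
LOAD DATA
INFILE 'salaries.dat'
INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'   -- strips the surrounding quotes
(empno,
 salary "to_number(trim(:salary), '$999,999,999')"    -- trim removes the trailing space
)
```

The key points are that OPTIONALLY ENCLOSED BY '"' makes SQL*Loader remove the quotes before the SQL expression runs, and TRIM disposes of the stray space so the format mask can match.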
Message was edited by: kmsalih
Thanks a lot Jens Petersen
It's loading ..........
MI means minute, am I correct?
But I didn't get the logic behind that.
Can you please explain it?
Thanks & Regards
Salih KM -
Loading transaction data from flat file to SNP order series objects
Hi,
I am a BW developer and I need to provide data to my SNP team.
Can you please let me know more about <b>loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures</b>? There is a 3rd-party tool called WebConnect that gets data from external systems and can provide it as flat files or database tables, in whatever format we want.
I know we can use BAPIs, but I don't know how. Can you please send any <b>sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects</b>?
Please let me know how to get data from a flat file into SNP order-based objects, with the available options; I will be very grateful.
thanks in advance
Rahul
Hi,
Please go through the following links:
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
Hope this helps...
Regards,
Habeeb
-
Error in loading transaction dat from flat file
hi everybody
While I try to load data from a flat file, at the time of scheduling it shows the error "check load from infosource", and while monitoring the data it shows "the corresponding data packets were not updated using PSA". What can I do to rectify the problem?
Thanks in advance
Hi,
Please check the data whether it is in a correct format using the "Preview" option in the infopackage.
Check whether all the related AWB objects are active or not.
Regards,
K.Manikandan. -
Reading data from flat file Using TEXT_IO
Dear Gurus
I already posted this question, but this time I need some other changes... sorry for that.
I am using 10g Forms and using TEXT_IO for reading data from a flat file.
My data is like this :-
0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
9|2430962.89|000111111111|
1|61304.88|000014104113|
1|41961.73|000022096086|
1|38475.65|000023640081|
1|49749.34|000032133154|
1|35572.46|000033093377|
1|246671.01|000042148111|
Here each column is separated by |. I want to read all the columns and do some validation.
How can I do that?
Initially my requirement was to read only 2 or 3 columns, so I did it like this:
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle utl_file.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- For the path
-- v_path := SUBSTR (lfile_name, 1, INSTR (lfile_name, '\', -1));
v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
LOOP
UTL_FILE.get_line (v_handle, v_filebuffer);
IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
SELECT line_0 INTO line_0_date
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_0 INTO line_0_Purp
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 4;
SELECT line_0 INTO line_0_count
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 7;
SELECT line_0 INTO line_0_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 8;
SELECT line_0 INTO line_0_ccy
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 9;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
SELECT line_9 INTO line_9_Acc_no
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
SELECT line_9 INTO line_9_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 2;
ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
line_1_flag := line_1_flag+1;
SELECT line_1 INTO line_1_sum
FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
FROM DUAL
CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
WHERE rn = 3;
Line_1_tot := Line_1_tot + line_1_sum;
END IF;
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND THEN -- UTL_FILE.get_line raises this at end of file
DBMS_OUTPUT.put_line (Line_1_tot);
DBMS_OUTPUT.put_line (Line_1_flag);
UTL_FILE.fclose (v_handle);
END;
But now how can I do it? Shall I use a SELECT statement like this for all the columns?
Sorry for that...
As per our requirement, I need to read the flat file, and it looks like this:
0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106
9|2430962.89|000111111111|
1|61304.88|000014104113|
1|41961.73|000022096086|
1|38475.65|000023640081|
1|49749.34|000032133154|
1|35572.46|000033093377|
1|246671.01|000042148111|
1|120737.25|000053101979|
1|151898.79|000082139768|
1|84182.34|000082485593|
I have to check the file. The validations are:
1. The 1st line should start with 0; otherwise raise an error and insert that error into a table.
2. The 2nd line should start with 9; otherwise raise an error and insert it into the same table.
3. From the 3rd line onward, every line should start with 1; otherwise raise an error and insert it into the table.
After that I have to validate the 1st line's 2nd column: it should be BP-V1, else raise an error and insert it into the table. Then I check the 3rd column (20100928): it should be in YYYYMMDD format, else the same error handling. Like this I have different validations for all the columns.
Then, for the 2nd line's 3rd column, which is an account number: first I check that it is 12 characters, else error. Then I compare it with what the user entered in the form; for example, if the user entered 111111111, I compare it with 000111111111 from the 2nd line, so I have to prefix the user's account number with 000.
Then, for the lines starting with 1, I have to take the 2nd column of every such line and sum them. I compare that sum with the 6th column of the 1st line (the one starting with 0); it should match, else error.
In the same way I have to count all the lines starting with 1 and compare the count with the 7th column of the 1st line; it should match. In this file it should be 9.
MY CODE IS :-
Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
IS
v_handle TEXT_IO.file_type;
v_filebuffer varchar2(500);
line_0_date VARCHAR2 (10);
line_0_Purp VARCHAR2 (10);
line_0_count Number;
line_0_sum number(12,2);
line_0_ccy Varchar2(3);
line_9_sum Number(12,2);
line_9_Acc_no Varchar2(12);
Line_1_Sum Number(12,2);
Line_1_tot Number(15,2) := 0;
Line_1_flag Number := 0;
lval number;
lacno varchar2(16);
v_file varchar2(20);
v_path varchar2(50);
LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
LC$Token VARCHAR2(100) ;
i PLS_INTEGER := 2 ;
lfirst_char varchar2(1);
lvalue Varchar2(100) ;
Begin
v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- For the path
--v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
Message(lfile_name);
v_handle := TEXT_IO.fopen(lfile_name, 'r');
BEGIN
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
--Message('First Char '||lfirst_char);
IF lfirst_char = '0' Then
Loop
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
Message('VAL - '||LC$Token);
lvalue := LC$Token;
EXIT WHEN LC$Token IS NULL ;
i := i + 1 ;
End Loop;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if ;
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '9' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
Forms_DDL('Commit');
raise form_Trigger_failure;
End IF;
LOOP
TEXT_IO.get_line (v_handle, v_filebuffer);
lfirst_char := Substr(v_filebuffer,0,1);
LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
--Message('Row '||LC$Token);
IF lfirst_char = '1' Then
Null;
Else
Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
Forms_DDL('Commit');
raise form_Trigger_failure;
End if;
END LOOP;
--END IF;
END LOOP;
EXCEPTION
When No_Data_Found Then
TEXT_IO.fclose (v_handle);
END;
Exception
When Others Then
Message('Other error');
END;
I am calling the SPLIT function which you gave as mcb_simulator_pkg.Split. -
How to load the data from flat file ( ex excel ) to Planning area directly
Hi all ,
How can I load the data from a flat file directly into a planning area?
PLease help me in this.
Regards,
Chandu
"download one key figure data from planning book (interactive demand plan) and made some changes and need to upload the data back to same planning book"
But, may I know why you are thinking of downloading, changing and uploading for just changing the figures for a particular key figure. You can do it in the planning book itself.
However, not all the key-figures can be changed. But, what type of key-figure you are speaking here? Is it like 'Forecast' for which the value is based on other key-figures, or is like a key-figure where some manual adjustments are to be done--so that it can be manually edited? However, in both the cases, the data can be changed in the planning book only. In first case, you can change the values of dependant key-figures and in the second case, you can change the key-figures directly.
And please note that you can change the values of the key-figures only at the detailed level. So, after loading the data in the book, use drill-down option, maintain the data at the detailed level, change the figures, and automatically, this gets reflected at the higher level.
In case you are unable to change the values, go to the 'Design' mode of the book, right-click your key-figure, under "Selected Rows", uncheck "Output Only" option. In case you are unable to see that option, then you are not authorised to change that. See if you can change the authorisations by going to the "Data View" tab in planning book configuration (/n/sapapo/sdp8b), and change the value of Status to 3.
Hope your query is answered with different solutions offered by many of the sdn colleagues here.
Regards,
Guru Charan. -
Extra column to be added while importing data from flat file to SQL Server
I have 3 flat files with only one column in a table.
The file names are bars, mounds & mini-bars.
Table 'prd_type' has columns 'typeid' & 'typename' where typeid is auto-incremented and typename is bars, mounds & mini-bars.
I import data from the 3 files into the prd_details table. This table has columns 'pid', 'typeid' & 'pname', where pid is auto-incremented and pname is mapped to the flat files; now I want the typeid info to be looked up from the prd_type table.
Can someone please suggest how to do this?
You can get it as follows.
Assuming you've three separate data flow tasks for three files you can do this
1. Add a new column to pipeline using derived column transformation and hardcode it to bars, mounds or mini-bars depending on the source
2. Add a lookup task based on the prd_type table. Use this query:
SELECT typeid,typename
FROM prd_type
Use full cache option
Add the lookup based on the derived column: new column -> prd_type.typename relationship. Select typeid as the output column.
3. In the final OLEDB destination task map the typeid column to tables typeid column.
In case you use a single data flow task, you need to include logic based on the filename (or something similar) to get the hardcoded type value (bars, mounds or mini-bars) into the data pipeline.
Visakh -
Loading from text file using Sql Loader
I need to load data from a text file into Oracle table. The file has long strings of text in the following format:
12342||||||Lots and lots of text with all kinds of characters including
^&*!#%#^@ etc.xxxxxxxxxxxxxxxxxxxxxxx
yyyyyyyyyyyyyyyyyyyyyyyyytrrrrrrrrrrrrrrrrrrr
uuuuuuuuuuuuuuuuuuurtgggggggggggggggg.||||||||
45356|||||||||||again lots and lots of text.uuuuuudccccccccccccccccccccd
gyhjjjjjjjjjjjjjjjjjjjjjjjjkkkkkkkkkkkkklllllllllllnmmmmmmmmmmmmnaaa|||||||.
There are pipes within the text as well. On the above example, the line starting with 12342 is an entire record that needs to be loaded into a CLOB column. The next record would be the 45356 one. Therefore, all records have a bunch of pipes at the end, so the only way to know where a new record starts is to see where the next number is after all the ending pipes. The only other thing I know is that there are a fixed number of pipes in each record.
Does anyone have any ideas on how I can load the data into the table, either using SQL*Loader or any other utility? Any input would be greatly appreciated. Thanks.
See this thread: Sqlldr processing of records with embedded newline and delimiter: http://forums.oracle.com/forums/thread.jspa?messageID=1773678
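One hedged sketch of the usual approach (table and file names assumed, and the pipe count at end-of-record guessed from the sample): declare the full run of trailing pipes plus the newline as the record terminator with SQL*Loader's "str" clause, so embedded newlines inside the text no longer end a record prematurely:

```sql
-- hypothetical control file; assumes every record ends with exactly 8 pipes and a newline
LOAD DATA
INFILE 'records.txt' "str '||||||||\n'"   -- the whole sequence is the end-of-record marker
INTO TABLE text_records
(rec_id   CHAR TERMINATED BY '|',         -- the leading number before the first pipe
 rec_text CHAR(1000000))                  -- maps to the CLOB column
```

The field list is only indicative: since the text itself contains pipes, the intermediate empty pipes may need FILLER fields, or the remainder of each record may need to be staged and split in SQL afterwards. The "str" record terminator is the part that solves the record-boundary problem the poster describes.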
-
How to load the data from flat file
Hi ,
I'm new to Oracle 10g. I have to upload a flat file (a text file with ',' as the separator) into a table in the database.
Can you please tell me how to do it?
Thanks
vivi
Hi vivi,
Sure, read about the wonderful SQL*LOADER. This tool's purpose is exactly that.
Here's a link: Tahiti start page. Just type in sql*loader and off you go.
Since you did not mention your database version, I could not point you to a version-specific page.
Regards,
Guido -
Error in load master data from flat file to infocube "No SID found for val"
Hi gurus
While I am loading data from a flat file to an infocube, I am getting the error message "No SID found for value 'HOT ' of characteristic 0UNIT" for the material name field.
In the flat file one field has lowercase characters, so I have defined a formula for uppercase in the transfer rule.
After scheduling, the message is "data was requested", but in Monitor -> Detail -> Request, the extraction status is green while the transfer and processing statuses are red.
The overall status is red.
It appears your transfer structure and source file are not mapped correctly. Make sure your source file is in the correct format. Try the "Preview" function to make sure.
Also, are the file and your logon language the same? For instance, is the file from a German user while you are logged on in English? You will see errors like that when the file uses ST (Stück = pieces) instead of PC (pieces in English).
To test a bad mapping, hard code the Unit to something to make sure...(to PC for instance).
Also, the error message you are getting is because of transfer structure, please re-activate it. IF it persists, try to skip the PSA and go straight to the data target.
Message was edited by: Steve Wilson
How to pass parameters to the Control file while loading the data from flat file to staging table
Thanks in advance
Hi ,
The LOAD DATA statement is required at the beginning of the control file.
INFILE: the INFILE keyword is used to specify the location of the datafile or datafiles.
INFILE * specifies that the data is found in the control file and not in an external file. INFILE '$FILE' can be used to pass the file path and file name as a parameter when the loader is registered as a concurrent program.
INFILE '/home/vision/kap/import2.csv' specifies the file path and the file name explicitly.
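A small sketch of the parameterized form (the table, columns, and paths below are assumptions for illustration):

```sql
-- import.ctl : hypothetical control file registered as a concurrent program
LOAD DATA
INFILE '$FILE'                  -- replaced by the parameter value at run time
INTO TABLE staging_import
FIELDS TERMINATED BY ','
(col1, col2, col3)
```

Alternatively, the control file can omit the path entirely and the caller can override it on the command line with the data parameter, conceptually: sqlldr userid=apps/pwd control=import.ctl data=/home/vision/kap/import2.csv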
Hope this will help you...... -
pls send ans for this
1b5595eb-fcfc-48cc-90d2-43ba913ea79f wrote:
pls send ans for this
use any text editor to eliminate the dot before loading -
How to delete entire data from MDS UI using SQL query or Using SSIS
Hi All,
Using SSIS I loaded data into the MDS UI (subscription view). I want that when I execute my package the next time and load data into MDS, it should be blank, i.e. the previous data should not be in the UI.
Hi,
Please check the following URL on inserting data into your MDM:
http://msdn.microsoft.com/en-us/library/jj819772.aspx -
Error in loading data from flat file....
Hi,
While loading the data from a flat file, only one column is getting populated in the ODS, and after that the sixth column is getting populated with null values. Also, when I save the changes in the CSV file, they are not getting saved.
Could you please tell me what the probable reason for this could be?
Also, what are the points we should keep in mind while loading from flat files?
Regards,
Jeetu
Hi,
You need to take care of the following:
1. Your flat file structure (left-to-right columns) should match the datasource (top-to-down), column by column. If you don't wish to load some column, leave it as a blank column in the flat file.
2. Your file should not be open while loading data.
3. Make sure your transfer rules & update rules are defined properly.
4. Do the simulation in the infopackage first; that will give you a fair idea.
5. To start with, load the data into the PSA only, check it, and then take it forward.
Hope it helps.
regards
Vikash -
Loading data from flat file...
Hello,
I am actually experiencing a problem where I can't load data from a flat file into a table. I have reverse-engineered the flat file into ODI, but I don't know how to load the reversed data into an RDBMS table. Also, I don't know how to create this RDBMS table from within ODI so that it is reflected in the DB, nor how to load the data from the flat file into this table without having to add the columns manually in order to map between the flat file and the table.
In conclusion, I need to know how to create an RDBMS table from within ODI on the database, and how to automatically map the flat file to the DB table and load the data into it.
Regards,
Hossam
Hi Hossam,
We can use an ODI procedure to create the table in the DB.
Make sure you keep the column names in the table the same as the column names in the FLAT FILE so that the columns can be mapped automatically.
Regarding loading data from a FLAT FILE (i.e. where our source table is a FLAT FILE): up to ODI 10.1.3.4 we need to insert the datastore manually, since the file system cannot be reverse-engineered.
Please let me know, Hossam, if I can assist you further.
Thanks and Regards,
Andy