Issue loading CSV file into HANA
Hi,
For the last couple of weeks I have been trying to load my CSV file into a HANA table, but I am unable to succeed.
I am getting the error "Cannot open Control file, /dropbox/P1005343/CRM_OBJ_ID.CTL". I have followed each and every step on SDN, but I still could not load data into my HANA table.
FTP: /dropbox/P1005343
SQL Command:
IMPORT FROM '/dropbox/P1005343/crm_obj_id.ctl'
Error:
Could not execute 'IMPORT FROM '/dropbox/P1005343/crm_obj_id.ctl''
SAP DBTech JDBC: [2]: general error: Cannot open Control file, /dropbox/P1005343/crm_obj_id.ctl
Please help me on this
Regards,
Praneeth
Hi All,
I have successfully uploaded the file to the HANA box in folder P443348, but while importing the file I am getting the following error message:
SAP DBTECH JDBC: [2] (at 13) : general error: Cannot open Control file, "/P443348/shop_facts.ctl"
This is happening while I am executing the following import statement
IMPORT FROM '/P443348/shop_facts.ctl';
I have tried several options, including changing the permissions of the folders and files, with no success. As of now my folder has full access (777).
Any help would be greatly appreciated so that I can proceed further.
Thanks
Vamsidhar
Similar Messages
-
How to load CSV files into HANA
1.- Create a CSV file with your data
2.- Copy the file to dropbox/yourname inside the HANA box
3.- Create a table in HANA with the structure of your file
4.- Create the control file BBB with the following information:
import data
into table XXX."YYY"
from 'ZZZ.csv'
record delimited by '\n'
fields delimited by ','
optionally enclosed by '"'
error log 'Text_Tables.err'
Where XXX is your schema, YYY is your HANA Table and ZZZ is your file
5.- Open a script file and write the following:
LOAD FROM '/filer/dropbox/yourname/BBB.ctl';
where BBB is the name of your control file
Greetings,
Blag.
With the recent upgrade to HANA SP3, the method to upload files to HANA has changed slightly, so here's the updated tutorial:
1.- Create a CSV file with your data
2.- Copy the file to the HANA Dev Center
3.- Open a FTP connection to ftp.sapdevcenter.com
(You can find the username and password by doing a SELECT on SYSTEM.FTP_SERVER)
and... *Don't paste the username or password here or in any other place!*
4.- Create a folder with your name on the FTP site and copy the file there
5.- Create a table in HANA with the structure of your file
6.- Create the control file BBB with the following information:
import data
into table XXX."YYY"
from 'ZZZ.csv'
record delimited by '\n'
fields delimited by ','
optionally enclosed by '"'
error log 'Text_Tables.err'
Where XXX is your schema, YYY is your HANA Table and ZZZ is your file
7.- Open a script file and write the following:
IMPORT FROM '/dropbox/yourname/BBB.ctl';
where BBB is the name of your control file
I just tested it and it works; if it doesn't work for you, please let me know.
Greetings,
Blag. -
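As a concrete instance of the control-file template above, using the file names from the original question (the schema name and column layout are assumptions, not confirmed from the post), the pieces fit together like this:

```sql
-- Hypothetical control file: /dropbox/P1005343/crm_obj_id.ctl
-- Schema P1005343 and table "CRM_OBJ_ID" are assumed names.
IMPORT DATA
INTO TABLE P1005343."CRM_OBJ_ID"
FROM 'crm_obj_id.csv'
RECORD DELIMITED BY '\n'
FIELDS DELIMITED BY ','
OPTIONALLY ENCLOSED BY '"'
ERROR LOG 'crm_obj_id.err'
```

It would then be run with IMPORT FROM '/dropbox/P1005343/crm_obj_id.ctl'. Note that the control file, the CSV, and the error log all live in the same dropbox folder, and the HANA server's OS user needs read access to all of them, which is the usual reason for the "Cannot open Control file" error.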
How do I import my local CSV files into the HANA database as temporary tables using a Java program?
I want to import my local CSV file into the database and then apply joins with other tables programmatically in Java.
Hi Vivek,
Please go through the following blogs and video to resolve your issue.
Loading large CSV files to SAP HANA
HANA Academy - Importing Data using CTL Method - YouTube
Hope it helps.
Regards
Kumar -
Load CSV file into single CLOB
Hello Oracler,
is there a good way to load a CSV file into a PL/SQL CLOB variable?
Ilja
http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978
-
Load CSV file into a table when a button is clicked by the user
Hello,
Can anyone please help me out with this issue? I have a form where a user uploads a CSV file and clicks a button. When the button is clicked, it should load the CSV file data into the corresponding columns of the database table.
Can anyone please suggest me a possible solution or an approach.
Thanks,
Orton
Edited by: orton607 on May 5, 2010 2:00 PM
Thanks for replying.
I have tried your changes but it's not working. One more question: I have one column which contains commas, and when I try to load the file it fails. I think it's a problem with the commas, so I changed the code to use the REPLACE function for that column, but it's still not working. Can anyone please suggest a possible approach? Below is my source code for your reference.
DECLARE
v_blob_data BLOB;
v_blob_len NUMBER;
v_position NUMBER;
v_raw_chunk RAW(10000);
v_char CHAR(1);
c_chunk_len NUMBER := 1;
v_line VARCHAR2 (32767):= NULL;
v_data_array wwv_flow_global.vc_arr2;
v_rows NUMBER;
v_sr_no NUMBER := 1;
l_cnt BINARY_INTEGER := 0;
l_stepid NUMBER := 10;
BEGIN
delete from sample_tbl;
-- Read data from wwv_flow_files
select blob_content into v_blob_data from wwv_flow_files
where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER)
and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER);
v_blob_len := dbms_lob.getlength(v_blob_data);
v_position := 1;
-- Read and convert binary to char
WHILE ( v_position <= v_blob_len ) LOOP
v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
v_line := v_line || v_char;
v_position := v_position + c_chunk_len;
-- When a whole line is retrieved
IF v_char = CHR(10) THEN
-- Convert comma to : to use wwv_flow_utilities
v_line := REPLACE (v_line, ',', ':');
-- Convert each column separated by : into array of data
v_data_array := wwv_flow_utilities.string_to_table (v_line);
-- Insert data into target table
EXECUTE IMMEDIATE 'insert into sample_tbl(col1..col12)
values (:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11,:12)'
USING
v_sr_no,
v_data_array(1),
v_data_array(2),
v_data_array(3),
v_data_array(4),
v_data_array(5),
v_data_array(6),
v_data_array(7),
v_data_array(8),
REPLACE(v_data_array(9), ':', ','),
to_date(v_data_array(10),'MM/DD/YYYY'),
v_data_array(11);
-- Clear out
v_line := NULL;
v_sr_no := v_sr_no + 1;
l_cnt := l_cnt + SQL%ROWCOUNT;
END IF;
END LOOP;
COMMIT;
l_stepid := 20;
IF l_cnt = 0 THEN
apex_application.g_print_success_message := apex_application.g_print_success_message || ' Please select a file to upload ' ;
ELSE
apex_application.g_print_success_message := apex_application.g_print_success_message || 'File uploaded and processed ' || l_cnt || ' record(s) successfully.';
END IF;
l_stepid := 30;
EXCEPTION WHEN OTHERS THEN
ROLLBACK;
apex_application.g_print_success_message := apex_application.g_print_success_message || 'Failed to upload the file. '||REGEXP_REPLACE(SQLERRM,'[('')(<)(>)(,)(;)(:)(")('')]{1,}', '') ;
END;
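The failure with embedded commas above comes from the REPLACE-based splitting: replacing every ',' with ':' cannot distinguish a delimiter comma from a comma inside a quoted field. As a hedged illustration (plain Python, not APEX code), a real CSV parser keeps quoted fields intact:

```python
import csv
import io

# A line where the 9th field contains commas but is protected by quotes.
line = '1,a,b,c,d,e,f,g,"Smith, John, Jr.",05/05/2010,x,y'

# csv.reader understands quoting, so the embedded commas survive.
fields = next(csv.reader(io.StringIO(line)))
print(len(fields))  # 12 fields, not 14
print(fields[8])    # Smith, John, Jr.
```

In the APEX block itself, one common workaround along the same lines is to strip or protect the enclosing quotes before the comma-to-colon REPLACE, rather than replacing every comma unconditionally.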
Below is the function which I am using to convert hex to decimal
create or replace function hex_to_decimal( p_hex_str in varchar2 ) return number
is
v_dec number;
v_hex varchar2(16) := '0123456789ABCDEF';
begin
v_dec := 0;
for indx in 1 .. length(p_hex_str)
loop
v_dec := v_dec * 16 + instr(v_hex,upper(substr(p_hex_str,indx,1)))-1;
end loop;
return v_dec;
end hex_to_decimal;
thanks,
Orton -
How to load .csv file into an abap table
Hi,
Pls provide me with a sample code on how to load a .csv file (with column header and rows of data) into a table in sap.
Thank you!
Moderator Message: A google search would yield faster results, so search!
Edited by: Suhas Saha on Jan 16, 2012 5:30 PM
Hi,
Using GUI_UPLOAD, convert the file to internal table data.
Use the statement below to separate the header from the data in the CSV:
READ TABLE t_itab INTO wa_itab INDEX 1.
DELETE t_itab INDEX 1.
Create an internal table with the same structure as the CSV file and use the statement below to split each record:
loop at t_itab into wa_itab.
split wa_itab-rec at ',' into var1 var2 var3.
wa_output-v1 = var1.
wa_output-v2 = var2.
wa_output-v3 = var3.
append wa_output to t_output.
endloop.
Edited by: syamsundr on Jan 16, 2012 11:54 AM -
Issue Loading CSV file using OPENROWSET
Hi All,
I have a CSV file that has a value like -200; when I try to open the file in SQL using OPENROWSET, that value is read as NULL.
select * from
OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Text;Database=C:\TEST\;',
'SELECT * FROM ABC.csv')
When I save the file as .xlsx and open it using the OPENROWSET syntax for .xlsx files, it shows the correct value.
I am not sure if this is the behaviour of the file or of SQL. Can someone tell me what the possible reasons for that would be?
Thanks in advance.
Raghav
Check this link and sample:
http://www.databasejournal.com/features/mssql/article.php/10894_3331881_2
select * from OpenRowset('MSDASQL', 'Driver={Microsoft Text Driver (*.txt; *.csv)};
DefaultDir=C:\External;','select top 6 * from
MyCsv.csv')
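On the -200-read-as-NULL symptom: one common cause (an assumption here, not confirmed from the post) is the text driver guessing column types from the first rows of the file. A schema.ini file placed in the same folder as the CSV pins the column types explicitly; the column name Amount below is hypothetical:

```ini
[ABC.csv]
Format=CSVDelimited
ColNameHeader=True
MaxScanRows=0
Col1=Amount Double
```

With the type declared, the driver no longer discards values that disagree with its guess.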
Best wishes, Arbi. Please vote if you found this posting helpful, or mark it as answered. -
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from .csv file into oracle table.
Is there any mechanism so that i can follow the same?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post... already there :)
Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM -
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from .csv file into existing /new table. Also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I needed to know is: before the CSV data is loaded into the timesheet table, is there any way of validating the project_key (which is the primary key of the projects table) against the projects table? I need to perform similar validations with other columns, like customer_id from the customers table. Basically, the loading should be done only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same?
Does Vikas's application do what the utility does ( i am assuming that the code being from 2005 the utility was not incorporated in APEX at that time). Any helpful advise is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
you can create hidden items in the page to validate previous records before insert data.
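As a language-neutral sketch of that pre-validation idea (plain Python, not APEX; the column names and key values are made up for illustration), the parent-key check before insert looks like:

```python
import csv
import io

# Hypothetical: project keys already present in the parent PROJECTS table.
valid_project_keys = {"P100", "P200", "P300"}

csv_data = io.StringIO(
    "project,project_key,timesheet_date,hours_worked\n"
    "Alpha,P100,2010-05-01,8\n"
    "Beta,P999,2010-05-01,6\n"  # P999 has no parent row
)

good, bad = [], []
for row in csv.DictReader(csv_data):
    # Only rows whose project_key exists in the parent table get loaded.
    (good if row["project_key"] in valid_project_keys else bad).append(row)

print(len(good), len(bad))  # one row passes, one fails
```

The rejected rows can then be reported back to the user instead of failing the whole load.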
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
Example for loading a CSV file into DIAdem from a LabVIEW application
Hi everyone, I'm using LabVIEW 8.2 and DIAdem 10.1.
I've been searching in the NI Example Finder but have had no luck so far.
I have already downloaded the LabVIEW connectivity VIs.
Can anyone provide an example that can help me load a CSV file into DIAdem from a LabVIEW application?
Thanks
Hi Alexandre.
I attach an example for you.
Best Regards.
Message edited by R_Duval on 01-15-2008 02:44 PM
Romain D.
National Instruments France
Attachments:
Classeur1.csv 1 KB
Load CSV to Diadem.vi 15 KB -
Getting Issue while uploading CSV file into internal table
Hi,
The CSV file data format is as below:
a b c d e f
2.01E14 29-Sep-08 13:44:19 2.01E14 SELL T+1
The actual value of column A is 201000000000000
and of column D is 201000000035690.
I am uploading the above CSV file into an internal table using
the code below:
TYPES: BEGIN OF TY_INTERN.
INCLUDE STRUCTURE KCDE_CELLS.
TYPES: END OF TY_INTERN.
CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
EXPORTING
I_FILENAME = P_FILE
I_SEPARATOR = ','
TABLES
E_INTERN = T_INTERN
EXCEPTIONS
UPLOAD_CSV = 1
UPLOAD_FILETYPE = 2
OTHERS = 3.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
I am getting all the columns' data into the internal table;
the problem is with columns A & D: for both I am getting the value 2.01E+14 in the internal table. How can I get the actual values without modifying the CSV file format?
waiting for your reply...
thanks & regards,
abhi
Hi Saurabh,
Thanks for your reply.
I can't even double-click on those columns,
because the program needs to be executed in the background, and there can be a lot of CSV files in one folder. There is no manual interaction on those CSV files.
regards,
abhi -
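A note on the 2.01E+14 problem above: if the scientific-notation string itself still carries the full precision, converting it exactly (rather than through a float) recovers the integer. But if the file already stores 2.01E14, the trailing digits of a value like 201000000035690 are gone and cannot be recovered by any conversion; the usual fix is to re-export that column as text. A hedged Python illustration of the exact conversion:

```python
from decimal import Decimal

# A value as it appears after a spreadsheet layer rewrote it.
raw = "2.01E+14"

# Decimal parses the notation exactly; int() then yields a plain integer
# with no float rounding involved.
exact = int(Decimal(raw))
print(exact)  # 201000000000000
```

This is why abhi's case needs the upstream export changed, not just a smarter parse.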
1. Is there a way to import large XML files into HANA efficiently?
2. Will it process it node by node or the entire file at a time?
3. Are there any data services provided to do this?
This is for a project use case; I also have a requirement to process bulk XML files. Please suggest how to accomplish this task.
Hi Patrick,
I am addressing a similar issue: getting data from huge XMLs into HANA.
Using OData services, can we handle huge data (i.e. create the schema / load into HANA) on the fly?
In my scenario,
I get a folder of different complex XML files which are to be loaded into Hana database.
Then I have to transform & cleanse the data.
Can I use OData services to transform and cleanse the data?
If so, how can I create OData services dynamically?
Any help is highly appreciated.
Thank you.
Regards,
Alekhya -
Importing a csv file into addressbook
So I did a search and found this question has come up quite often, but with the person having a slightly different problem, so here goes.
I am trying to import a CSV file into Address Book. When I go Import -> Text file, it comes up like it should and I pick the file. If I select OK at this point, it will quite happily import all of the data, no trouble (but into the wrong categories). So I make the changes to what I want, e.g. Phone (other) -> Phone (work). Then when I click OK, the button goes from blue to white but nothing happens.
When I open the CSV file in TextEdit I can see all the values separated by commas.
Any ideas?
I too just had this problem. To clarify: all fields work and you can map them. I found that the Outlook CSV file had two problems which were tripping up the Address Book import function. Note these were both in street-address fields, so I suspect the poster who said that addresses don't work may have been having that problem.
The CSV load program does not work if you have commas "," in your data. In my case this was very common for some European addresses as well as some company names. And then the second problem was if there was a carriage return or new line character in a field. This later one was a problem with the address fields. I suspect an AddressBook export to CSV might have this same issue. Specifically, when there were multiple lines to the street address, they had been entered as multiple lines in street address one, as opposed to entering them in multiple fields such as "street 1", "street 2", etc.
To solve the problem I opened the CSV file using Numbers and did a global find-and-replace: on commas, substituting the string "---", and on the newline character "¶" (Alt+Return), substituting "*".
Note to reduce some of the remapping I also changed all of the labels with "business" in the header row of the CSV file to "work". This made it a little easier for the AddressBook import function to map fields more correctly.
After these set of changes rather than the alternating blue then white with nothing behavior it cleanly loaded all of my names. I could then search for "---" and "*" in addressbook's search field to identify all addresses which I had to clean up.
I hope that is helpful to someone else. The import function really does work. However, it is not robust enough to deal with certain oddities in one's data. -
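The manual find-and-replace described above can also be avoided up front: if the CSV is written so that embedded commas and newlines sit inside quoted fields, a compliant importer can round-trip them. A hedged sketch with Python's csv module (the sample record is invented):

```python
import csv
import io

# A record whose street-address field contains both a comma and a newline.
row = ["Anna Example", "ACME, GmbH", "Main Street 1\nBuilding B"]

buf = io.StringIO()
# QUOTE_ALL wraps every field in quotes, so embedded delimiters survive.
csv.writer(buf, quoting=csv.QUOTE_ALL).writerow(row)

# Reading the file back restores the original fields intact.
restored = next(csv.reader(io.StringIO(buf.getvalue())))
print(restored == row)  # True
```

Whether Address Book's importer honours quoting this strictly is not something the thread confirms, which is why the "---"/"*" substitution trick above remains the safe fallback.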
How to read/write .CSV file into CLOB column in a table of Oracle 10g
I have a requirement involving a table with two columns:
create table emp_data (empid number, report clob)
Here the REPORT column is of CLOB data type, which is used to hold the data from the .csv file.
The requirements are:
1) How to load data from a .CSV file into the CLOB column, along with empid, using the DBMS_LOB utility.
2) How to read the report column so that it returns all the columns present in the .CSV file (dynamically, because every CSV file may have a different number of columns), along with the primary key empid.
eg: empid report_field1 report_field2
1 x y
Any help would be appreciated.
If I understand you right, you want each row in your table to contain an emp_id and the complete text of a multi-record .csv file.
It's not clear how you relate emp_id to the appropriate file to be read. Is the emp_id stored in the csv file?
To read the file, you can use functions from [UTL_FILE|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/u_file.htm#BABGGEDF] (as long as the file is in a directory accessible to the Oracle server):
declare
lt_report_clob CLOB;
l_max_line_length integer := 1024; -- set as high as the longest line in your file
l_infile UTL_FILE.file_type;
l_buffer varchar2(1024);
l_emp_id report_table.emp_id%type := 123; -- not clear where emp_id comes from
l_filename varchar2(200) := 'my_file_name.csv'; -- get this from somewhere
begin
-- open the file; we assume an Oracle directory has already been created
l_infile := utl_file.fopen('CSV_DIRECTORY', l_filename, 'r', l_max_line_length);
-- initialise the empty clob
dbms_lob.createtemporary(lt_report_clob, TRUE, DBMS_LOB.session);
loop
begin
utl_file.get_line(l_infile, l_buffer);
dbms_lob.append(lt_report_clob, l_buffer);
exception
when no_data_found then
exit;
end;
end loop;
insert into report_table (emp_id, report)
values (l_emp_id, lt_report_clob);
-- free the temporary lob
dbms_lob.freetemporary(lt_report_clob);
-- close the file
UTL_FILE.fclose(l_infile);
end;
This simple line-by-line approach is easy to understand, and gives you an opportunity (if you want) to take each line in the file and transform it (for example, you could transform it into a nested table, or into XML). However, it can be rather slow if there are many records in the csv file - the lob_append operation is not particularly efficient. I was able to improve the efficiency by caching the lines in a VARCHAR2 up to a maximum cache size, and only then appending to the LOB - see [three posts on my blog|http://preferisco.blogspot.com/search/label/lob].
There is at least one other possibility:
- you could use [DBMS_LOB.loadclobfromfile|http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_lob.htm#i998978]. I've not tried this before myself, but I think the procedure is described [here in the 9i docs|http://download.oracle.com/docs/cd/B10501_01/appdev.920/a96591/adl12bfl.htm#879711]. This is likely to be faster than UTL_FILE (because it is all happening in the underlying DBMS_LOB package, possibly in a native way).
That's all for now. I haven't yet answered your question on how to report data back out of the CLOB. I would like to know how you associate employees with files; what happens if there is > 1 file per employee, etc.
HTH
Regards Nigel
Edited by: nthomas on Mar 2, 2009 11:22 AM - don't forget to fclose the file... -
How can I import data from a csv file into databse using utl_file?
Hi,
I have two machines (OS is Windows and database is Oracle 10g) that are not connected to each other; both have the same database schema, but the data is all different.
Now, on one machine, I want to take a dump of all the tables into CSV files, e.g. if my table name is test then the exported file is test.csv, and if the table name is sample then the CSV file name is sample.csv, and so on.
Then I want to import the data from these CSV files into the tables on the second machine. If I have 50 such CSV files, the data should be written to 50 tables.
I am new to this. Could anyone please let me know how I can import the data back into the tables? I can't use SQL*Loader as I have to satisfy a few conditions while loading the data into the tables. I am stuck and not able to proceed.
Please let me know how can I do this.
Thanks,
Shilpi
Why do you want to export into a .csv file? Why not use export/import? What is your Oracle version?
Read http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php
Regards
Biju