Loading a CSV file
I have a file of comma-separated values from another handheld. I think the fields in each record are not in the order the Z22 expects. Can someone tell me what order the Z22 expects?
This is not exactly a hotsync question, but that seems like the nearest subject listed.
Post relates to: Palm Z22
Hello bridge01944 and welcome to the Palm forums.
What I would suggest is to create a new record in Palm Desktop for the Mac, typing the field name into each of the fields. Then export that record as a .csv file. Open that .csv file in a spreadsheet application, like Excel, and open the .csv export file from your old device alongside it. Then reorder the columns in the old file to match the order of the new file. Once you have done that, you should be able to import the data into Palm Desktop for the Mac.
Alan G
Post relates to: Treo 755p (Sprint)
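Alan's reorder-the-columns step can also be scripted. A hedged sketch in Python with made-up file names, assuming both files carry a header row and the field names match between the two exports:

```python
import csv

def reorder_csv(template_path, old_path, out_path):
    """Reorder the columns of old_path to match the header order of
    template_path. Both files are assumed to have a header row whose
    field names match."""
    with open(template_path, newline="") as f:
        target_order = next(csv.reader(f))      # header row of the template export
    with open(old_path, newline="") as f:
        rows = list(csv.DictReader(f))          # old file, keyed by its own header
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=target_order)
        writer.writeheader()
        # Missing fields become empty strings rather than raising an error.
        writer.writerows({k: r.get(k, "") for k in target_order} for r in rows)
```

The same "export one dummy record, inspect its header" trick Alan describes gives you the template file this function reads.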
Similar Messages
-
Error while loading a CSV file
Hi,
While loading a CSV file in OpenScript I'm getting the following error:
Error loading the CSV file Caused by: java.io.SocketException occurred.
Can anyone please help me sort it out.
Thank you
Hi,
Are you creating a table and loading it in OpenScript? If so, can you show a screenshot?
One more thing: you can also convert the Excel sheet into a CSV file and add it as a databank in the Tree view of OpenScript.
Thanks and Regards,
Nishanth Soundararajan. -
Loading a CSV file and accessing the variables
Hi guys,
I'm new to AS3 and dealt with AS2 before (I was just getting the grasp of it when they changed it).
Is it possible in AS3 to load an Excel .csv file into Flash using URLLoader (or something else?) and use the data as variables?
I can get the .csv to load and trace the values (cell1,cell2,cell3...) but I'm not sure how to collect the data and place it into variables.
Can I just create an array and access it like so: myArray[0], myArray[1]? If so, I'm not sure why it's not working.
I must be on the completely wrong path. Here's what I have so far....
var loader:URLLoader = new URLLoader();
loader.dataFormat = URLLoaderDataFormat.VARIABLES;
loader.addEventListener(Event.COMPLETE, dataLoaded);
var request:URLRequest = new URLRequest("population.csv");
loader.load(request);
function dataLoaded(evt:Event):void {
var myData:Array = new Array(loader.data);
trace(myData[i]);
}
Thanks for any help,
Sky
Just load your csv file and use the Flash string methods to allocate those values to an array:
var myData:Array = loader.data.split(","); -
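Outside Flash, the reply's split-into-arrays idea looks like this; a hedged Python sketch with made-up data standing in for loader.data:

```python
# loader.data arrives as one big string; split on newlines first, then on
# commas, so each row becomes its own array (the reply's split idea).
raw = "city,population\nParis,2100000\nLyon,500000"  # stands in for loader.data

rows = [line.split(",") for line in raw.strip().splitlines()]
header, data = rows[0], rows[1:]

# Fields can now be indexed by position, like myArray[0] in the question,
# or keyed by column name:
records = [dict(zip(header, r)) for r in data]
```

Splitting on "," alone (without the newline split) is why a single flat array of cell1,cell2,cell3 appears: row boundaries are lost.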
Example for loading a csv file into diadem from a labview application
Hi everyone, i'm using labview 8.2 and DIAdem 10.1.
I've been searching in NI example finder but I had no luck so far.
I have already downloaded the labview connectivity VIs.
Can anyone provide an example that can help me load a CSV file into DIAdem from a LabVIEW application?
Thanks
Hi Alexandre,
I attach an example for you.
Best Regards.
Message edited by R_Duval on 01-15-2008 02:44 PM
Romain D.
National Instruments France
Attachments:
Classeur1.csv 1 KB
Load CSV to Diadem.vi 15 KB -
Loading a CSV file into a table
HTML DB Data Workshop has a feature to load a CSV file and create a table based on it.
I use a simplified version of this feature in my applications.
See a quick demo at
http://htmldb.oracle.com/pls/otn/f?p=38131:1
Create a small csv file like
col1,col2,col3
varchar2(10),number,"number(10,2)"
cat,2,3.2
dog,99,10.4
The first line in the file contains the column names; the second line contains their datatypes.
Save the file as c:\test.csv
Click the Browse button, search for this file.
Click the Submit button to upload and parse the file.
You can browse the file contents.
If you like what you see, type in a table name and click the Create Table button to create a table.
Let me know if anyone is interested and I can post the code.
Hope this helps.
Message was edited by:
Vikas
Message was edited by:
Vikas
Hi Vikas,
I tried to import your COOL application into html db at http://htmldb.oracle.com and got the error:
"Expecting p_company or wwv_flow_company cookie to contain security group id of application owner.
Error ERR-7621 Could not determine workspace for application (:) on application accept.
OK "
The SAME error I get when trying to import Scott's Photo Catalog at http://htmldb.oracle.com/pls/otn/f?p=18326:7:6703988766288646534::::P7_ID:2242.
What should I change in the exported script, in addition to changing your workspace schema VIKASA (or Scott's STUDIO_SHOWCASE) to my NICED68?
Thanks!
DC -
Loading a CSV file into a table as in dataworkshop
Data Workshop has a feature to load a CSV file and create a table based on it, similarly i want to create it in my application.
i have gone through the forum http://forums.oracle.com/forums/thread.jspa?threadID=334988&start=60&tstart=0
but I could not download all the files (application, HTMLDB_TOOLS package and the PAGE_SENTRY function); I couldn't find the PAGE_SENTRY function.
AND when i open this link http://apex.oracle.com/pls/apex/f?p=27746
I couldn't run the application. I provided a CSV file and when I click SUBMIT, I got the error:
ORA-06550: line 1, column 7: PLS-00201: identifier 'HTMLDB_TOOLS.PARSE_FILE' must be declared
I tried it on the apex.oracle.com host as given in the previous post.
Any help please?
Is there any other method to load data into tables (similar to Data Workshop)?
Hi Jari,
I'm using HTMLDB_TOOLS to parse my csv files. It works well for creating a report, but when I try to create a new table and upload the data, it gives the error *"missing right parenthesis"*.
I've been looking through the package body and I think there is a problem in this code:
IF (p_table_name is not null)
THEN
BEGIN
execute immediate 'drop table '||p_table_name;
EXCEPTION
WHEN OTHERS THEN NULL;
END;
l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
htmldb_util.set_session_state('P149_DEBUG',l_ddl);
execute immediate l_ddl;
l_ddl := 'insert into '||p_table_name||' '||
'select '||v(p_columns_item)||' '||
'from htmldb_collections '||
'where seq_id > 1 and collection_name='''||p_collection_name||'''';
htmldb_util.set_session_state('P149_DEBUG',v('P149_DEBUG')||'/'||l_ddl);
execute immediate l_ddl;
RETURN;
END IF;
It is dropping the table but not creating the new one.
P149_DEBUG contains: create table EMP_D (Emp ID number(10),Name varchar2(20),Type varchar2(20),Join Date varchar2(20),Location varchar2(20))
And if I comment out the
BEGIN
execute immediate 'drop table '||p_table_name;
EXCEPTION
WHEN OTHERS THEN NULL;
END;
l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
htmldb_util.set_session_state('P149_DEBUG',l_ddl);
execute immediate l_ddl;
then it works well; i.e. if I enter an existing table name it works fine, inserting the rows,
but it is unable to create a new table.
Can you please help to fix it?
Regards,
Little Foot -
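A hedged guess at the "missing right parenthesis" above: the DDL shown in P149_DEBUG contains column names with spaces (Emp ID, Join Date), which Oracle rejects in unquoted identifiers. One fix is to sanitize the header names before building the CREATE TABLE statement. A Python sketch of that idea (the helper and names are illustrative, not part of HTMLDB_TOOLS):

```python
import re

def safe_column(name):
    """Turn a raw header like 'Emp ID' into a valid unquoted Oracle
    identifier by replacing runs of non-alphanumeric characters with
    underscores."""
    cleaned = re.sub(r"[^A-Za-z0-9]+", "_", name.strip()).strip("_")
    return cleaned.upper()

# Headers taken from the P149_DEBUG value quoted in the thread.
headers = ["Emp ID", "Name", "Join Date"]
types = ["number(10)", "varchar2(20)", "varchar2(20)"]
cols = ", ".join(f"{safe_column(h)} {t}" for h, t in zip(headers, types))
ddl = f"create table EMP_D ({cols})"
```

The alternative is to double-quote each name ("Emp ID") in the DDL, but then every later query must quote it the same way, so sanitizing is usually the friendlier choice.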
Issue while loading a csv file using sql*loader...
Hi,
I am loading a csv file using sql*loader.
On the number columns that have data in them (decimal numbers/integers), the row errors out with the error:
ORA-01722: invalid number
I checked the values picked from Excel
and found chr(13), chr(32) and chr(10) characters in them.
For example, select length('0.21') from dual returns a value of 7.
When I checked each character, e.g.
select ascii(substr('0.21',5,1)) from dual returns a value of 9, etc.
I tried the following command....
"to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
to remove all the non-numeric special characters, but I am still getting the error.
Please let me know, any solution for this error.
Thanks in advance.
Kiran
Control file:
OPTIONS (ROWS=1, ERRORS=10000)
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE '$Xx_TOP/bin/ITEMS.csv'
APPEND INTO TABLE XXINF.ITEMS_STAGE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
ItemNum "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
cross_ref_old_item_num "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
Mas_description "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
Mas_long_description "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
Org_description "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
Org_long_description "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
user_item_type "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
organization_code "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
primary_uom_code "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
inv_default_item_status "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
inventory_item_flag "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
stock_enabled_flag "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
mtl_transactions_enabled_flag "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
revision_qty_control_code "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
reservable_type "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
check_shortages_flag "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
shelf_life_code "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
shelf_life_days "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
lot_control_code "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
auto_lot_alpha_prefix "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_lot_number "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
negative_measurement_error "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
positive_measurement_error "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
serial_number_control_code "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
auto_serial_alpha_prefix "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_serial_number "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
location_control_code "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
restrict_subinventories_code "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
restrict_locators_code "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
bom_enabled_flag "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
costing_enabled_flag "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
inventory_asset_flag "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
default_include_in_rollup_flag "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
cost_of_goods_sold_account "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
std_lot_size "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
sales_account "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
purchasing_item_flag "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
purchasing_enabled_flag "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
must_use_approved_vendor_flag "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
allow_item_desc_update_flag "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
rfq_required_flag "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
buyer_name "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
list_price_per_unit "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
taxable_flag "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
purchasing_tax_code "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
receipt_required_flag "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
inspection_required_flag "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
price_tolerance_percent "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
expense_account "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
allow_substitute_receipts_flag "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
allow_unordered_receipts_flag "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
receiving_routing_code "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
inventory_planning_code "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
min_minmax_quantity "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
max_minmax_quantity "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
planning_make_buy_code "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
source_type "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
mrp_safety_stock_code "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
material_cost "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
mrp_planning_code "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
customer_order_enabled_flag "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
customer_order_flag "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
shippable_item_flag "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
internal_order_flag "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
internal_order_enabled_flag "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
invoice_enabled_flag "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
invoiceable_item_flag "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
cross_ref_ean_code "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
category_set_intrastat "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
CustomCode "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
net_weight "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
production_speed "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
LABEL "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
comment1_org_level "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
comment2_org_level "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
std_cost_price_scala "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
supply_type "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
subinventory_code "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
preprocessing_lead_time "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
processing_lead_time "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
wip_supply_locator "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
Sample data from csv file.
"9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
The load errors out on especially two columns :
1) std_cost_price_scala
2) list_price_per_unit
both are number columns.
When data is provided in them, the load errors out; if they hold null values, the records go through fine.
Message was edited by:
KK28 -
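The chr(9)/chr(13)/chr(32)/chr(10) cleanup in the control file above can still miss other invisible characters, for example a non-breaking space (chr(160)), which often survives Excel exports and would explain length('0.21') = 7. A hedged Python sketch of a pre-load cleanup that keeps only characters valid in a plain number:

```python
import re

def clean_number(raw):
    """Keep only characters that can appear in a plain decimal number;
    hidden padding such as tab (chr 9), CR (13), space (32), LF (10) and
    non-breaking space (160) is stripped. Empty results become None
    (i.e. NULL in the load)."""
    kept = re.sub(r"[^0-9.+-]", "", raw)
    return float(kept) if kept else None

# The length('0.21') = 7 case above: a value carrying invisible padding.
dirty = "\t0.21\r\xa0"
```

The SQL equivalent of the same idea is to replace everything outside '0'-'9', '.', '+', '-' rather than enumerating specific chr() codes one by one.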
Loading a CSV file to Sql Server
Hi,
How can I load a CSV file, which is in a shared location, into SQL Server using BODS?
Where can I create the data source pointing to the CSV file?
Hi,
Use a file format to load the CSV file into SQL Server; below are the details.
A file format defines a connection to a file. Therefore, you use a file format to connect to source or target data when the data is stored in a file rather than a database table. The object library stores file format templates that you use to define specific file formats as sources and targets in data flows.
To work with file formats, perform the following tasks:
• Create a file format template that defines the structure for a file.
• Create a specific source or target file format in a data flow. The source or target file format is based on a template and specifies connection information such as the file name.
File format objects can describe files of the following types:
• Delimited: Characters such as commas or tabs separate each field.
• Fixed width: You specify the column width.
• SAP transport: Use to define data transport objects in SAP application data flows.
• Unstructured text: Use to read one or more files of unstructured text from a directory.
• Unstructured binary: Use to read one or more binary documents from a directory.
More details, refer the designer guide
http://help.sap.com/businessobject/product_guides/sbods42/en/ds_42_designer_en.pdf -
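The delimited vs. fixed-width distinction described above can be illustrated outside BODS with a short Python sketch (the column widths and sample lines are made up):

```python
# Delimited: a separator character marks field boundaries.
delimited_line = "1001,Widget,9.99"
fields = delimited_line.split(",")

# Fixed width: field boundaries come from predeclared column widths.
widths = [4, 10, 6]                  # hypothetical layout
fixed_line = "1001Widget      9.99"  # 4 + 10 + 6 = 20 characters

def split_fixed(line, widths):
    """Cut a fixed-width line into fields by position, trimming padding."""
    out, pos = [], 0
    for w in widths:
        out.append(line[pos:pos + w].strip())
        pos += w
    return out
```

A BODS file format template records exactly this kind of metadata (delimiter or widths) so the engine knows how to cut each line.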
Using SQL*Loader to load a .csv file having multiple CLOBs
Oracle 8.1.5 on Solaris 2.6
I want to use SQL*Loader to load a .CSV file that has 4 inline CLOB columns. I shall attempt to give some information about the problem:
1. The CLOBs are not delimited in the field level and could themselves contain commas.
2. I cannot get the data file in any other format.
Can anybody help me out with this? While loading LOB in predetermined size fields, is there a limit on the size?
TIA.
-Murali
Thanks for the article link. The article states "...the loader can load only XMLType tables, not columns." Is this still the case with 10g R2? If so, what is the best way to work around this problem? I am migrating data from a Sybase table that contains a TEXT column (among others) to an Oracle table that contains an XMLType column. How do you recommend I accomplish this task?
- Ron -
Error loading local CSV file into external table
Hello,
I am trying to load a CSV file located on my C:\ drive on a Windows XP system into an 'external table'. Everything used to work correctly when using Oracle XE (installed locally on my Windows system).
However, once I try to load the same file into an Oracle 11g R2 database on UNIX, I get the following error:
ORA-29913: Error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file ...\...\...nnnnn.log
Please let me know if I can achieve the same functionality I had with Oracle XE.
(Note: I cannot use the SQL*Loader approach; I am invoking Oracle stored procedures from VB.Net that attempt to load data into external tables.)
Regards,
M.R.
user7047382 wrote:
Hello,
I am trying to load a CSV file located on my C:\ drive on a Windows XP system into an 'external table'. Everything used to work correctly when using Oracle XE (installed locally on my Windows system).
However, once I try to load the same file into an Oracle 11g R2 database on UNIX, I get the following error:
ORA-29913: Error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
error opening file ...\...\...nnnnn.log
Please let me know if I can achieve the same functionality I had with Oracle XE.
(Note: I cannot use the SQL*Loader approach; I am invoking Oracle stored procedures from VB.Net that attempt to load data into external tables.)
Regards,
M.R.
So your database is on a UNIX box, but the file is on your Windows desktop machine? How is it, then, that you are making your file (on your desktop) visible to your database (on a UNIX box)?
How to load 100 CSV file to DSO / InfoCube
Hi
How to load 100 CSV files of transactional data to a DSO / InfoCube?
Hi,
You can achieve this by writing a .bat (Windows batch file) program.
The .bat program will convert the .xls files to .CSV format.
In the process chain you can use the OS COMMAND option while creating the process chain;
it will first trigger the bat file, and then trigger the Excel files and the InfoPackage.
I am also working on the same scenario now.
Please find below the docs on how to create a process chain using an OS command:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e0c41c52-2638-2e10-0b89-ffa4e2f10ac0?QuickLink=index&…
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/3034bf3d-9dc4-2d10-9e93-d4ba01505ed9?QuickLink=index&…
Thanks,
Phani. -
Memory problem with loading a csv file and displaying 2 xy graphs
Hi there, i'm having some memory issues with this little program.
What I'm trying to do is read a .csv file of 215 MB (6 million lines more or less), extract the x-y values as 1D arrays and display them in 2 XY graphs (VI attached).
I've noticed that this process eats from 1.6 to 2 GB of RAM, and the 2 XY graphs, as soon as they are loaded (2 minutes more or less), are really, really slow to move with the scrollbar.
My question is: is there a way to use fewer memory resources and make the graphs move more smoothly?
Thanks in advance,
Ierman Gert
Attachments:
read from file test.vi 106 KB
Hi Ierman,
how many datapoints do you need to handle? How many do you display on the graphs?
Some notes:
- Each graph has its own data buffer. So all data wired to the graph will be buffered again in memory. When wiring a (big) 1d array to the graph a copy will be made in memory. And you mentioned 2 graphs...
- load the array in parts: read a number of lines, parse them to arrays as before (maybe using "spreadsheet string to array"?), finally append the parts to build the big array (may lead to memory problems too).
- avoid datacopies when handling big arrays. You can show buffer creation using menu->tools->advanced->show buffer allocation
- use SGL instead of DBL when possible...
Message Edited by GerdW on 05-12-2009 10:02 PM
Best regards,
GerdW
CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
Kudos are welcome -
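GerdW's "load the array in parts" advice translates to any language; a hedged Python sketch of chunked reading that bounds peak memory by rows per chunk instead of holding all 6 million lines of intermediate strings at once:

```python
import csv
from itertools import islice

def read_xy_in_chunks(path, chunk_size=100_000):
    """Yield lists of (x, y) float pairs one chunk at a time, so peak
    memory is bounded by chunk_size rows instead of the whole file."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        while True:
            chunk = [(float(x), float(y)) for x, y in islice(reader, chunk_size)]
            if not chunk:
                break
            yield chunk
```

For display, the usual complement is to decimate each chunk down to roughly the number of screen pixels before wiring it to a graph, since a graph buffers its own copy of whatever it is given.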
How to load excel-csv file into oracle database
Hi
I wanted to load an Excel file with a csv extension (named trial.csv) into an Oracle database table (named dept). I have given my experiment below.
I am getting the following error; how should I rectify it?
For the ID column I have defined the datatype as number(5), and in my control file I have defined it as INTEGER EXTERNAL(5). Why, then, does the datatype for the ID column come out as CHARACTER in the error log file?
1)my oracle database table is
SQL> desc dept;
Name Null? Type
ID NUMBER(5)
DNAME CHAR(20)
2)my data file is(trial.csv)
ID DNAME
11 production
22 purchase
33 inspection
3)my control file is(trial.ctl)
LOAD DATA
INFILE 'trial.csv'
BADFILE 'trial.bad'
DISCARDFILE 'trial.dsc'
APPEND
INTO TABLE dept
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID POSITION(1:5) INTEGER EXTERNAL(5)
NULLIF ID=BLANKS,
DNAME POSITION(6:25) CHAR
NULLIF DNAME=BLANKS
)
4)my syntax at the cmd prompt is
c:\>sqlldr scott/tiger@xxx control=trial.ctl direct=true;
Load completed - logical record count 21.
5)my log file error message is
Column Name Position Len Term Encl Datatype
ID 1:5 5 , O(") CHARACTER
NULL if ID = BLANKS
DNAME 6:25 20 , O(") CHARACTER
NULL if DNAME = BLANKS
Record 1: Rejected - Error on table DEPT, column ID.
ORA-01722: invalid number
Record 21: Rejected - Error on table DEPT, column ID.
ORA-01722: invalid number
Table DEPT:
0 Rows successfully loaded.
21 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 21
Total logical records rejected: 21
Total logical records discarded: 0
Total stream buffers loaded by SQL*Loader main thread: 0
Total stream buffers loaded by SQL*Loader load thread: 0
6)Result
SQL> select * from dept;
no rows selected
by
balamuralikrishnan.s
Hi everyone!
The following are the steps to load an Excel sheet into an Oracle table. I have tried to be as simple as possible.
thanks
Step # 1
Prepare a data file (Excel sheet) to be uploaded to the Oracle table.
"Save As" the Excel sheet to ".csv" (comma separated values) format.
Then save it as a ".dat" file.
e.g. datafile.dat
1,Category Wise Consumption Summary,Expected Receipts Report
2,Category Wise Receipts Summary,Forecast Detail Report
3,Current Stock List Category Wise,Forecast rule listing
4,Daily Production Report,Freight carrier listing
5,Daily Transactions Report,Inventory Value Report
Step # 2
Prepare a control file that defines the data file to be loaded, the column separator, and the names and types of the table columns into which the data is to be loaded. The control file extension should be ".ctl".
e.g. I want to load the data into the tasks table from the datafile.dat file, using the controlfile.ctl control file.
SQL> desc tasks;
Name Null? Type
TASK_ID NOT NULL NUMBER(14)
TASK_NAME VARCHAR2(120)
TASK_CODE VARCHAR2(120)
: controlfile.ctl
LOAD DATA
INFILE 'e:\datafile.dat'
INTO TABLE tasks
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(TASK_ID INTEGER EXTERNAL,
TASK_NAME CHAR,
TASK_CODE CHAR)
The above is the structure for a control file.
Step # 3
The final step is to run the sqlldr command to execute the process.
sqlldr userid=scott/tiger@abc control=e:\controlfile.ctl log=e:\logfile.log
Message was edited by:
user578762 -
Error in date format when I load a CSV file
I am using Oracle G10 XE and I am trying to load data into my database from comma separated files.
When I load the data from a CSV file which has dates in the format "DD/MM/YYYY", I receive the error "ORA-01843: not a valid month".
I have NLS_LANG set to AMERICAN. I have tried the command ALTER SESSION SET NLS_DATE_FORMAT="DD/MM/YYYY" and this does nothing. When I run SELECT SYSDATE "NOW" FROM DUAL; I get the date in the format "10-NOV-06".
I would appreciate any help with migrating my data with date fields in the format DD/MM/YYYY.
Sincerely,
Polonio
See Re: Get error in date when I load a CSV file
-
How to load multiple CSV files into oracle in one go.
Hi,
I have a project requirement like this: I'll be getting one csv file as one record for the source table.
So I need to know whether there is any way I can load multiple csv files in one go. I have searched a lot on the net about external tables (but the problem is I can only use one consolidated csv at a time),
and with UTL_FILE it's the same thing: a consolidated file is required for the load.
But in my scenario I'll have 1000+ csv files (as records) for the table, and it would be hectic to open each csv file, copy the record in it and paste it into another file to make a consolidated one.
Please help me; it's a very new thing for me. I have used an external table for one csv file in the past, but here I have to use many files.
Table description given below.
desc td_region_position
POSITION_KEY NUMBER Primary key
POSITION_NAME VARCHAR2(20)
CHANNEL VARCHAR2(20)
LEVEL VARCHAR2(20)
IS_PARTNER CHAR(1)
MARKET_CODE VARCHAR2(20)
CSV file example:
POSITION_KEY|POSITION_NAME|CHANNEL|LEVEL|IS_PARTNER|MARKET_CODE
123002$$FLSM$$Sales$$Middle$$Y$$MDM2203
The delimiter is $$.
My database version is as follows:
Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit
PL/SQL Release 10.2.0.5.0 - Production
"CORE 10.2.0.5.0 Production"
TNS for IBM/AIX RISC System/6000: Version 10.2.0.5.0 - Production
NLSRTL Version 10.2.0.5.0 - Production
Edited by: 974253 on Dec 10, 2012 9:58 AM
If your csv files have some mask, say "mask*.csv", or files "mask1.csv", "mask2.csv" and so on,
you can create this.bat file
FOR %%c in (C:\tmp\loader\mask*.csv) DO (
c:\oracle\db\dbhome_1\BIN\sqlldr <user>@<sid>l/<password> control=C:\tmp\loader\loader.ctl data=%%c
)
and C:\tmp\loader\loader.ctl is
OPTIONS (ERRORS=0,SKIP=1)
LOAD DATA
APPEND
INTO TABLE scott.td_region_position
FIELDS TERMINATED BY '$$' TRAILING NULLCOLS
( POSITION_KEY,
POSITION_NAME ,
CHANNEL,
LVL,
IS_PARTNER,
MARKET_CODE
)
Test:
C:\Documents and Settings\and>sqlplus
SQL*Plus: Release 11.2.0.1.0 Production on Mon Dec 10 11:03:47 2012
Copyright (c) 1982, 2010, Oracle. All rights reserved.
Enter user-name: scott
Enter password:
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> select * from td_region_position;
no rows selected
SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Pr
oduction
With the Partitioning, OLAP, Data Mining and Real Application Testing options
C:\Documents and Settings\and>cd C:\tmp\loader
C:\tmp\loader>dir
Volume in drive C has no label.
Volume Serial Number is F87F-9154
Directory of C:\tmp\loader
12/10/2012 10:51 AM <DIR> .
12/10/2012 10:51 AM <DIR> ..
12/10/2012 10:55 AM 226 loader.ctl
12/10/2012 10:38 AM 104 mask1.csv
12/10/2012 10:39 AM 108 mask2.csv
12/10/2012 10:57 AM 151 this.bat
4 File(s) 589 bytes
2 Dir(s) 4,523,450,368 bytes free
C:\tmp\loader>this.bat
C:\tmp\loader>FOR %c in (C:\tmp\loader\mask*.csv) DO (c:\oracle\db\dbhome_1\BIN\
sqlldr user@orcl/password control=C:\tmp\loader\loader.ctl data=%c )
C:\tmp\loader>(c:\oracle\db\dbhome_1\BIN\sqlldr user@orcl/password control=C
:\tmp\loader\loader.ctl data=C:\tmp\loader\mask1.csv )
SQL*Loader: Release 11.2.0.1.0 - Production on Mon Dec 10 11:04:27 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
C:\tmp\loader>(c:\oracle\db\dbhome_1\BIN\sqlldr user@orcl/password control=C
:\tmp\loader\loader.ctl data=C:\tmp\loader\mask2.csv )
SQL*Loader: Release 11.2.0.1.0 - Production on Mon Dec 10 11:04:28 2012
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 1
C:\tmp\loader>sqlplus
SQL*Plus: Release 11.2.0.1.0 Production on Mon Dec 10 11:04:46 2012
Copyright (c) 1982, 2010, Oracle. All rights reserved.
Enter user-name: scott
Enter password:
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
SQL> select * from td_region_position;
POSITION_KEY POSITION_NAME CHANNEL LVL I
MARKET_CODE
123002 FLSM Sales Middle Y
MDM2203
123003 FLSM1 Sa2les M2iddle Y
MDM22203
SQL> -
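An alternative to looping sqlldr once per file, as the batch script above does: consolidate the 1000+ single-record files into one data file first, then run a single load. A hedged Python sketch (the directory layout and header handling are assumptions):

```python
import glob
import os

def consolidate(src_dir, out_path, skip_header=True):
    """Append every *.csv in src_dir into one file, keeping only the
    first file's header row when skip_header is True. Returns the
    number of source files merged."""
    files = sorted(glob.glob(os.path.join(src_dir, "*.csv")))
    with open(out_path, "w") as out:
        for i, path in enumerate(files):
            with open(path) as f:
                lines = f.read().splitlines()
            if skip_header and i > 0:
                lines = lines[1:]       # drop repeated header rows
            out.write("\n".join(lines) + "\n")
    return len(files)
```

With one consolidated file, the existing external-table approach (or a single sqlldr run with SKIP=1) works unchanged, and there is only one loader log to check instead of a thousand.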
Get error in date when I load a CSV file
I am using Oracle G10 XE and I am trying to load data into my database from comma separated files.
When I load the data from a CSV file which has dates in the format "DD/MM/YYYY", I receive the error "ORA-01843: not a valid month".
I have NLS_LANG set to AMERICAN. I have tried the command ALTER SESSION SET NLS_DATE_FORMAT="DD/MM/YYYY" and this does nothing. When I run SELECT SYSDATE "NOW" FROM DUAL; I get the date in the format "10-NOV-06".
I would appreciate any help with migrating my data with date fields in the format DD/MM/YYYY.
Sincerely,
Polonio
See this example:
[oracle work XE]$ cat test.dat
1,11/10/2006
2,01/11/2006
3,06/11/2006
[oracle work XE]$ cat test.ctl
load data
infile 'test.dat'
replace
into table test
fields terminated by ','
trailing nullcols
(
a integer external,
b "to_date(:b,'dd/mm/yyyy')"
)
[oracle work XE]$ sqlldr test/test control=test.ctl
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Nov 10 21:38:19 2006
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Commit point reached - logical record count 3
[oracle work XE]$ sqlplus test/test
SQL*Plus: Release 10.2.0.1.0 - Production on Fri Nov 10 21:38:25 2006
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to:
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
TEST@XE SQL> desc test
Name Null? Type
A NUMBER
B DATE
TEST@XE SQL> select * from test;
A B
1 11-OCT-06
2 01-NOV-06
3 06-NOV-06
TEST@XE SQL>
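The to_date(:b,'dd/mm/yyyy') expression in the control file above is the key: the date string is parsed with an explicit mask instead of relying on the session's default format. The same idea in a hedged Python sketch:

```python
from datetime import datetime

def parse_dmy(raw):
    """Parse a DD/MM/YYYY string with an explicit mask, mirroring
    to_date(:b, 'dd/mm/yyyy') in the SQL*Loader control file."""
    return datetime.strptime(raw, "%d/%m/%Y").date()

# 11/10/2006 is 11 October, not 10 November -- exactly the ambiguity
# behind the ORA-01843 "not a valid month" error when the session
# default expects a different field order.
d = parse_dmy("11/10/2006")
```

Either way, the lesson is the same: state the format explicitly at the point of conversion rather than depending on environment settings.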