SQL Loader not loading
Hi,
I have a client table with following description
CLIENT_ID NUMBER(10)
FIRST_NAME VARCHAR2(10)
LAST_NAME VARCHAR2(20)
I have client.dat file as
2001, Joe, Soap
2002, Jim, Bloggs
2003, Jimmy, Bean
I have SQL Loader control file client.ctl as
LOAD DATA
INFILE 'D:\client.dat'
INTO TABLE client APPEND
FIELDS TERMINATED BY "," TRAILING NULLCOLS
CLIENT_ID INTEGER,
FIRST_NAME CHAR(10),
LAST_NAME CHAR(20)
I am running sqlldr control=d:\client.ctl
I get the message
Commit point reached - logical record count 2
commit point reached - logical record count 3
But at the end of this I do not see these two records appended in the table.
Please help
Thanks
Hi!
Check whether a bad file was generated or not. If one was generated, see which records were rejected. Also check the corresponding log file and see what message it gives. So far, I don't think there is any error in your code.
Regards.
Satyaki De.
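As an aside for future readers, a few details in the control file above differ from the syntax SQL*Loader conventionally expects: APPEND normally precedes INTO TABLE, the column list is enclosed in parentheses, and numeric text such as 2001 is usually declared INTEGER EXTERNAL, since a plain INTEGER tells the loader to expect binary data. A hedged rewrite in the conventional form (same table and file names as the post):

```
LOAD DATA
INFILE 'D:\client.dat'
APPEND
INTO TABLE client
FIELDS TERMINATED BY "," TRAILING NULLCOLS
(
  CLIENT_ID  INTEGER EXTERNAL,
  FIRST_NAME CHAR(10),
  LAST_NAME  CHAR(20)
)
```

As the reply suggests, the .bad and .log files will confirm which rows actually went where.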
Similar Messages
-
Decode Not working in sql loader
I had a requirement to load a flat file into a staging table using SQL*Loader. One of the columns in the flat file holds the values FALSE or TRUE, and my requirement is to load 0 for FALSE and 1 for TRUE, which should be achievable with a simple DECODE function. I used DECODE and tried the load several times, but it did not work. What might be the problem?
LOAD DATA
INFILE 'sql_4ODS.txt'
BADFILE 'SQL_4ODS.badtxt'
APPEND
INTO TABLE members
FIELDS TERMINATED BY "|"
( Person_ID,
FNAME,
LNAME,
Contact,
status "decode(:status, 'TRUE', '1','FALSE','0')"
)
I did try TRIM as well as SUBSTR, but neither worked. The column just doesn't get any values in the output (just null, or free space).
Any help would be great.
Hello user8937215.
Please provide a create table statement and a sample of data file contents. I would expect DECODE or CASE to work based on the information provided.
Cheers,
Luke
Please mark the answer as helpful or answered if it is so. If not, provide additional details.
Always try to provide create table and insert table statements to help the forum members help you better. -
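For future readers: DECODE does work inside a SQL*Loader control file. Two hedged things to check, given the "column just ends up null" symptom: give the field an explicit CHAR type before the SQL string, and remember that a Windows-format file leaves a carriage return on the last field of each line, so the literal 'TRUE' never matches. A sketch (status as the last field, as in the post):

```
status CHAR "DECODE(TRIM(TRAILING CHR(13) FROM :status), 'TRUE', '1', 'FALSE', '0')"
```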
SQl loader is not working in 10g but working in 9i
I have installed both 9i and 10g on my PC. Could you please tell me why it is not working?
"SQL loader is not working" is rather like "my car won't start" http://tkyte.blogspot.com/2005/06/how-to-ask-questions.html
The 9i Oracle Client install includes sqlldr. The 10g Oracle Client install does NOT include sqlldr. You'd have to do a "Custom" Install and specifically select "Oracle Database Utilities" to install.
See MetaLink Note#437377.1
Edited by: Hemant K Chitale on Dec 31, 2008 3:58 PM
Edited by: Hemant K Chitale on Dec 31, 2008 4:00 PM -
Hello,
EBS version : 11.5.10.2
DB version : 11.2.0.3
OS version : AIX 6.1
As a part of 11.5.10.2 to R12.1.1 upgrade, while applying merged 12.1.1 upgrade driver(u6678700.drv), we got below error :
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ATTENTION: All workers either have failed or are waiting:
FAILED: file glsupdas.ldt on worker 3.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 adwork003.log
Restarting job that failed and was fixed.
Time when worker restarted job: Wed Aug 07 2013 10:36:14
Loading data using FNDLOAD function.
FNDLOAD APPS/***** 0 Y UPLOAD @SQLGL:patch/115/import/glnlsdas.lct @SQLGL:patch/115/import/US/glsupdas.ldt -
Connecting to APPS......Connected successfully.
Calling FNDLOAD function.
Returned from FNDLOAD function.
Log file: /fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log/US_glsupdas_ldt.log
Error calling FNDLOAD function.
Time when worker failed: Wed Aug 07 2013 10:36:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 US_glsupdas_ldt.log
Current system time is Wed Aug 7 10:36:14 2013
Uploading from the data file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt
Altering database NLS_LANGUAGE environment to AMERICAN
Dumping from LCT/LDT files (/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0), /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt) to staging tables
Dumping LCT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0) into FND_SEED_STAGE_CONFIG
Dumping LDT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt into FND_SEED_STAGE_ENTITY
Dumped the batch (GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS , GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS ) into FND_SEED_STAGE_ENTITY
Uploading from staging tables
Error loading seed data for GL_DEFAS_ACCESS_SETS: DEFINITION_ACCESS_SET = SUPER_USER_DEFAS, ORA-06508: PL/SQL: could not find program unit being called
Concurrent request completed
Current system time is Wed Aug 7 10:36:14 2013
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Below is info about file versions and INVALID packages related to GL.
PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG is invalid with error component 'GL_DEFAS_DBNAME_S' must be declared.
I can see GL_DEFAS_DBNAME_S is a VALID sequence accessible by apps user with or without specifying GL as owner.
SQL> select text from dba_source where name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG') and line=2;
TEXT
/* $Header: glistdds.pls 120.4 2005/05/05 01:23:16 kvora ship $ */
/* $Header: glistddb.pls 120.16 2006/04/10 21:28:48 cma ship $ */
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
SQL> select * from all_objects where object_name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG')
2 ; OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE
EDITION_NAME
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1118545 PACKAGE 05-AUG-13 05-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1118548 PACKAGE 05-AUG-13 06-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1128507 PACKAGE BODY 05-AUG-13 06-AUG-13 2013-08-06:12:56:50 INVALID N N N 2
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1128508 PACKAGE BODY 05-AUG-13 05-AUG-13 2013-08-05:19:43:51 VALID N N N 2
SQL> select * from all_objects where object_name='GL_DEFAS_DBNAME_S'; OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE
EDITION_NAME
GL GL_DEFAS_DBNAME_S 1087285 SEQUENCE 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
APPS GL_DEFAS_DBNAME_S 1087299 SYNONYM 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
SQL> conn apps/apps
Connected.
SQL> SELECT OWNER, OBJECT_NAME, OBJECT_TYPE, STATUS
FROM DBA_OBJECTS
WHERE OBJECT_NAME = 'GL_DEFAS_ACCESS_SETS_PKG'; 2 3 OWNER OBJECT_NAME OBJECT_TYPE STATUS
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE VALID
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE BODY INVALID SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE; Warning: Package altered with compilation errors. SQL> show error
No errors.
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE BODY; Warning: Package Body altered with compilation errors. SQL> show error
Errors for PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG: LINE/COL ERROR
39/17 PLS-00302: component 'GL_DEFAS_DBNAME_S' must be declared
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>cat $GL_TOP/patch/115/sql/glistdab.pls|grep -n GL_DEFAS_DBNAME_S
68: SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
81: fnd_message.set_token('SEQUENCE', 'GL_DEFAS_DBNAME_S');
SQL> show user
USER is "APPS"
SQL> SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
FROM dual; 2 -- with GL.
NEXTVAL
1002
SQL> SELECT GL_DEFAS_DBNAME_S.NEXTVAL from dual; --without GL. or using synonym.
NEXTVAL
1003
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdab.pls|grep '$Header'
REM | $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ |
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdas.pls |grep '$Header'
REM | $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ |
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */ -
SQL Loader problem: data is loaded, but UOM does not come in standard format
Hi guys,
I have a problem: when SQL*Loader runs, the data loads successfully into the table, but the UOM data does not come in a standard format.
The UOM table column contains the unit-of-measure data, but in my Excel sheet it looks like this:
EXCEl SHEET DATA:
1541GAFB07080 0 Metres
1541GAFE10040 109.6 Metres
1541GAFE10050 594.2 Metres
1541GAFE10070 126.26 Metres
1541GAFE14040 6.12 Metres
1541GAFE14050 0 Metres
1541SAFA05210 0 Metres
1541SAFA07100 0 Metres
1551EKDA05210 0 Nos
1551EKDA07100 0 Nos
1551EKDA07120 0 Nos
1551EKDA07140 0 Nos
1551EKDA07200 0 Nos.
1551EKDA08160 0 Nos.
1551EKDA08180 0 Nos.
1551EKDA08200 0 Nos.
1551EKDA10080 41 Nos.
1551EKDA10140 85 Nos.
.ctl file :
OPTIONS (silent=(header,feedback,discards))
LOAD DATA
INFILE *
APPEND
INTO TABLE XXPL_PO_REQUISITION_STG
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY'"'
TRAILING NULLCOLS
( ITEM_CODE CHAR,
ITEM_DESCRIPTION CHAR "TRIM(:ITEM_DESCRIPTION)",
QUANTITY,
UOM,
NEED_BY_DATE,
PROJECT,
TASK_NAME,
BUYER,
REQ_TYPE,
STATUS,
ORGANIZATION_CODE,
LOCATION,
SUBINVENTORY,
LINE_NO,
REQ_NUMBER,
LOADED_FLAG CONSTANT 'N',
SERIAL_NO "XXPL_PRREQ_SEQ.NEXTVAL",
CREATED_BY,
CREATION_DATE SYSDATE,
LAST_UPDATED_BY,
LAST_UPDATED_DATE,
LAST_UPDATED_LOGIN
)
Some output came in table like:
W541WDCA05260 0 Metres|
W541WDCA05290 3 Metres|
W541WDCA05264 4 Metres|
W541WDCA05280 8 Metres|
1551EADA04240 0 Nos|
1551EADA07100 0 Nos|
1551EKDA10080 0 Nos.|
1551EKDA10080 41 Nos.|
The problem is the '|' delimiter. How do I remove the ' | ' from my table when the SQL*Loader program runs? Where do I change this, in the .ctl file or in the Excel file? It's urgent, guys, please help me.
Thanks
Hi,
How are you extracting the data to Excel sheet ?
Please check the format type of the column in Excel sheet for UOM.
There is no issue in the SQL loader control file, but issue is there in your source excel file. (Try using a different method to extract the data to Excel sheet.)
Regards,
Yuvaraj.C -
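If re-extracting the Excel data is not practical, the stray trailing '|' could also be stripped inside the control file itself. A sketch, assuming the pipe only ever appears as trailing junk on the UOM field:

```
UOM CHAR "REPLACE(TRIM(:UOM), '|', '')"
```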
Sql loader not able to load more than 4.2 billion rows
Hi,
I am facing a problem with Sql loader utility.
The SQL*Loader process stops after loading 342 GB of data, i.e. it seems to load 4.294 billion rows out of 6.9 billion, and the loader process completes without throwing any error.
Is there any limit on the volume of data sql loader can load ?
Also, how can I identify that the SQL*Loader process has not loaded all the data from the file, given that it does not throw any error?
Thanks in advance.
It may be a problem with the DB configuration, not with SQL*Loader...
check all the parameters
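One hedged observation on the numbers in the question: 4.294 billion is almost exactly 2^32, which hints that some component in the load path is wrapping a 32-bit record counter, rather than the database imposing a data-volume limit:

```python
# The reported stopping point (~4.294 billion rows) is within a whisker
# of the 32-bit unsigned integer limit, suggesting a counter overflow.
limit_32bit = 2 ** 32
print(limit_32bit)  # 4294967296, i.e. ~4.294 billion

rows_reported = 4_294_000_000  # "4.294 billion rows" from the post
print(limit_32bit - rows_reported)  # the gap is under 0.03% of the limit
```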
Maximizing SQL*Loader Performance
SQL*Loader is flexible and offers many options that should be considered to maximize the speed of data loads. These include:
1. Use Direct Path Loads - The conventional path loader essentially loads the data by using standard insert statements. The direct path loader (direct=true) loads directly into the Oracle data files and creates blocks in Oracle database block format. The fact that SQL is not being issued makes the entire process much less taxing on the database. There are certain cases, however, in which direct path loads cannot be used (clustered tables). To prepare the database for direct path loads, the script $ORACLE_HOME/rdbms/admin/catldr.sql must be executed.
2. Disable Indexes and Constraints. For conventional data loads only, the disabling of indexes and constraints can greatly enhance the performance of SQL*Loader.
3. Use a Larger Bind Array. For conventional data loads only, larger bind arrays limit the number of calls to the database and increase performance. The size of the bind array is specified using the bindsize parameter. The bind array's size is equivalent to the number of rows it contains (rows=) times the maximum length of each row.
4. Use ROWS=n to Commit Less Frequently. For conventional data loads only, the rows parameter specifies the number of rows per commit. Issuing fewer commits will enhance performance.
5. Use Parallel Loads. Available with direct path data loads only, this option allows multiple SQL*Loader jobs to execute concurrently.
$ sqlldr control=first.ctl parallel=true direct=true
$ sqlldr control=second.ctl parallel=true direct=true
6. Use Fixed Width Data. Fixed width data format saves Oracle some processing when parsing the data. The savings can be tremendous, depending on the type of data and number of rows.
7. Disable Archiving During Load. While this may not be feasible in certain environments, disabling database archiving can increase performance considerably.
8. Use unrecoverable. The unrecoverable option (unrecoverable load data) disables the writing of the data to the redo logs. This option is available for direct path loads only.
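The sizing rule in point 3 is plain arithmetic. As an illustrative example (the figures are made up, not from the post), a batch of 64 rows with a maximum row length of 4,000 bytes needs a bind array of at least 256,000 bytes:

```python
# bind array size >= rows per batch * maximum row length (illustrative values)
rows_per_batch = 64   # sqlldr rows=64
max_row_len = 4000    # bytes per row, worst case
bindsize = rows_per_batch * max_row_len
print(bindsize)       # 256000 -> pass as sqlldr bindsize=256000
```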
Edited by: 879090 on 18-Aug-2011 00:23 -
Sql loader does not skip header rows from the second infile
Hi
I am new to Sql Loader and trying to figure out a way to load data into the table from 2 files which have different data in the same format. The files get successfully loaded into the database with just one glitch:
It skips the first 2 rows of the first file; however, it also takes the first 2 rows of the 2nd file, which I do not want. Can anyone help me with this issue? How can I restrict the loader from picking up the 2 header rows of the second file as well?
given below is the content of the control file
OPTIONS ( SKIP=2)
LOAD DATA
INFILE 'C:\loader\contacts\Italy_Wave11_Contacts.csv'
BADFILE 'C:\loader\contacts\Contacts.bad'
DISCARDFILE 'C:\loader\contacts\contacts.dsc'
infile 'C:\loader\contacts\Spain_Wave11_Contacts.csv'
BADFILE 'C:\loader\contacts\Contacts1.bad'
DISCARDFILE 'C:\loader\contacts\contacts1.dsc'
Truncate
into table V_contacts_dump
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(ASSISTANT_EMAIL_TX,
ASSISTANT_NM,
ASSISTANT_PHONE_TX,
BUSINESS_AREA_CD,
BUSINESS_EMAIL_TX,
BUSINESS_FAX_TX,
BUYER_ROLE_CD,
COMMENTS_TX,
COUNTRY_CD,
COUNTY_STATE_PROVINCE_CD,
DATE_OF_BIRTH_DT,
DO_NOT_CALL_IN,
DO_NOT_EMAIL_IN,
DO_NOT_MAIL_IN,
DOMESTIC_PARTNERS_NM,
FIRST_NM,
FULL_NM,
GENDER_CD,
INTERESTS_CD,
LAST_NM,
MIDDLE_NM,
MOBILE_TX,
OFFICE_PHONE_TX,
OWNER_PARTY_EMAIL,
PREFERRED_CONTACT_LANG_CD,
PREFERRED_CONTACT_CD,
PRIMARY_CONTACT_IN,
REFERRED_BY_TX,
SALUTATION_CD,
STAC_SRC_ID_TX,
STAC_STDS_SRC_ID,
STCN_SRC_ID_TX,
STREET_TX,
STREET2_TX,
STREET3_TX,
SUFFIX_TX,
TITLE_CD,
TOWN_CITY_TX,
ZIP_POSTAL_CD_TX )
It would be possible to call the loader twice, with only one input file in each run.
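A sketch of that two-run approach (file and table names from the post; the column list is abbreviated): with one INFILE per control file, SKIP=2 applies to each file's own header rows. Note the second run uses APPEND so it does not wipe the first run's rows, while the first keeps TRUNCATE.

```
-- italy.ctl (first run, TRUNCATE)
OPTIONS (SKIP=2)
LOAD DATA
INFILE 'C:\loader\contacts\Italy_Wave11_Contacts.csv'
Truncate
into table V_contacts_dump
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(ASSISTANT_EMAIL_TX, ...)

-- spain.ctl (second run, APPEND so the first file's rows survive)
OPTIONS (SKIP=2)
LOAD DATA
INFILE 'C:\loader\contacts\Spain_Wave11_Contacts.csv'
APPEND
into table V_contacts_dump
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
(ASSISTANT_EMAIL_TX, ...)
```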
-
Oracle 9i - did not install with SQL Loader
Installed Oracle 9i (Standard; Personal would not install) on XP Home Edition, but SQL Loader did not come with this, and I would like to import into 9i. How can I accomplish this? I cannot locate any such tools in the installation.
Thank you!!!
I think SQL Loader does not come with Oracle Personal Edition; at least it didn't on my 8.0.x Personal Edition.
But I think there are few options:
1. You may install the Enterprise Edition administrator client on the same machine. It comes with some admin tools like SQL*Loader (sqlldr, though the name may differ between versions). Be very careful about versions: if it might conflict with your existing version, consider installing into a new Oracle home, making a SQL*Net connection to your Personal Edition's home, and then SQL-loading over that.
2. The faster and preferred way is to install an ODBC driver and use some other Windows application, like FoxPro, to read the file and insert into Oracle via ODBC.
3. Use Oracle Forms to connect your database and use Text I/O commands to read file and then insert into database.
Hope this helps. -
SQL*Loader does not recognise the \ in directory
Hi there,
We have version OWB 9.2.0.2.8 and I am trying to run SQL*Loader. I have tried to use an external table and also got "cannot find file" which I suspect could be the same problem. I then tried to load the data with SQL*Loader and saw that the directory specification of the source data file has "mysteriously" lost the \'s.
Below is a copy of the .log file with data file specified incorrectly. It should be u:\bi\data\ocean_shipment.csv. Any ideas?
Data File: u:biocean_shipment.csv
Bad File: /opt/oracle/product/OWB/9.2.0/owb/temp/u:biocean_shipment.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 200 rows, maximum of 50000 bytes
Continuation: none specified
Path used: Conventional
Table "DWHSTG"."S_SHIPMENT_TYPES", loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
"SHIPMENT_CD" 1 * , O(") CHARACTER
"SHIPMENT_NAME" NEXT * , O(") CHARACTER
"LOAD_DATE" SYSDATE
SQL*Loader-500: Unable to open file (u:biocean_shipment.csv)
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
SQL*Loader-2026: the load was aborted because SQL*Loader cannot continue.
Jean-Pierre,
That was the first thing I checked - I even put it in (with \'s in the configuration parameters of the mapping). I also unregistered and re-registered the location and made sure I put the slashes in, but to no avail. I do believe there might be a bug outstanding for locations with directory separators - I will check on MetaLink and let you know. -
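One hedged workaround while the bug is investigated: Oracle utilities on Windows generally accept forward slashes in file paths, and those tend to survive tooling layers that swallow backslashes:

```
INFILE 'u:/bi/data/ocean_shipment.csv'
```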
According to SQL*Loader's log file, I was successful in loading my data into a table. However, the table does not show any of the loaded values - all empty.
Here's my table:
SQL> desc patients
Name Null? Type
PATIENT_ID NOT NULL NUMBER(38)
MEDSUBSCRIBERID VARCHAR2(200)
MEDMEMBERID VARCHAR2(50)
MEDDEPSEQDESC VARCHAR2(50)
MEDMEMBERGENDER VARCHAR2(50)
MEDMEMBERDOB DATE
MEDDEPSEQCODE VARCHAR2(50)
NETWORKID NUMBER(38)
NETWORKCODE VARCHAR2(50)
PATIENTAGE NUMBER(38)
PATIENTNAME VARCHAR2(50)
PATIENTLASTNAME VARCHAR2(50)
PATIENTSEX VARCHAR2(50)
PATIENTSSN VARCHAR2(50)
INTMEDSUBSCRIBERID VARCHAR2(50)
DATE_VALUE_ID NUMBER(38)
DOB VARCHAR2(50)
AGE_GROUP_ID NUMBER(38)
ORGANIZATIONNETWORK_ID NUMBER(38)
ORGANIZATION_ID NUMBER(38)
ORGANIZATIONCODE VARCHAR2(50)
NETWORKID_HOLD NUMBER(38)
PATIENTRELATION VARCHAR2(50)
Here's my SQL Loader Control File:
LOAD DATA
INFILE 'patients.dat'
APPEND
INTO TABLE PATIENTS
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
PATIENT_ID INTEGER(8) NULLIF(PATIENT_ID=BLANKS)
, MEDSUBSCRIBERID CHAR NULLIF(MEDSUBSCRIBERID=BLANKS)
, MEDMEMBERID CHAR NULLIF(MEDMEMBERID=BLANKS)
, MEDDEPSEQDESC CHAR NULLIF(MEDDEPSEQDESC=BLANKS)
, MEDMEMBERGENDER CHAR NULLIF(MEDMEMBERGENDER=BLANKS)
, MEDMEMBERDOB DATE "YYYY-MM-DD HH24:MI:SS" NULLIF(MEDMEMBERDOB=BLANKS)
, MEDDEPSEQCODE CHAR NULLIF(MEDDEPSEQCODE=BLANKS)
, NETWORKID INTEGER(8) NULLIF(NETWORKID=BLANKS)
, NETWORKCODE CHAR NULLIF(NETWORKCODE=BLANKS)
, PATIENTAGE INTEGER(8) NULLIF(PATIENTAGE=BLANKS)
, PATIENTNAME CHAR NULLIF(PATIENTNAME=BLANKS)
, PATIENTLASTNAME CHAR NULLIF(PATIENTLASTNAME=BLANKS)
, PATIENTSEX CHAR NULLIF(PATIENTSEX=BLANKS)
, PATIENTSSN CHAR NULLIF(PATIENTSSN=BLANKS)
, INTMEDSUBSCRIBERID CHAR NULLIF(INTMEDSUBSCRIBERID=BLANKS)
, DATE_VALUE_ID INTEGER(8) NULLIF(DATE_VALUE_ID=BLANKS)
, DOB CHAR NULLIF(DOB=BLANKS)
, AGE_GROUP_ID INTEGER(8) NULLIF(AGE_GROUP_ID=BLANKS)
, ORGANIZATIONNETWORK_ID INTEGER(8) NULLIF(ORGANIZATIONNETWORK_ID=BLANKS)
, ORGANIZATION_ID INTEGER(8) NULLIF(ORGANIZATION_ID=BLANKS)
, ORGANIZATIONCODE CHAR NULLIF(ORGANIZATIONCODE=BLANKS)
, NETWORKID_HOLD INTEGER(8) NULLIF(NETWORKID_HOLD=BLANKS)
, PATIENTRELATION CHAR NULLIF(PATIENTRELATION=BLANKS)
Here's data file that I'm trying to load:
2|001321666|00132156801||M|1944-01-22 00:00:00||42|7215||Donald|Duck||||52616|19440666|4|42|2|WELLS|42|E
3|001321999|00132156802||F|1951-09-01 00:00:00||42|7215||Mickey|Mouse||||55395|19510999|3|42|2|WELLS|42|W
4|001363888|00136366201||M|1955-05-22 00:00:00||42|7215||Daffy|Duck||||56754|19550777|3|42|2|WELLS|42|E
And, my log file after executing sql loader successfully but the data does not load:
Table PATIENTS:
3 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 251776 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 3
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Wed Aug 20 11:51:51 2008
Run ended on Wed Aug 20 11:51:57 2008
Elapsed time was: 00:00:05.47
CPU time was: 00:00:00.02
Very strange.
Did you connect as the same user as SQL*Loader when you checked the table?
Is it possible that someone else deleted the data? -
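A few hedged checks along the lines of that reply (the table name is from the post; the queries themselves are illustrative): confirm which schema owns the rows and which database the session is actually on, since loading into one schema's PATIENTS and querying another's produces exactly this symptom.

```sql
-- Run these as the same user/connect string that sqlldr used:
SELECT COUNT(*) FROM patients;

-- Is there more than one PATIENTS table, e.g. in another schema?
SELECT owner, table_name FROM all_tables WHERE table_name = 'PATIENTS';

-- Which user and database is this session actually on?
SELECT USER, SYS_CONTEXT('USERENV', 'DB_NAME') FROM dual;
```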
SQL*Loader-971: parallel load option not allowed when loading lob columns
Hi,
I am trying to load a table, which has a VARRAY column, using DIRECT=TRUE and PARALLEL=TRUE through
Sql *Loader 10.2.0.4.0
OS: Sun Solaris 10 SPARC 64-bit,
Database: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0
The following error was received:
SQL*Loader-971: parallel load option not allowed when loading lob columns
Please help me to resolve..
Thanks and regards
Anji
user8836881 wrote:
Hi,
I am trying to load a table, which has a VARRAY column, using DIRECT=TRUE and PARALLEL=TRUE through
Sql *Loader 10.2.0.4.0
OS: Sun Solaris 10 SPARC 64-bit,
Database: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0
The following error was received:
SQL*Loader-971: parallel load option not allowed when loading lob columns
Please help me to resolve..
Thanks and regards
Anji
http://tinyurl.com/yhxdhnt
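The shortened link above is opaque now, but the error text is self-describing: a direct path load into a table with LOB or VARRAY columns cannot be parallelized. The usual workaround (a sketch; expect a slower but successful load) is simply to drop the parallel option:

```
sqlldr userid=... control=load_varray.ctl direct=true parallel=false
```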
Sql Loader by using shell script, not able to insert data
Hi,
I am trying to load the data by using a shell script (the shell script contains the sqlldr command; it is a host-executable-method concurrent program).
When I load the data with my files (.ctl, .prog, .csv, and a symbolic link for the .prog) placed in $Custom_top/bin, it loads exactly: 17000 records are inserted.
But if I load the data with my files placed in custom folders under $custom_top, it is unable to insert all the data: only 43 records are inserted.
Please any one can help me.
Thanks in advance.
Rama.
Srini, thanks a lot for your reply.
Oracle Apps version R12,
Microsoft Windows XP Professional
Version 2002 service Pack 3
My Control file Script is:
load data
infile '$XADP_TOP/data/CPIU/in/XXOKS_Price_Increase.csv'
append
into table XXOKS_CONTRACT_PRICE_INCR_DTLS
fields terminated BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(EXCLUSION_FLAG,
LEGACY_NUMBER,
CUSTOMER_NUMBER,
CUSTOMER_NAME,
REQUEST_ID,
CONTRACT_NUMBER,
CONTRACT_START_DATE,
CONTRACT_END,
REQUEST_LINE_ID,
LINE_START_DATE,
LINE_END_DATE,
ITEM_NUMBER,
ITEM_DESCRIPTION,
UNIT_PRICE,
QTY,
NEW_UNIT_PRICE,
LINE_AMOUNT,
NEW_LINE_AMOUNT,
PRICE_INCREASED_DATE,
PERCENTAGE_INCREASED,
ORIGINAL_CONTRACT_AMOUNT,
NEW_CONTRACT_AMOUNT,
PRICE_INCREASE_AMOUNT)
My .prog file is below. Please note that I also created a symbolic link for the .prog.
if [ -z $XADP_TOP ];then
echo "XADP_TOP environment variable is not set!"
exit 1
fi
cd $XADP_TOP/data/CPIU/in
DATE=`date +%y%m%d:%H%M`
i_program_name="$0"
i_ora_pwd="$1"
i_user_id="$2"
i_user_name="$3"
i_request_id="$4"
i_ftp_host_name="$5"
i_ftp_user_name="$6"
i_ftp_user_password="$7"
ftp_prog() {
# FTP Function to reuse the FTP Commands
if [ $# -ne 6 ];then
echo "Usage : ftp_prog <Hostname> <User name> <Password> <Remote Directory> <command> <filename>"
exit 2
fi
l_ftp_host_name="$1"
l_ftp_user_name="$2"
l_ftp_user_password="$3"
l_ftpdir="$4"
l_ftp_command="$5"
l_ftp_filename="$6"
ftp -v -n ${l_ftp_host_name} <<EOF
user ${l_ftp_user_name} ${l_ftp_user_password}
ascii
cd ${l_ftpdir}
${l_ftp_command} ${l_ftp_filename}
quit
EOF
#exit $?
}
# setting the ftp directory
#ftpdir="/`echo ${TWO_TASK:-$ORACLE_SID}|tr "[A-Z]" "[a-z]"`/CPIU"
##ftpdir="/FinTEST/quoting/PS/ar"
ftpdir="$XADP_TOP/data/CPIU/in"
# setting the in directory and out directory
indir="$XADP_TOP/data/CPIU/in"
outdir="$XADP_TOP/data/CPIU/out"
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} get XXOKS_Price_Increase.csv
echo $ftpdir
echo "Converting the data file into unix mode"
dos2unix XXOKS_Price_Increase.csv XXOKS_Price_Increase.csv
chmod 777 XXOKS_Price_Increase.csv
cd $XADP_TOP/bin
echo "Trying to excute sqlldr and entering into the into control file"
$ORACLE_HOME/bin/sqlldr userid=$i_ora_pwd control=XXOKS_PRICE_INCR_LOAD log=$XADP_TOP/log/XXOKS_PRICE_INCR_LOAD_${DATE}.log;
exit_status=$?
echo "Checking the status and giving permissions to the data file which in in dir"
if [ $exit_status -eq 0 ]; then
cd $XADP_TOP/data/CPIU/in
chmod 777 XXOKS_Price_Increase.csv
echo "try to move data file into out dir"
# Moving the file to out directory
mv XXOKS_Price_Increase.csv ${outdir}/XXOKS_Price_Increase.csv_${DATE}
#echo "ready to zip file in out dir step6"
# Zipping the file
#gzip -f ${outdir}/XXOKS_Price_Increase.csv_${DATE}
echo "deleting the file which is in dir"
# Deleting the file from in directory
/bin/rm -f ${indir}/XXOKS_Price_Increase.csv
# Deleting from the remote directory
ftp_prog ${i_ftp_host_name} ${i_ftp_user_name} ${i_ftp_user_password} ${ftpdir} delete XXOKS_Price_Increase.csv
echo "sqlloader finished successfully."
else
echo "Error in loader"
##echo "Loader error in Price Increase Detials File ${i_file}"
fi
exit $exit_status
And My Log file Comments are
SQL*Loader: Release 10.1.0.5.0 - Production on Thu Dec 3 01:32:08 2009
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: XXOKS_PRICE_INCR_LOAD.ctl
Data File: /oesapp/applmgr/GIS11/apps/apps_st/appl/xadp/12.0.0/data/CPIU/in/XXOKS_Price_Increase.csv
Bad File: XXOKS_Price_Increase.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table XXOKS_CONTRACT_PRICE_INCR_DTLS, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
EXCLUSION_FLAG FIRST * , O(") CHARACTER
LEGACY_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NUMBER NEXT * , O(") CHARACTER
CUSTOMER_NAME NEXT * , O(") CHARACTER
REQUEST_ID NEXT * , O(") CHARACTER
CONTRACT_NUMBER NEXT * , O(") CHARACTER
CONTRACT_START_DATE NEXT * , O(") CHARACTER
CONTRACT_END NEXT * , O(") CHARACTER
REQUEST_LINE_ID NEXT * , O(") CHARACTER
LINE_START_DATE NEXT * , O(") CHARACTER
LINE_END_DATE NEXT * , O(") CHARACTER
ITEM_NUMBER NEXT * , O(") CHARACTER
ITEM_DESCRIPTION NEXT * , O(") CHARACTER
UNIT_PRICE NEXT * , O(") CHARACTER
QTY NEXT * , O(") CHARACTER
NEW_UNIT_PRICE NEXT * , O(") CHARACTER
LINE_AMOUNT NEXT * , O(") CHARACTER
NEW_LINE_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASED_DATE NEXT * , O(") CHARACTER
PERCENTAGE_INCREASED NEXT * , O(") CHARACTER
ORIGINAL_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
NEW_CONTRACT_AMOUNT NEXT * , O(") CHARACTER
PRICE_INCREASE_AMOUNT NEXT * , O(") CHARACTER
value used for ROWS parameter changed from 64 to 43
Table XXOKS_CONTRACT_PRICE_INCR_DTLS:
43 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 255162 bytes(43 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 43
Total logical records rejected: 0
Total logical records discarded: 0
Run began on Thu Dec 03 01:32:08 2009
Run ended on Thu Dec 03 01:32:08 2009
Elapsed time was: 00:00:00.19
CPU time was: 00:00:00.04
Please help me, Srini.
Thanks in advance
Rama.. -
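One hedged reading of that log: SQL*Loader itself reports success, but "Total logical records read: 43" means the file it actually opened only contained 43 rows, not 17000, which points at the FTP/dos2unix steps (or the $XADP_TOP path baked into the control file) delivering a truncated or wrong copy. Comparing line counts before invoking sqlldr is a cheap guard (the file name here is a stand-in):

```shell
# Sanity check: does the file SQL*Loader will read hold all expected rows?
printf 'row1\nrow2\nrow3\n' > /tmp/price_increase_demo.csv  # stand-in data file
lines=$(wc -l < /tmp/price_increase_demo.csv)
echo "$lines"  # compare against "Total logical records read" in the log
```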
SQL Loader deletes data, after reporting a file not found error
I have several control files beginning:
LOAD DATA
INFILE <dataFile>
REPLACE INTO TABLE <tableName>
FIELDS TERMINATED BY '<separator>'
When running SQL Loader, in the case of one particular control file, if the file referenced does not exist, SQL Loader first reports that the file could not be found, but then proceeds to delete all the data in the table in which the import was meant to take place. The corresponding log file reveals that the file could not be found, but also states that 0 records were loaded and 0 records were skipped.
In the case of all other control files, if the file is not found, the log files simply report this exception but do not show any statisitcs about the number of records loaded/skipped nor does SQL Loader delete the data in any of the referenced tables. This is obviously the expected behaviour.
Why is SQL Loader deleting the data referenced by one particular control file, even though this file does not exist and the corresponding log file has correctly picked up on this?
In the resource name box of your file model, when you push the search button ("..."), do you see the file?
The problem can occur when you write the path directly instead of selecting the file with the assistant.
Try this.
I also suspect that you can't see the data by right-clicking and selecting View Data?
Let me know how it goes... -
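Whatever the root cause turns out to be, a defensive wrapper avoids the destructive combination of REPLACE (which deletes the table's existing rows at the start of the load) with a missing feed: check for the data file before ever invoking sqlldr. A sketch with illustrative paths:

```shell
# Only run the (REPLACE) load when the data file actually exists.
datafile=/tmp/nonexistent_feed_demo.csv
rm -f "$datafile"   # ensure the demo starts with no file present

if [ -f "$datafile" ]; then
  echo "would run: sqlldr control=load.ctl data=$datafile"
else
  echo "missing $datafile - skipping load"
fi
```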
SQl loader not loading records
I have my control file like this
options (skip=1)
LOAD DATA
INFILE xxx.csv
into table xxx
TRUNCATE
FIELDS TERMINATED BY ',' optionally enclosed by '"'
RECORD_STATUS,
ITEM_NUMBER,
SQL Loader is not loading all the records, and gives messages like:
Commit point reached - logical record count 14
Commit point reached - logical record count 26
Commit point reached - logical record count 84
Commit point reached - logical record count 92
and successfully loaded only 41 records out of 420 records.
Please help me.
Hi Phiri,
Thanks for your reply. Here is the log file.
SQL*Loader: Release 8.0.6.3.0 - Production on Wed May 12 21:26:30 2010
(c) Copyright 1999 Oracle Corporation. All rights reserved.
Control File: saba_price_break_allcur_test.ctl
Data File: saba_price_break_allcur_test.csv
Bad File: saba_price_break_allcur_test.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 1
Errors allowed: 50
Bind array: 64 rows, maximum of 65536 bytes
Continuation: none specified
Path used: Conventional
Table SABA_PRICE_BREAK_ALLCUR_TEST, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
RECORD_STATUS FIRST * , O(") CHARACTER
ITEM_NUMBER NEXT * , O(") CHARACTER
PA1 NEXT * , O(") CHARACTER
PA2 NEXT * , O(") CHARACTER
UOM_CODE NEXT * , O(") CHARACTER
RANGE_PRICING NEXT * , O(") CHARACTER
RANGE_FROM NEXT * , O(") CHARACTER
RANGE_TO NEXT * , O(") CHARACTER
PRICING_ATTRIBUTE NEXT * , O(") CHARACTER
PRICING_METHOD NEXT * , O(") CHARACTER
PRICE_BREAK_LINE_NO NEXT * , O(") CHARACTER
TEMPLATE_NAME NEXT * , O(") CHARACTER
ITEM_DESC NEXT * , O(") CHARACTER
PRICE_USD NEXT * , O(") CHARACTER
PRICE_EUR NEXT * , O(") CHARACTER
PRICE_GBP NEXT * , O(") CHARACTER
PRICE_JPY NEXT * , O(") CHARACTER
GL_ACCOUNT NEXT * , O(") CHARACTER
LONG_DESC NEXT * , O(") CHARACTER
STATUS NEXT * , O(") CHARACTER
MESSAGE NEXT * , O(") CHARACTER
Record 12: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 13: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
Record 27: Rejected - Error on table SABA_PRICE_BREAK_ALLCUR_TEST, column LONG_DESC.
Field in data file exceeds maximum length
(The identical rejection — "Field in data file exceeds maximum length" on column LONG_DESC — repeats for records 28-70, 73-74, 87, 91 and 92.)
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table SABA_PRICE_BREAK_ALLCUR_TEST:
41 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 65016 bytes(12 rows)
Space allocated for memory besides bind array: 0 bytes
Total logical records skipped: 1
Total logical records read: 92
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Wed May 12 21:26:30 2010
Run ended on Wed May 12 21:27:06 2010
Elapsed time was: 00:00:36.08
CPU time was: 00:00:00.00
-
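The "Field in data file exceeds maximum length" rejections in the log above usually come from SQL*Loader itself, not from the table column: delimited character fields default to a maximum of 255 bytes regardless of the column's width. A hedged sketch of the fix — assuming LONG_DESC is defined as, say, VARCHAR2(4000) in the table — is to widen the field explicitly in the control file's field list:

```
...
GL_ACCOUNT,
-- raise the SQL*Loader field buffer from the 255-byte default
-- so long descriptions are no longer rejected
LONG_DESC CHAR(4000),
STATUS,
MESSAGE
...
```

If rows are still rejected after this, compare the longest value in the data file against the actual column width, since a value genuinely wider than the column will fail either way.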
I have Oracle 11g R2 running on Windows Server 2008 R2
When I try to load data (a csv file) with SQL*Loader in Enterprise Manager, it tells me the job completed successfully, but there is no data in the table. The bat file is created, as are the ctl and sh files. I have checked that sqlldr is in my $ORACLE_HOME$\bin directory and that this path is also in my PATH environment variable. When I try to run it from the command line I get "not recognized as an internal or external command, operable program or batch file". When I run it in SQL Developer, it says the task was cancelled. Can you help? I am expecting a 70-90 million record file next week and I want to use SQL*Loader.
Message in Enterprise Manager
Status Succeeded
Exit Code 0
Step ID 31327
Targets leads.global
Started May 10, 2013 3:55:14 PM (UTC-07:00)
Ended May 10, 2013 3:55:22 PM (UTC-07:00)
Step Elapsed Time 8 seconds
Management Service WIN-D1CINRVM11K:1158_Management_Service
TIP Management Service from which the job step was dispatched.
Output Log
Username:
SQL*Loader: Release 11.2.0.1.0 - Production on Fri May 10 15:55:15 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Load completed - logical record count 51.I usually run SQL*Loader from SQL*Plus, since that avoids some of the privilege issues. I also use a log file, so that I can check it afterwards to see what happened and why. For example:
host sqlldr scott/tiger control=test.ctl log=test.log
A frequent issue is lack of privileges to the data file to be loaded. You can edit a blank file in SQL*Plus and copy and paste the data into it, to eliminate such problems or identify them, so that you can check privileges on the original file.
To provide much else, we would need more complete information, such as a few rows of sample data, your SQL*Loader control file, and your table structure.
-
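Before attempting the 70-90 million record load, a minimal self-contained smoke test can rule out environment and privilege problems. The table EMP_TEST, the file names, and the two columns below are invented purely for illustration:

```
-- test.ctl: a hypothetical minimal control file
LOAD DATA
INFILE 'test.dat'
BADFILE 'test.bad'
INTO TABLE emp_test APPEND
FIELDS TERMINATED BY ","
( emp_id,
  emp_name )
```

Run it from SQL*Plus as shown in the reply above (`host sqlldr scott/tiger control=test.ctl log=test.log`) and then read test.log: it reports how many records were read, loaded, and rejected, which is far more informative than the Enterprise Manager exit code alone.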
SQL Loader- Default value is not inserted in table by direct method
Hi All,
I am trying to load data from a file into a table with SQL*Loader. There is one date column that I am not loading from the file; instead I have used a DEFAULT constraint to put SYSDATE in it. When I use the direct path load, nothing is inserted into this column, but with the conventional path it inserts SYSDATE. However, I want to load with the direct path method. Can anybody suggest how to load a default value in a direct path load?
Thanks & regards.
Sudipta
For this special case, you can specify:
column_name SYSDATE
in your SQL*Loader control file.
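In context, that looks like the sketch below (the table and column names are invented for illustration):

```
LOAD DATA
INFILE 'orders.dat'
INTO TABLE orders APPEND
FIELDS TERMINATED BY ","
( order_id,
  order_desc,
  -- SYSDATE is supplied by SQL*Loader itself rather than by the
  -- column's DEFAULT, so it is populated even with DIRECT=TRUE
  created_date SYSDATE )
```

This works because the direct path bypasses the SQL layer that would normally apply the column's DEFAULT constraint, whereas a value named in the control file is written by SQL*Loader in both conventional and direct path loads.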