Loading data with an automatic process
Hello folks:
One of my customers sent an Access database (*.mdb) file.
Right now a developer takes this file, opens it, extracts the specific table, saves its content into a CSV file, and finally imports that content into a temporary table to process the data following certain rules.
But this is a manual process.
I want to develop a module to open the .mdb file, save the Access table into a CSV file, and import its data into a temporary Oracle table.
My questions are:
1. Can I open an .mdb file from within a form, using some command?
2. Can I read a CSV file in a module and then use its content to load a table?
3. Can I show the CSV file content in a Forms block?
Thanks in advance.
Abdel Miranda
AEMS Global Group
Panama
Edited by: aemsmeg on May 7, 2009 2:15 PM
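On question 2: yes, client-side Forms PL/SQL can read a text file with the TEXT_IO built-in package. A minimal sketch (the file path is made up, and the parsing/insert logic is left as a comment):

```sql
-- Forms PL/SQL sketch: read a CSV file line by line with TEXT_IO
DECLARE
  in_file  TEXT_IO.FILE_TYPE;
  line_buf VARCHAR2(4000);
BEGIN
  in_file := TEXT_IO.FOPEN('c:\data\export.csv', 'r');
  LOOP
    -- GET_LINE raises NO_DATA_FOUND when the end of the file is reached
    TEXT_IO.GET_LINE(in_file, line_buf);
    -- parse line_buf here and either populate a block item
    -- or INSERT into the temporary Oracle table
  END LOOP;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    TEXT_IO.FCLOSE(in_file);
END;
```

Opening the .mdb itself from Forms (question 1) is a different matter: TEXT_IO only reads text files, so the Access-to-CSV step still has to happen first (e.g. via Access automation or an export on the customer's side).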
Hello:
Basically I want to know if it is possible to connect to MS Access from my Oracle Forms.
But there are some facts you need to know:
1. I cannot make any change to the database configuration and/or setup.
2. We cannot use Heterogeneous Services.
I was reading some notes in this forum and in Metalink, and some of them talk about Net8 configuration, tnsnames and ODBC.
A long time ago [about 5 years] I set up my notebook to connect Oracle Forms to MySQL. I cannot remember how I did it, but as far as I recall it was easy.
But now, after reading all those articles and notes, I am not quite sure whether connecting my Oracle Forms [or my SQL*Plus] is as easy as it was with MySQL.
Even more so because on that notebook I had Oracle developer tools, an Oracle database and some other Oracle products on the same machine, so I could change, re-install, and do whatever I wanted with it.
But now I am using my company machine and I only have installed: Oracle Forms [6i, 10g], JDeveloper [133, 134, 11g], Oracle Objects for OLE, and that's it.
Can I do what I want?
Thanks in advance for all your help.
Abdel Miranda
AEMS Global Group
Panama
Similar Messages
-
SQL*Loader-704 and ORA-12154 error messages when trying to load data with SQL*Loader
I have a database with two tables that is used by Apex 4.2. One table has 800,000 records; the other has 7 million records.
The client recently upgraded from Apex 3.2 to Apex 4.2. We exported/imported the data to the new location with no problems.
The source of the data is an old mainframe system; I needed to make changes to the source data and then load the tables.
The first time I loaded the data, I did it from a command line with SQL*Loader.
Now when I try to load the data I get this message:
SQL*Loader-704: Internal error: ulconnect: OCISERVERATTACH
ORA-12154: TNS:could not resolve the connect identifier specified
I've searched for postings on these error messages, and they all seem to say that SQL*Loader can't find my TNSNAMES file.
I am able to connect and load data with SQL Developer, so SQL Developer is able to find the TNSNAMES file.
However, SQL Developer will not let me load a file this big.
I have also tried to load the file within Apex (SQL Workshop / Utilities), but again the file is too big.
So it seems like SQL*Loader is the only option.
I did find one post online that said to set an environment variable with the path to the TNSNAMES file, but that didn't work.
Not sure what else to try or where to look.
Thanks
Hi,
You must have more than one tnsnames file or multiple installations of Oracle. What I suggest you do (as I'm sure is mentioned in ed's link that you were already pointed at) is the following (I assume you are on Windows?):
open a command prompt
set TNS_ADMIN=PATH_TO_DIRECTORY_THAT_CONTAINS_CORRECT_TNSNAMES_FILE (i.e. something like set TNS_ADMIN=c:\oracle\network\admin)
This tells Oracle to use the config files it finds there and no others.
Then try sqlldr user/pass@db (in the same DOS window),
see if that connects, and let us know.
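For reference, the kind of entry sqlldr has to be able to resolve looks like this (every name below is a placeholder; the alias DB is what goes after the @ in sqlldr user/pass@DB):

```sql
DB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = orcl.example.com))
  )
```

If no such alias exists in the tnsnames.ora that TNS_ADMIN points at, ORA-12154 is exactly what you get.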
Cheers,
Harry
http://dbaharrison.blogspot.com -
SQL*Loader: load data with format MM/DD/YYYY HH:MI:SS PM
Please advise how to load data with the format MM/DD/YYYY HH:MI:SS PM into an Oracle table using SQL*Loader.
- What format should I give in the control file?
- What column type should I use when creating the table to load the data?
Sample data below;
MM/DD/YYYY HH:MI:SS PM
12/9/2012 2:40:20 PM
11/29/2011 11:23:12 AM
Thanks in advance,
Avinash
Hello Srini,
I tried the creation date as a DATE datatype, but I got this error:
ORA-01830: date format picture ends before converting entire input string
I am running SQL*Loader from the Oracle R12 EBS front end.
The contents of my control file are:
{code}
LOAD DATA
INFILE "$_FileName"
REPLACE
INTO TABLE po_recp_int_lines_stg
WHEN (01) = 'L'
FIELDS TERMINATED BY "|"
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
INDICATOR POSITION(1) CHAR,
TRANSACTION_MODE "TRIM(:TRANSACTION_MODE)",
RECEIPT_NUMBER "TRIM(:RECEIPT_NUMBER)",
INTERFACE_SOURCE "TRIM(:INTERFACE_SOURCE)",
RECEIPT_DATE "TO_CHAR(TO_DATE(:RECEIPT_DATE,'MM/DD/YYYY'),'DD-MON-YYYY')",
QUANTITY "TRIM(:QUANTITY)",
PO_NUMBER "TRIM(:PO_NUMBER)",
PO_LINE_NUMBER "TRIM(:PO_LINE_NUMBER)",
CREATION_DATE "TO_CHAR(TO_DATE(:CREATION_DATE,'MM/DD/YYYY HH:MI:SS AM'),'DD-MON-YYYY HH:MI:SS AM')",
ERROR_MESSAGE "TRIM(:ERROR_MESSAGE)",
PROCESS_FLAG CONSTANT 'N',
CREATED_BY "fnd_global.user_id",
LAST_UPDATE_DATE SYSDATE,
LAST_UPDATED_BY "fnd_global.user_id"
)
{code}
My data file goes like
{code}
H|CREATE|123|ABC|12/10/2012||||
L|CREATE|123|ABC|12/10/2012|100|PO12345|1|12/9/2012 2:40:20 PM
L|CORRECT|123|ABC|12/10/2012|150|PO12346|2|11/29/2011 11:23:12 AM{code}
Below is the desc of the table
{code}
INDICATOR VARCHAR2 (1 Byte)
TRANSACTION_MODE VARCHAR2 (10 Byte)
RECEIPT_NUMBER NUMBER
INTERFACE_SOURCE VARCHAR2 (20 Byte)
RECEIPT_DATE DATE
QUANTITY NUMBER
PO_NUMBER VARCHAR2 (15 Byte)
PO_LINE_NUMBER NUMBER
CREATION_DATE TIMESTAMP(0)
ERROR_MESSAGE VARCHAR2 (4000 Byte)
PROCESS_FLAG VARCHAR2 (5 Byte)
CREATED_BY NUMBER
LAST_UPDATE_DATE DATE
LAST_UPDATED_BY NUMBER {code}
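The ORA-01830 here most likely comes from the double conversion: TO_CHAR(TO_DATE(...)) hands SQL*Loader a character string, and the implicit string-to-TIMESTAMP conversion on insert then depends on the session's NLS_TIMESTAMP_FORMAT. A sketch of a more direct field clause (not the poster's verified fix; it assumes the incoming text really is in MM/DD/YYYY HH:MI:SS AM format):

```sql
-- Convert straight to TIMESTAMP in the control file, with no
-- intermediate TO_CHAR, so no NLS-dependent implicit conversion occurs:
CREATION_DATE "TO_TIMESTAMP(:CREATION_DATE, 'MM/DD/YYYY HH:MI:SS AM')"
```

The same idea applies to RECEIPT_DATE: use TO_DATE(:RECEIPT_DATE,'MM/DD/YYYY') alone, without wrapping it in TO_CHAR.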
Thanks,
Avinash -
How can I load data with scripts in FDM to an HFM target system?
Hi all!
I need help because I can't find a good guide about scripting in FDM. My problem is the following.
I plan to load my data with a data load file in FDM to an HFM target system, but I would also like to load additional data using an event script, i.e. after validate. I would need some way to access the HFM system through FDM scripts; is that possible?
If so, it would be wonderful to be able to get data from HFM for any point of view, reachable from FDM scripts, in order to load or read any data.
I've looked for a good guide about scripting in FDM, but I couldn't find any information about accessing data on the HFM target system. Does one really exist?
Thanks for your help.
Hi,
Take a look at the LOAD Action scripts of your adapter. This might give you an idea.
Theoretically it should be possible to load data in an additional load, but you need to be very careful: you don't want to corrupt any of the log and status information that is stored during the load process. The audit trail is an important feature in many implementations, and in this context it might not be a good idea to improve automation at the cost of risking the compliance of your system.
Regards,
Matt -
Loading data with data workshop
Hello,
We load lots of .csv files into Apex using the Data Workshop tool. Most .csv files are 60-75K rows of data. It used to take just minutes and was a quick and easy way to load data. However, recently it has started taking much, much longer (several hours vs. the normal 5 or 6 minutes) to load data this way into production.
Any ideas why it suddenly takes much longer to load data this way? Is there anything that can be tuned?
We are running Application Express 4.1.1.00.23 on RHEL 5 with an 11.2.0.2 database.
Thank you for any help you can provide.
Thank you,
Mark
Mehabub,
Thank you for responding. In a sense that is kind of what we are doing. We truncate the table and upload our .csv to this table and then process into our other tables.
The table does not have a particularly large number of rows (usually 60,000 - 70,000 rows), but does have 112 columns.
It all seems to work fine in our test environment, but there are not the number of users there as in production. We have been looking at the SGA parameters and seem to make headway, but then the customer tries to upload and it slows way way down in production.
Our customers really like this functionality we have given them with apex, but I am running out of ideas of what to look at.
Thank you,
Mark -
Hi everyone,
I am trying to load data from a DSO to an InfoCube. The problem is as follows. Normally a DTP processes packages in the request. In this case the DTP doesn't process any package, but it finishes with status green.
When I manage the target, I see that the request finished in red.
The symptom is that the DTP doesn't process any request, and I don't know why.
Status: the DSO has active data, and the InfoCube is empty.
I don't know what is happening, but the data has not been carried from the DSO to the target.
I hope you can help me,
Regards.
Jose
Answering your questions:
Are you seeing the request red in Manage and Green in Monitor Details ?
Yes, in Manage I see the requests in red, but in Monitor Details everything is green, although no requests show as processed in Monitor Details.
Are the requests green and active in the source DSO ?
OK, this DSO is direct input, and it's filled by an APD process, so the DSO doesn't have active requests; it just has the data in the active-data table. I think that doesn't matter, because I'm trying to do the same thing between two InfoCubes and it doesn't work either.
It looks like the problem is general for the whole warehouse.
Additionally, this DTP worked correctly last week; this week it began to fail.
Could it be a system problem?
What can I check?
Also check whether this is an authorization issue (SU53).
I checked SU53, but everything is OK.
In Monitor Header - Selections - You should see the requests which have been loaded from source.
I don't see anything in this field.
Thanks
Jose -
Want to load data with selection criteria
Hi Everyone,
I want to load data from X ODS to Y ODS; X ODS is a DataSource for Y ODS.
On Y ODS I don't have any data loaded, but on X ODS I have 10 requests with 200,000 records.
In X ODS I have one request with 2 records, and I want to load that 2-record request to Y ODS to check the data on Y ODS.
Can anyone help me solve this? I am new to BW and it's urgent, please.
Can you tell me the step-by-step navigation?
Hi,
Just select full upload; it will bring up the InfoPackage, and then in the Selection tab give the range value. If this is required only once, then this method (or a full load) is fine; otherwise you will have to write code to pick the records. If you frequently want to load data from one ODS to another, better go for an init and then do delta loads from then on. If you don't want to provide a selection in the InfoPackage, the other way is to load all the data from X to Y and do selective deletion on Y ODS.
Hope this helps.
PB -
How to load data with carriage returns through a DRM action script?
Hello,
We are using DRM to manage Essbase metadata. This metadata contains a field for the member formula.
Currently it's a string property in DRM, so we can't use carriage returns and our formulas are really hard to read.
But DRM supports other property data types, memo or formatted memo, where we can use carriage returns.
In the export file we have then changed the record delimiter to a character other than CRLF.
Our issue: we regularly use action scripts to load new metadata. How can we load data properties containing carriage returns using an action script? There is no option to change the record delimiter.
Thanks!
Hello Sandeep,
Here is what I want to do through an action script: load a formula that spans more than one line.
Here I write my formula using 4 lines, but the action script cannot load it, since one line = one record:
ChangeProp|Version_name|Hier_name|Node_name|Formula|@round(
qty*price
*m05*fy13 -
Load data with SQL*Loader: linking fields between the CSV file and the control file
Hi all,
In a SQL*Loader control file, how do you link a field in the CSV file with a column in the control file?
E.g. I want to import records into table TEST (col1, col2, col3) with data that sits in different positions in the CSV file. How do I do this?
CSV file (with variable positions):
test1;prova;pippo;Ferrari;
xx;yy;hello;by;
In the table TEST I want col1 = 'prova' (xx),
col2 = 'Ferrari' (yy),
col3 = default 'N';
the other fields in the CSV file are ignored.
so:
load data
infile 'TEST.CSV'
into table TEST
fields terminated by ';'
col1 ?????,
col2 ?????,
col3 CONSTANT "N"
Thanks,
Attilio
By the '?????' marks I mean: "How can I link col1 with the right column in the CSV file?"
Attilio -
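A sketch of one way to do this: control-file fields map to CSV fields by position, and a FILLER field is the standard way to read-and-skip a column (the skip1/skip2 names below are placeholders):

```sql
load data
infile 'TEST.CSV'
into table TEST
fields terminated by ';'
( skip1 FILLER,      -- 1st CSV field ('test1' / 'xx'): ignored
  col1,              -- 2nd CSV field ('prova' / 'yy') -> TEST.col1
  skip2 FILLER,      -- 3rd CSV field ('pippo' / 'hello'): ignored
  col2,              -- 4th CSV field ('Ferrari' / 'by') -> TEST.col2
  col3 CONSTANT 'N'  -- not read from the file at all
)
```

FILLER fields are parsed like normal fields but never loaded, which is exactly what the '?????' positions need.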
Hello Expert,
I loaded data from SAP R/3 into the PSA using delta mode and found that a new record which had just been created was not loaded into SAP BI. What could be the cause?
Step of loading
1. Initial without data into PSA
2. Full load with criteria Fiscal Period (e.g. 001.2009 - 016.2009) into PSA
3. Load data from PSA into DSO with update mode = Delta
4. Create a new transaction from SAP R/3
5. Load data from SAP R/3 into SAP BI-PSA with update mode = Delta
Expected Result: A new record should be loaded into PSA.
Actual Result: There was no record loaded.
After initial loading without data, is it necessary to do a full load for all fiscal periods and then load with delta mode?
I have never seen this kind of problem before.
Is anyone familiar with this kind of problem? How did you resolve it?
Any suggestion would be appreciated.
Thank you very much
W.J.
Hi,
Is your DataSource Logistics? If so, how has the job been scheduled in the LO cockpit (hourly/daily)?
Did you check the record in the delta queue (RSA7)?
After initial loading without data, is it necessary to do a full load for all fiscal periods and then load with delta mode?
I have never seen this kind of problem before.
If you have specific selections, you have to do it with a repair full load, and that doesn't impact delta loads. -
SQL*Loader: need to load data with "," in only one field
Hi,
I need to load data into one column; my data is in CSV format like this:
Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan
I want to upload this data into my table, which contains only one column, name.
What would be the SQL*Loader setup for uploading this data?
Thanks
Shahzaib ismail
Oracle Database Express Edition, Developer 6i
Since you mention you're using database version XE, I'll assume your database version is at least 10.2:
SQL> select * from v$version;
BANNER
Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
and so you have the power of:
- external tables
http://www.oracle-base.com/articles/9i/ExternalTables9i.php
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6611962171229
- regular expressions
http://nuijten.blogspot.com/2009/07/splitting-comma-delimited-string-regexp.html
and you don't want to be using SQL*Loader anymore, never ever.
I simply put your string 'Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan' in a file called test.csv and told Oracle that file is in my Oracle directory DATA_DIR (that actually points to: c:\data on my 'filesystem' ) and then:
SQL> create table t(name varchar2(155));
Table created.
SQL> -- instead of SQL*Loader use an External Table:
SQL> create table ext_t
2 ( textstring varchar2(4000)
3 )
4 organization external ( type oracle_loader
5 default directory DATA_DIR
6 access parameters (fields terminated by '' )
7 location ('test.csv')
8 );
Table created.
SQL> -- Now you can query your file as if it were a table!
SQL> select * from ext_t;
TEXTSTRING
Shahzaib ismail, Imran aziz, Shahmir mehmood, Shahzad khan
1 row selected.
SQL> -- and use the powers of SQL to do whatever you want (instead of kludging with those dreaded ctl files):
SQL> select regexp_substr (textstring, '[^,]+', 1, rownum) names
2 from ext_t
3 connect by level <= length(regexp_replace(textstring, '[^,]+'))+1;
NAMES
Shahzaib ismail
Imran aziz
Shahmir mehmood
Shahzad khan
4 rows selected.
SQL> -- Voilà, the data is loaded into the table in one single SQL statement:
SQL> insert into t
2 select trim(names)
3 from ( select regexp_substr (textstring, '[^,]+', 1, rownum) names
4 from ext_t
5 connect by level <= length(regexp_replace(textstring, '[^,]+'))+1
6 );
4 rows created.
SQL> --
SQL> select * from t;
NAME
Shahzaib ismail
Imran aziz
Shahmir mehmood
Shahzad khan
4 rows selected.
Don't use SQL*Loader, use an external table.
Unable to load data with impdp
Hi friends,
I've encountered the following errors while loading data through the impdp utility.
ORA-31626: job does not exist
ORA-31633: unable to create master table "TRACE.SYS_IMPORT_FULL_05"
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
ORA-06512: at "SYS.KUPV$FT", line 863
ORA-00955: insufficient privileges
I think the problem is the last line, "insufficient privileges". What is your opinion? Kindly tell me what privileges a user needs to import/export a dump file.
Looking forward to your help and suggestions.
Regards,
Abbasi
Does this dumpfile consist of only TRACE schema objects, or of other schemas' objects as well?
There is no need to grant DBA privileges to TRACE; you can import using the SYS/SYSTEM user:
impdp system/****@TNRDB directory=tnr_dump_dir dumpfile=tnrsms.dmp logfile=loading.log
Thanks
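Hedging on which error is the real one here (the pasted text attaches "insufficient privileges" to ORA-00955, whose actual message is "name is already used by an existing object"), two sketches covering both cases; the object and user names come from the post, but the USERS tablespace is an assumption:

```sql
-- Case 1: an orphaned master table from an earlier failed job blocks
-- the new job (ORA-00955). Drop it so impdp can recreate it:
DROP TABLE trace.SYS_IMPORT_FULL_05;

-- Case 2: the importing user genuinely lacks privileges. At minimum it
-- needs CREATE TABLE (for the master table) plus quota and directory access:
GRANT CREATE TABLE TO trace;
ALTER USER trace QUOTA UNLIMITED ON users;
GRANT READ, WRITE ON DIRECTORY tnr_dump_dir TO trace;
```

Importing as SYSTEM, as suggested above, sidesteps case 2 entirely.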
Requirement to load data with extension .inv.dhl from source to target as .csv
Hi Friends, I have got a new requirement to load data from an .inv.dhl file to a .csv target. I got a single file from our customer with the extension .inv.dhl, and I want to load the data available in that file to the target as .csv with all the information. Below are the input file and desired output files. Kindly help me with the implementation for this file. Regards, Rajendra
Hi Rajendra, Though the file extension says it is of type .inv.dhl, the actual contents of the file are XML, and valid XML too, as AnjiReddy said. I would suggest you import/create your source definition from the source file using the 'Source > Import from XML Definition' menu option, use the XML parser transformation, go ahead with the rest of your business logic within the mapping, and connect to the CSV file target. Please try and let us know if you need help at any point or run into issues, Rajani
-
Tutorials on loading data with SQL*Loader
Dear reader,
Please, I need tutorials on how to use SQL*Loader, as well as software for loading data into Oracle.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
Chapters 6-14 are filled with examples and tutorials -
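As a quick taste of those chapters, a minimal, illustrative example (the table, file, and credentials below are all made up):

```sql
-- employees.ctl -- minimal SQL*Loader control file
LOAD DATA
INFILE 'employees.csv'
INTO TABLE employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( emp_id,
  emp_name,
  hire_date DATE 'YYYY-MM-DD'
)
```

Then from the command line: sqlldr scott/tiger@db control=employees.ctl log=employees.log — the log file reports how many rows were loaded and rejected.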
Load data with a 7.0 DataSource from flat file to a write-optimized DSO
Hi all,
we have a problem loading data from a flat file using the 7.0 DataSource.
We have to load a flat file (monthly) into a write-optimized DSO. The InfoPackage loads the file in full mode into the DataSource (PSA), and the DTP loads data in delta mode from the DataSource into the WO DSO.
When I load the second file into the DataSource, the DTP loads all the data present in the DataSource, not only the new data as expected in delta mode.
Does anyone have any tips to help me?
Thank you for your help.
Regards
Emiliano
Hi,
I am facing a similar problem.
I am using a write-optimized DSO and I have only 1 request in the PSA (I have deleted all previous requests from the PSA and the DSO).
When I do a delta load from PSA to DSO, I expect to see only that 1 request get loaded into the DSO.
But it's picking up the data from 3 other requests and doubling the records...
Can you please help me: how did you manage to get out of that issue?
Cheers,
Nisha