Export data with Dreamweaver (TXT, CSV)
Hi to all of the Adobe community.
My question is a little difficult for my level of knowledge.
I'm using Dreamweaver with PHP/MySQL. In my database I have a MySQL table called mailing list, with the fields name and email.
I have a PHP page, developed with the Dynamic Table feature, that returns a list of registered names and e-mails. What I would like is to have, at the end of this table, a link or button called "Export data", so that clicking it exports this data in TXT or CSV format.
How do I export the data and define the format it is exported in?
I'm using Dreamweaver CS4.
Rodrigo Vieira da Silva Eufrasio
E-mail: rodrigo.mct @ gmail.com
Mobile: +55 11 8183-9484
Brazil - Osasco - SP
It is doable.
Assume for the moment that you are not paging the query results (that is, you are displaying ALL results at one time).
You would need to do the following when the button is clicked (with the action calling a separate file for the processing):
Open a file on the server for writing: $ofile = fopen("data.txt", "w");
Repeat your query, but instead of echo $row_Recordset... you would use fwrite($ofile, $row_Recordset...
Between fields you would need fwrite($ofile, "\t") to put in a tab, or fwrite($ofile, ",") for a CSV.
At the end of every line you would need a CR/LF: fwrite($ofile, "\r\n");
Close the file: fclose($ofile);
Then use the header function to force the download of this file, like this:
header('Content-Disposition: attachment; filename=data.txt');
header('Content-Type: text/plain');
readfile('data.txt');
The header calls have to be the first output sent to the user; any whitespace sent before them will cause them to fail. I haven't tried this part after a DB query, but it should work. If it doesn't, you will have to invoke the header routine in a separate file.
Hope this helps.
Barry
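Putting those steps together, here is a minimal sketch of what a standalone export file could look like. It is an assumption-laden sketch, not Dreamweaver-generated code: it uses mysqli rather than the older mysql_* functions CS4 produces, fputcsv instead of hand-written fwrite calls (so commas inside a name are quoted correctly), and streams straight to the browser so no temporary file is needed. The table name mailing_list, the column names, and the connection details are placeholders taken from the question.

```php
<?php
// csv_export.php - hypothetical export endpoint (sketch, names are assumptions).

// Turn an array of rows into CSV text. fputcsv inserts the delimiters
// and quotes any field that contains a comma, quote, or newline.
function rows_to_csv(array $rows, string $delimiter = ','): string
{
    $buf = fopen('php://temp', 'r+');
    fputcsv($buf, array('name', 'email'), $delimiter);   // header row
    foreach ($rows as $row) {
        fputcsv($buf, array($row['name'], $row['email']), $delimiter);
    }
    rewind($buf);
    $csv = stream_get_contents($buf);
    fclose($buf);
    return $csv;
}

// The endpoint itself: query MySQL, then force the download.
// Connection details are placeholders - reuse your Dreamweaver connection include.
function export_mailing_list(): void
{
    $db = new mysqli('localhost', 'user', 'password', 'mydb');
    $rows = $db->query('SELECT name, email FROM mailing_list')->fetch_all(MYSQLI_ASSOC);
    $db->close();

    // Headers must be the very first output - any stray whitespace before
    // them (even outside the PHP tags) breaks the forced download.
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename=mailing_list.csv');
    echo rows_to_csv($rows);
}
```

For a tab-delimited TXT file instead, pass "\t" as the delimiter argument and change the filename and Content-Type accordingly. The link at the end of the Dynamic Table can then simply be an anchor pointing at this file, e.g. <a href="csv_export.php">Export data</a>.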
Similar Messages
-
Importing and Exporting Data with a Clob datatype with HTML DB
I would like to know what to do, and what to be aware of, when importing and exporting data with a CLOB datatype with HTML DB.
Colin - what kind of import/export operation would that be, which pages are you referring to?
Scott -
How to export data with column headers in SQL Server 2008 with the bcp command?
Hi all,
I want to know how to export data with column headers in SQL Server 2008 with the bcp command. I know how to import data with the Import and Export Wizard, but when I try to export data with the bcp command, the data is copied yet the column names do not come across.
I am using the below query:-
EXEC master..xp_cmdshell
'BCP "SELECT * FROM [tempdb].[dbo].[VBAS_ErrorLog] " QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
Thanks,
SAAD.

Hi All,
I have done as per your suggestion, but I face the problem below: the Print statement gives the correct query, but EXEC master..xp_cmdshell @BCPCMD displays an error message like the one below.
DECLARE @BCPCMD nvarchar(4000)
DECLARE @BCPCMD1 nvarchar(4000)
DECLARE @BCPCMD2 nvarchar(4000)
DECLARE @SQLEXPRESS varchar(50)
DECLARE @filepath nvarchar(150), @SQLServer varchar(50)
SET @filepath = N'"D:\Temp\LDH_SQLErrorlog_' + CAST(YEAR(GETDATE()) as varchar(4))
  + RIGHT('00' + CAST(MONTH(GETDATE()) as varchar(2)), 2)
  + RIGHT('00' + CAST(DAY(GETDATE()) as varchar(2)), 2) + '.log" '
SET @SQLServer = (SELECT @@SERVERNAME)
SELECT @BCPCMD1 = '''BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
SELECT @BCPCMD2 = '-c -t , -T -S ' + @SQLServer + ''''
SET @BCPCMD = @BCPCMD1 + @filepath + @BCPCMD2
Print @BCPCMD
-- Print output below:
'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'
EXEC master..xp_cmdshell @BCPCMD
-- xp_cmdshell output:
''BCP' is not recognized as an internal or external command,
operable program or batch file.
NULL
If I copy the print output as below and execute it from CMD, it works fine. Could you please suggest what the problem is in the above query?
EXEC
master..xp_cmdshell
'BCP "SELECT * FROM
[tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername '
Thanks, SAAD. -
SQL Developer 3.2 - Exporting data with TIMESTAMP datatype
Hi,
We have users that are attempting to export data with the Timestamp format to Excel (xls format) using SQL Developer 3.2. When attempting to sort the Timestamp in either asc or desc order, Excel is having issues sorting correctly. I suggested that the user just do all the sorting within their SQL Developer session but they require the ability to slice and dice in Excel.
This is definitely not an issue with Excel as the users have previously exported Timestamp data from Toad and been able to sort without issue. Any thoughts as to what might resolve this issue?
Thanks.

We're not formatting timestamps in Oracle as numbers/dates in Excel. They'll need to properly format the Excel column/cells to get them to sort the way they want, versus being treated as simple strings.
-
Best way to export data with r.t. prompts and have dense dim mbrs on rows?
Hi All-
What is the best way to export data with run-time prompts out of Essbase?
One thought was to use Business Rules with run-time variables and the DATAEXPORT command, but I came across at least one limitation: I cannot have months (part of the dense Time Periods dimension) on rows.
I have only two dense dimensions, Accounts and Time Periods, and I need both of them on rows. This would come in handy when users enter a start and end month and year for the data to be exported, e.g. if the start period is Feb 2010 and the end is Jan 2011, I get data for all months in 2010 and 2011. Currently I get:
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000",14202.24,14341.62,14560,13557.54,11711.92,10261.58,12540.31,15307.83,16232.88,17054.62,18121.76,18236
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000",19241,21372.84,21008.4,18952.75,23442.13,19938.18,22689.61,23729.29,22807.48,23365,23915.3,24253
"CORP1","0173","FY11","Working","Budget","Local","HSP_InputValue","404000",21364,22970.37,23186,27302,25144.38,27847.91,27632.11,29007.39,24749.42,27183.39,26599,27112.79
where ideally I would need to get the following:
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Feb",14341.62
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Mar",14560
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Apr",13557.54
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","May",11711.92
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Jun",10261.58
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Jul",12540.31
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Aug",15307.83
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Sep",16232.88
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Oct",17054.62
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Nov",18121.76
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","402000","Dec",18236
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Feb",21372.84
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Mar",21008.4
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Apr",18952.75
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","May",23442.13
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Jun",19938.18
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Jul",22689.61
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Aug",23729.29
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Sep",22807.48
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Oct",23365
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Nov",23915.3
"CORP1","0173","FY10","Working","Budget","Local","HSP_InputValue","403000","Dec",24253
"CORP1","0173","FY11","Working","Budget","Local","HSP_InputValue","404000","Jan",21364
Thank you in advance for any tips.

Have a read of the following post: export data to sql
It may give you a further option.
Cheers
John
http://john-goodwin.blogspot.com/ -
Best practice to export TV Guide to TXT / CSV files
Dear All,
I have a working TV Guide export to XML using Oracle 10g. Now I have a request to allow exporting it to a TXT file delimited by commas or tabs.
I thought using External Tables would do the job, the same way I import the data, but it seems I am wrong: they accept only the ORACLE_DATAPUMP type,
so the file can't be viewed in a simple notepad / text editor.
What shall I do? Is there any documents I can read about?
Many thanks in advance!

3360 wrote:
BluShadow wrote:
3360 wrote:
> From 10g onwards you can write to external tables.
I'd be glad to be proved wrong. I don't have 10g handy at the moment, but as I recall you could only write to external tables in data pump format, which I think is some sort of binary-wrapped XML format similar to export format, not plain text.
Yes, my misunderstanding of the OP's post. (see above)
> The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format, which can be read only by Data Pump Import.
That's not strictly true. You can open them up in notepad/wordpad and clearly see the data in XML format. OK, it's got some binary rubbish around it, but the data is clear.
> I think the quote from the manuals refers to systems that can read the file, rather than visually reading it on a screen. What is it with XML that people think 1) they can read it and therefore 2) computer programs must be able to read it too?
I never said that. ;)
The issue was about being able to open it in notepad, not about a computer program being able to read it.
> Spooling generally seems the best option for writing plain text out of the database.
Ewwww! Not my best option. I prefer to keep things in the database rather than rely on external utilities like SQL*Plus. UTL_FILE generally does the job nicely... otherwise write it all into a clob and use one of the various methods of writing out clobs, but UTL_FILE is the simplest.
> Unless the stored procedure runs as a database job or schedule, something with an IO interface will be calling it. Compared to spooling, UTL_FILE requires more code, is slower since it needs to loop and write a row at a time, can only write to the db server, and needs directory objects, so it is both more complicated and limited.
Spooling also has a nasty habit of adding an extra blank line on the end of the file. ;)
Hello.
I need to export BPC transactional data by means of the standard export chain.
How can I customize the filter used for the export?
My steps:
1. Run the Export package with Data Manager.
2. Choose a file name.
3. Choose the dimension. I save a filter by one of the attributes and manage the filter name (button "Manage Filter" -
MYFILTER - Save). Refresh.
Button "Copy all Members". In the window with the dimension name I see the text [FILTER: MYFILTER].
4. I execute the export package, but the CSV file contains only a string with the names of the dimensions, no transactional data.
How can I solve this problem?
P.S. SAP BPC 7.0 for Netweaver, Support Packages CPMBPC level - 04.
Thanks!
Best Regards,
Svetlana

Hi Andrea,
I don't think you have to support many versions of ABAP code - one for each variable.
If your OLAP variable is of type Customer Exit, you can call the SAP function RRS_VAR_VALUES_EXIT_FILL, which has the variable value as a parameter. You can save that value as a variant of the ABAP process type. This way you'll have just one program to support, but as many variants as you need variables.
Hope this helps,
Gersh -
How to export data with months on columns?
Folks,
I need to export data from BPC NW in this format:
ENTITY,ACCOUNT,JAN,FEB,MAR,APR,MAY,JUN,JUL,AUG,SEP,OCT,NOV,DEC
Does anybody have a sample transformation file or guidelines on how to achieve this?
I tried *MVAL without luck.
Thanks
Paulo

Hi Paulo,
*MVAL could have been helpful if you were importing and converting a key-figure model to an account-based model. However, you are trying to do the complete opposite: you are exporting data and converting an account-based model to a key-figure model. I would suggest you make use of BADI start and end routines for this.
Hope this helps. -
Hi,
I'm facing a problem while exporting data to a flat file. The export itself works correctly, but I need to export the data as NCHAR instead of CHAR, otherwise special characters aren't shown properly.
I know that Oracle has the ability to use UTL_FILE.FOPEN_NCHAR for such tasks, but I can't find any such option within OWB.
Does anyone have an idea how to force OWB to use Unicode?
P.S. I tried to change the character set on the FF operator without success.
thx

Hi
You could use a table function as a target as in this post....
http://blogs.oracle.com/warehousebuilder/entry/parallel_unload_to_file
Cheers
David -
Export data with defined field and row terminators, as with bcp
hi to everyone
I am new to Oracle and I need to test a database.
I need to export some data to a TXT file using @@@@@ as the field terminator and ##### as the row terminator.
In MSSQL and Sybase there is a bcp command: from the client, via a cmd command, you can extract the dataset the way you want.
Is there something like this in Oracle?
With JDeveloper you can export, but you cannot define all the rules.
Thanks a lot

Basically Oracle exports are binary, but you could query the table and spool the output to a text file with SQL*Plus:
sqlplus user/pw
spool c:\outputfile.txt
select col1 || '@@@@@' || col2 || '#####'
from table;
spool off
exit
Statement terminator problems when exporting data with SQL Developer 3.2
I've run across what appears to be a bug in SQL Developer 3.2.20.09. Can someone let me know whether this is indeed a bug or just something I need to configure differently?
The problem is related to exporting a database as a SQL script, and to terminator characters at the end of "create" statements, especially when columns have comments, and the problems that occur when using SQL*Plus to run the export script to create the tables. With the old SQL Developer 1.5.4, with the "Terminator" and "Pretty Print" options checked, statements like the following are generated:
-- DDL for Type NUM_ARRAY
CREATE OR REPLACE TYPE "NUM_ARRAY"
IS TABLE OF NUMBER(20);
/
-- DDL for Sequence MYTABLE_SEQ
CREATE SEQUENCE "MYTABLE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999999999 INCREMENT BY 1 START WITH 1 CACHE 20 NOORDER NOCYCLE ;
-- DDL for Table MYTABLE
CREATE TABLE "MYTABLE"
( "MYTABLE_ID" NUMBER,
"COL2" NUMBER,
"COL3" NUMBER
) ;
-- DDL for Table ANOTHERTABLE
CREATE TABLE "ANOTHERTABLE"
( "ANOTHERTABLE_ID" NUMBER,
"COL2" VARCHAR2(1024),
"COL3" VARCHAR2(1024)
) ;
COMMENT ON COLUMN "ANOTHERTABLE"."ANOTHERTABLE_ID" IS 'This is a comment.';
When I then run the script using SQLPlus, everything works fine. However, with SQL Developer 3.2.20.09, with the same options enabled, the same statements are generated like this:
-- DDL for Type NUM_ARRAY
CREATE OR REPLACE TYPE "NUM_ARRAY"
IS TABLE OF NUMBER(20);
/
-- DDL for Sequence MYTABLE_SEQ
CREATE SEQUENCE "MYTABLE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999999999 INCREMENT BY 1 START WITH 1 CACHE 20 NOORDER NOCYCLE ;
/
-- DDL for Table MYTABLE
CREATE TABLE "MYTABLE"
( "MYTABLE_ID" NUMBER,
"COL2" NUMBER,
"COL3" NUMBER
) ;
/
-- DDL for Table ANOTHERTABLE
CREATE TABLE "ANOTHERTABLE"
( "ANOTHERTABLE_ID" NUMBER,
"COL2" VARCHAR2(1024),
"COL3" VARCHAR2(1024)
) ;
/
COMMENT ON COLUMN "ANOTHERTABLE"."ANOTHERTABLE_ID" IS 'This is a comment.';
/
Notice all of the extra slashes in there. If a slash is not used as a statement terminator, SQLPlus treats a slash as a command to repeat the last SQL statement, which causes many errors about tables or sequences already existing. So I tried removing the "Terminator" flag from the export options. This led to statements that looked a bit more promising:
-- DDL for Type NUM_ARRAY
CREATE OR REPLACE TYPE "NUM_ARRAY"
IS TABLE OF NUMBER(20)
-- DDL for Sequence MYTABLE_SEQ
CREATE SEQUENCE "MYTABLE_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999999999 INCREMENT BY 1 START WITH 1 CACHE 20 NOORDER NOCYCLE
-- DDL for Table MYTABLE
CREATE TABLE "MYTABLE"
( "MYTABLE_ID" NUMBER,
"COL2" NUMBER,
"COL3" NUMBER
)
-- DDL for Table ANOTHERTABLE
CREATE TABLE "ANOTHERTABLE"
( "ANOTHERTABLE_ID" NUMBER,
"COL2" VARCHAR2(1024),
"COL3" VARCHAR2(1024)
)
COMMENT ON COLUMN "ANOTHERTABLE"."ANOTHERTABLE_ID" IS 'This is a comment.'
The big problem, though, is in the statement for the table with a comment. Notice that there are two statements, but there is not a semicolon after either of them. This unfortunately causes the "COMMENT" statement to be appended to the "CREATE TABLE" statement before being executed, which causes the table to not be created (which causes even more errors later on when the script attempts to populate the table with data).
So, it would appear that this is a bug, but I'm not sure. Is there a way I can configure the export options to make SQL Developer export these statements like it used to in older versions?
Thanks,
-Bill

>
So, it would appear that this is a bug, but I'm not sure.
>
That would be a bug. Thanks for reporting it and providing the detailed example.
>
Is there a way I can configure the export options to make SQL Developer export these statements like it used to in older versions?
>
No.
Leave the thread open. One of the developers for sql developer should be monitoring the forum and can provide more information. -
Export data with where condition
Hello,
I am doing an export using exp utility in oracle.
exp fas/fas@fdbl file=aud log=aud.log parfile=exp
Contents of parfile - exp
compress=n
indexes=n
constraints=n
grants=n
triggers=n
statistics=none
consistent=y
query=\"where org_grp_i=33 \"
I am getting an error when I include query=\"where org_grp_i=33 \". Without this condition the export runs fine. How do I give this where condition in the parfile, since I want to export a lot of tables with this same condition? org_grp_i is common to all the tables.
Thanks

1. Create a directory:
sql> conn / as sysdba
sql>create directory "data" as 'c:\';
2. Grant read and write privileges to the exp/imp users:
sql>grant read , write on directory data to scott;
sql>grant read , write on directory data to tester;
3. Connect as the exp or imp user:
sql>conn scott/tiger
SQL> host expdp scott/tiger directory=data dumpfile=eg.dmp tables=avgsal
Export: Release 10.1.0.2.0 - Production on Wednesday, 25 October, 2006 19:14
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
tion
With the Partitioning, OLAP and Data Mining options
Starting "SCOTT"."SYS_EXPORT_TABLE_01": scott/******** directory=data dumpfile=
eg.dmp tables=avgsal
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TBL_TABLE_DATA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCOTT"."AVGSAL" 5.312 KB 5 rows
Master table "SCOTT"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
Dump file set for SCOTT.SYS_EXPORT_TABLE_01 is:
C:\EG.DMP
Job "SCOTT"."SYS_EXPORT_TABLE_01" successfully completed at 19:15
SQL> drop table avgsal purge;
Table dropped.
SQL> host impdp tester/tester directory=data dumpfile=eg.dmp
Import: Release 10.1.0.2.0 - Production on Wednesday, 25 October, 2006 19:16
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Produc
tion
With the Partitioning, OLAP and Data Mining options
Master table "TESTER"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "TESTER"."SYS_IMPORT_FULL_01": tester/******** directory=data dumpfile
=eg.dmp
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TBL_TABLE_DATA/TABLE/TABLE_DATA
. . imported "SCOTT"."AVGSAL" 5.312 KB 5 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "TESTER"."SYS_IMPORT_FULL_01" successfully completed at 19:16
SQL>
note : cmd>impdp help=y
cmd> expdp help=y
see doc. for more info
http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14215/toc.htm
Message was edited by:
user52 -
How to export date with metadata
How do I include date when exporting metadata (not with the image).
I can't get Aperture (3.1.2) to export any kind of date. It shows the date field, but the value is blank. Using Automator I can extract and export the EXIF date, which includes the time. I'd like to be able to export the date without the time code.
Thanks for any help.

In the Metadata tab of the Inspector, choose IPTC Core. Then, in the Image brick, click in the Date Created field to set it. See if the date now exports with your metadata.
Let us know if that works. Thanks -
Export data with list alias from alternate alias table
Hello,
I am working on a data export out of my BSO app using @JExport.
The problem is I need the default alias, alternate alias, data listed for all entities.
I can achieve the default alias by listing @ALIAS(@CURRMBR("Entity"))
How can I list the alternate alias?
Any ideas?
Thanks,
Nima

That's what I thought... just wanted to reconfirm. I guess I will have to go with the MAXL option and run the calc script twice: first with the default table and second with the alternate alias table. I will have two separate files but can merge them later on...
Thanks for your reply.
Nima -
How to export data with special characters from SQL Developer?
Hi.
I'm exporting data from a table, but this table has special characters, specifically accents (á,é,í,ó,ú). My source table has, for example, "QRCN Querétaro, Candiles", but when I do an export via Tools --> Export DDL (and Data) in SQL Developer, it generates the following script:
Insert into tablexxx(CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20280','QRCN Quer?ro, Candiles');
What can I do to export my data and generate the correct script?
Insert into tablexxx(CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20280','QRCN Querétaro, Candiles');
thanks.

Hi sybrand_b,
1. In SQL Developer I select Tools --> Export DDL (and Data).
2. I select the file name, the connection (this is a remote DB), the objects to export (in this case 'Tables and data') and the table name to export.
3. I run the procedure and it generates the following script:
-- File created - jueves-julio-01-2010
-- DDL for Table TABLEXXX
CREATE TABLE "BOLINF"."TABLEXXX"
( "CADENA" VARCHAR2(50 BYTE),
"NUMERO_FARMACIA" VARCHAR2(50 BYTE),
"SUCURSAL_REFERENCIA" VARCHAR2(200 BYTE)
-- DATA FOR TABLE TABLEXXX
-- FILTER = none used
REM INSERTING into TABLEXXX
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20280','QRCN Quer?ro, Candiles');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20281','QRCG Quer?ro, Corregidora');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20282','QRFU');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20283','QRFU');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20284','SAUN San Lu?P, Universidad');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20285','SAEV San Lu?P, Eje Vial');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20286','SALB San Lu?P, Los Bravo');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20287','SAAL San Lu?P, Alvaro Obreg?');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20288','SACA San Lu? Callej?n de Cod');
4. But my source table has the following data:
Select * from TABLEXXX;
CADENA NUMERO_FARMACIA SUCURSAL_REFERENCIA
C002 20280 QRCN Querétaro, Candiles
C002 20281 QRCG Querétaro, Corregidora
C002 20282 QRFU
C002 20283 QRFU
C002 20284 SAUN San Luís P, Universidad
C002 20285 SAEV San Luís P, Eje Vial
C002 20286 SALB San Luís P, Los Bravo
C002 20287 SAAL San Luís P, Alvaro Obregó
C002 20288 SACA San Luís, Callejón de Cod
5. I have done a query to table nls_database_parameters.
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZH:TZM
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZH:TZM
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_RDBMS_VERSION 10.2.0.4.0
6. I have checked in Regedit --> HKEY_LOCAL_MACHINE --> SOFTWARE --> ORACLE --> ORACLE HOME, and the value of NLS_LANG is AMERICAN_AMERICA.UTF8.
What should I change to export my data correctly? Or is there another way to export my data?
thanks a lot.
regards