CF 8 exporting or importing data sources
Instead of manually configuring 40 data sources in the
administrator, is there a way to import the data sources from
another server that already has the data sources configured? We've
done a new install of CF 8 on a new server box, and we have a
previous version of CF on another server. On that box there are 40
data sources already set up and running. Is there a way to move all
these data sources at once? Or do we have to set them up one-by-one
on the new box? Thank you for your suggestions.
sbudlong wrote:
> Is the answer to use Packaging & Deployment in CF 8
Administrator? Let's assume
> the old version is MX or 7 (I don't know which version,
offhand, right now).
> Can we create an archive file that we can bring into CF
8 and recreate all the
> data sources from that older version of CF?
>
That is what the House of Fusion link suggested, and apparently it
worked for the poster of that thread. I have never done that
cross-version, but I don't see a reason why it would not work.
You do need an MX version of ColdFusion, as the Packaging & Deployment
method (a.k.a. CAR, a.k.a. ColdFusion Archive) was introduced with MX,
version 6.0.
If you are importing from an older version such as 5 or 4.5, then you
are going to have more work to do. I am not sure if this is easier than
just hand-entering the 40 DSNs, but if you could 'upgrade' one of these
pre-MX versions to the current version, it would take care of
translating the older DSNs to the new neo-???.xml file(s). You could
then copy these files to the new server.
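If you do go the file-copy route, the mechanics are just a filtered copy. A rough Python sketch of the idea -- the lib paths and the example file name here are assumptions; check your own install for the actual neo-*.xml files:

```python
import shutil
import tempfile
from pathlib import Path

def copy_neo_files(old_lib: Path, new_lib: Path) -> list:
    """Copy every neo-*.xml settings file from the old lib dir to the new one."""
    copied = []
    for cfg in sorted(old_lib.glob("neo-*.xml")):
        shutil.copy2(cfg, new_lib / cfg.name)  # copy2 keeps timestamps
        copied.append(cfg.name)
    return copied

# Demo with temporary directories standing in for the two CF lib folders;
# "neo-datasource.xml" is a made-up example name.
base = Path(tempfile.mkdtemp())
old_lib, new_lib = base / "old_lib", base / "new_lib"
old_lib.mkdir(); new_lib.mkdir()
(old_lib / "neo-datasource.xml").write_text("<settings/>")
copied = copy_neo_files(old_lib, new_lib)
```

Back up the target server's existing files before overwriting anything.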
Similar Messages
-
Best way to export and import data from an iPhone app?
How do people here prefer to export and import data from an iPhone app?
For example, suppose your app creates or modifies a plist. It might be nice to provide that plist to the user for separate backup and later allow them to re-import it.
How do people do that? Email the plist to an address? If so, how would one later get it back in?
Is there a common way people do this kind of thing?
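One common shape for this is a plist round-trip: serialize the app's data to a plist the user can back up, and parse it back on import. Python's plistlib is used here purely as a stand-in to show the idea; the key names are made up:

```python
import plistlib

# Hypothetical app state.
state = {"highScore": 42, "levels": ["intro", "cave"], "soundOn": True}

# Export: serialize to plist bytes the user could save off-device.
exported = plistlib.dumps(state)

# Import: parse the bytes back into the same structure.
restored = plistlib.loads(exported)
```

The same round-trip works however the file travels -- email attachment, iTunes file sharing, or a sync folder.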
Thanks,
doug

Or maybe the best approach for this is some sort of file synchronization feature? Is there a way of having files from the app's filesystem automatically get stored on the computer's filesystem when synchronizing, and vice versa?
doug -
Language Problem while exporting and importing data
hi,
I have Oracle version 8.1.7.0.0 installed on one server and 9.2.0.1.0 installed on new server.
I'm copying and pasting my version info from SQL*Plus:
SQL*Plus: Release 8.1.7.0.0 - Production on Mon Aug 22 10:46:31 2005
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Connected to:
Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
With the Partitioning option
JServer Release 8.1.7.0.0 - Production
SQL>
SQL*Plus: Release 9.2.0.1.0 - Production on Mon Aug 22 12:30:06 2005
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
SQL>
I created new user on my new server from enterprise manager.
Exported user from the old server and imported in the new server.
i.e: from Oracle8i Enterprise Edition Release 8.1.7.0.0, I did
c:\>exp system/manager file=abc.dmp owner=abc
Then on the new server Release 9.2.0.1.0, I did
c:\>imp system/manager file=abc.dmp fromuser=abc touser=abc
I'm using Arabic Language on my both servers. NLS_LANG parameter on both the servers is AMERICAN_AMERICA.WE8MSWIN1252.
On both servers I'm able to insert and select data in Arabic.
However, after I export the data from the old server to the new server, the Arabic data comes up as question marks.
If I create a new table and insert Arabic data in the new server's user abc, it displays well. Only the data which I exported and imported is not showing Arabic.
On both old and new servers operating system is Windows XP.
I'm stuck with this problem. Anybody having any idea about how to solve this problem please help.
Thank you all in advance.
Regards

Let me be clear here. Storing Arabic data in a WE8MSWIN1252 database is not supported by Oracle and will lead to problems. You are incorrectly using the NLS_LANG to prevent proper conversion, and your data appears to be okay when you use utilities like SQL*Plus to view it. When you write applications that don't rely on the NLS_LANG, like the JDBC thin driver for instance, you will realize your data is in fact invalid. To learn more about the NLS_LANG you can take a look at this FAQ: http://www.oracle.com/technology/tech/globalization/htdocs/nls_lang%20faq.htm
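The corruption mechanism can be reproduced outside the database. A rough Python sketch, with cp1256 standing in for an Arabic client code page and cp1252 for the WE8MSWIN1252 database character set:

```python
arabic = "\u0645\u0631\u062d\u0628\u0627"  # "marhaba" (hello) in Arabic letters

# Bytes as a Windows Arabic code page would store them.
raw = arabic.encode("cp1256")

# Reading the same bytes as cp1252 (the WE8MSWIN1252 family) yields
# unrelated Latin characters -- the text is no longer Arabic.
misread = raw.decode("cp1252", errors="replace")
```

This is why the data only "looks" fine while every layer misinterprets it the same way; any real conversion, such as during export/import, exposes the invalid bytes.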
To migrate your database to a proper character set you can refer to this paper:
http://www.oracle.com/technology/tech/globalization/pdf/mwp.pdf
But please do not ask for help in supporting your current configuration in this forum. -
SM37 - "pause" V3-update job while importing data source and extract struct
Hi
Instead of deleting an hourly scheduled V3-update job and then recreating it (after an import of an extract structure and a data source), I came to think about whether it is possible to, kind of, pause the job that I would otherwise delete and recreate.
Is this possible some how?
My thought:
1. Find the job in released status.
2. Select the job and change it (top menu Job-->change)
3. Change the scheduling to some day in the future
4. Change job status from released to scheduled
5. Import the extract structure and data source.
6. Change the schedule job back to hourly run.
7. Now the job should run on hourly basis again....
By the way: what do I import first, the data source or the extract structure?
Thanks in advance and kind regards,
Torben

We did delete and recreate...
-
How to export and import the source code
Hi,
I want to ensure the dev database and the prod database are the same.
I don't mind about the data.
I want all the objects (packages, procedures, functions) to be exported and imported.
Please help me out with this.
Regards,
Ravi

I wrote a long message but lost everything while trying to post :(
You haven't mentioned tables. If there are changes in the table structures, you need to take care of those as well; in the new database, if you don't change the tables, all the dependent procedures, functions and packages would go invalid.
Amardeep Sidhu
http://www.amardeepsidhu.com -
Guide me for exporting and importing data from oracle 10g to oracle 9i
I have been trying the export and import options in Oracle but I'm not very clear about the procedure.
Also, when I am logging in as scott/tiger I am facing an error:
ORA-01542 - tablespace 'USERS' is offline and cannot allocate space in it. How do I resolve it?

Hello,
I see two questions.
> ORA-01542 - tablespace 'USERS' is offline and cannot allocate space in it. How do I resolve it?
About the error ORA-01542, you have to bring the tablespace USERS online:
http://www.error-code.org.uk/view.asp?e=ORACLE-ORA-01542
If you cannot bring it online you may have to recover it first.
> I have been trying the export and import options in Oracle but I'm not very clear about the procedure.
About the Export in 10g and Import in 9i (9.2 or 9.0.1?), you have to export with the Export utility of the target database (the lowest database version here: 9.0.1 or 9.2); then you have to import into the target database using its Import utility (in 9.0.1 or 9.2).
The following Note of MOS may guide you:
Compatibility Matrix for Export And Import Between Different Oracle Versions [Video] [ID 132904.1]
NB: You cannot use Data Pump here.
Hope this helps.
Best regards,
Jean-Valentin -
Export and Import data or Matadata only
Aloha,
I need to update an instance, data only, with a full dump. What parameter should I use on import? How can I drop the schema without touching the metadata and database objects?
Thanks in advance.
Hades

TheHades0210 wrote:
Hi,
Yes, I will import from a full dump export, but it is only the data that I need to import; the existing db_objects and metadata on the instance should not be touched.
Regards,
Hades

Easy as pie:
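For the "data only, don't touch the objects" requirement, the relevant impdp switches are CONTENT=DATA_ONLY together with a TABLE_EXISTS_ACTION such as TRUNCATE. What that combination does can be sketched outside Oracle; this Python/sqlite3 analogy keeps the table definition and replaces only the rows:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT)")  # existing object
con.execute("INSERT INTO emp VALUES (1, 'stale')")                   # old data

# "Import, data only": truncate the rows, keep the object definition untouched.
fresh_rows = [(1, "alice"), (2, "bob")]           # stands in for the dump file
con.execute("DELETE FROM emp")                    # TABLE_EXISTS_ACTION=TRUNCATE
con.executemany("INSERT INTO emp VALUES (?, ?)", fresh_rows)  # CONTENT=DATA_ONLY

names = [r[0] for r in con.execute("SELECT name FROM emp ORDER BY id")]
```

The full parameter list from the impdp help output follows.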
[oracle@localhost ~]$ impdp help=yes
Import: Release 11.2.0.2.0 - Production on Mon Feb 4 19:57:10 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:
Format: impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
USERID must be the first parameter on the command line.
The available keywords and their descriptions follow. Default values are listed within square brackets.
ATTACH
Attach to an existing job.
For example, ATTACH=job_name.
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC.
Valid keyword values are: [Y] and N.
CONTENT
Specifies data to load.
Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
DATA_OPTIONS
Data layer option flags.
Valid keywords are: SKIP_CONSTRAINT_ERRORS.
DIRECTORY
Directory object to be used for dump, log and SQL files.
DUMPFILE
List of dump files to import from [expdat.dmp].
For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD
Password key for accessing encrypted data within a dump file.
Not valid for network import jobs.
ESTIMATE
Calculate job estimates.
Valid keywords are: [BLOCKS] and STATISTICS.
EXCLUDE
Exclude specific object types.
For example, EXCLUDE=SCHEMA:"='HR'".
FLASHBACK_SCN
SCN used to reset session snapshot.
FLASHBACK_TIME
Time used to find the closest corresponding SCN value.
FULL
Import everything from source [Y].
HELP
Display help messages [N].
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
JOB_NAME
Name of import job to create.
LOGFILE
Log file name [import.log].
NETWORK_LINK
Name of remote database link to the source system.
NOLOGFILE
Do not write log file [N].
PARALLEL
Change the number of active workers for current job.
PARFILE
Specify parameter file.
PARTITION_OPTIONS
Specify how partitions should be transformed.
Valid keywords are: DEPARTITION, MERGE and [NONE].
QUERY
Predicate clause used to import a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE
Redefine data file references in all DDL statements.
REMAP_SCHEMA
Objects from one schema are loaded into another schema.
REMAP_TABLE
Table names are remapped to another table.
For example, REMAP_TABLE=HR.EMPLOYEES:EMPS.
REMAP_TABLESPACE
Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES
Tablespace will be initialized if it already exists [N].
SCHEMAS
List of schemas to import.
SERVICE_NAME
Name of an active Service and associated resource group to constrain Oracle RAC resources.
SKIP_UNUSABLE_INDEXES
Skip indexes that were set to the Index Unusable state.
SOURCE_EDITION
Edition to be used for extracting metadata.
SQLFILE
Write all the SQL DDL to a specified file.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STREAMS_CONFIGURATION
Enable the loading of Streams metadata
TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
TABLES
Identifies a list of tables to import.
For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
TABLESPACES
Identifies a list of tablespaces to import.
TARGET_EDITION
Edition to be used for loading metadata.
TRANSFORM
Metadata transform to apply to applicable objects.
Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
TRANSPORTABLE
Options for choosing transportable data movement.
Valid keywords are: ALWAYS and [NEVER].
Only valid in NETWORK_LINK mode import operations.
TRANSPORT_DATAFILES
List of data files to be imported by transportable mode.
TRANSPORT_FULL_CHECK
Verify storage segments of all tables [N].
TRANSPORT_TABLESPACES
List of tablespaces from which metadata will be loaded.
Only valid in NETWORK_LINK mode import operations.
VERSION
Version of objects to import.
Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
Only valid for NETWORK_LINK and SQLFILE.
The following commands are valid while in interactive mode.
Note: abbreviations are allowed.
CONTINUE_CLIENT
Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT
Quit client session and leave job running.
HELP
Summarize interactive commands.
KILL_JOB
Detach and delete job.
PARALLEL
Change the number of active workers for current job.
START_JOB
Start or resume current job.
Valid keywords are: SKIP_CURRENT.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STOP_JOB
Orderly shutdown of job execution and exits the client.
Valid keywords are: IMMEDIATE. -
Exporting and Importing Data of Application Log
Hello,
We do migrate an application from SAP System A to SAP System B.
Apart from the application logic, we need to migrate some data, too.
For this we write an export report in System A that writes the data of some tables to files on the app server.
Another report in System B takes the files and fills the respective tables.
So far so good:
One table (which reports certain actions) contains the handle to the application log, which was created for this action.
I do want to extract this application log and later import this log in the other system, but do not know how this can be done in the best way.
Do you have any suggestion ?
Best regards
Michael

In my experience, it's an unrealistic expectation.
SAP doesn't provide a straight path to migrate logs the way you plan to do. It's kept in a combination of tables and index files on the system. So even if you were to successfully extract it, when you load the data into system B you will lose the date and time stamp. When you create a log entry, it picks up system date and time. That's what an auditor would look for.
An alternate solution would be to read the application log and store it in a new custom table in system B.
Any reporting should be done off that as of a specific date. -
Exporting and importing data from memory
Hi Gurus,
I have written 2 programs.
The 1st program executes and submits the 2nd program in background.
Now the output internal table, which we get in the 2nd program, has to be displayed in the 1st program.
My thought is to use SET ID and GET ID to retrieve the data.
How can we achieve this?

Hi,
You can pass the internal table by using
EXPORT ... TO MEMORY ID
and IMPORT that internal table in the called program.
Use SUBMIT ... AND RETURN.
Export and Import data in APEX
I would like to know how to export the data from a table (6000 rows) and then import the data into another table, so I need to know how to do that process. It's like taking a snapshot of the data in the old table and putting it into the new table, which has new data, so I can't just copy and paste.
Thanks for your help... regards

Dates do not have formats in the database (they are NOT saved as strings in the DB), but when you display them as strings or export them, they have a format (because at that point it's just a bunch of characters).
The problem that you are facing is that the date strings in the CSV file could not be converted by the database automatically into dates with the session's current date format settings.
Before you do the CSV file import, you need to set the session's date format to the format of any date column strings in the file, or, better, match it with the date format of the client with which you exported the data.
Set the same date format before exporting as well as before importing.
If it's SQL Developer, you should be able to set it in Preferences (I think it's shown in that demo video).
Do the same before importing too.
Make sure that the data file's date column strings match the format of the importing session.
If you want to programmatically set the date format, use something like
ALTER SESSION SET NLS_DATE_FORMAT = 'DD-MON-YY HH24.MI.SS';

By the way, I think you should have asked this question in the SQL/PLSQL forum, since it isn't particularly related to APEX.
Please update your forum handle to something friendlier than "user13344939" -
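The "same format on both sides" rule is easy to demonstrate. A small Python sketch with a format string analogous to the NLS_DATE_FORMAT 'DD-MON-YY HH24.MI.SS' above:

```python
from datetime import datetime

FMT = "%d-%b-%y %H.%M.%S"  # analogue of 'DD-MON-YY HH24.MI.SS'

d = datetime(2013, 2, 4, 19, 57, 10)
exported = d.strftime(FMT)                    # the date becomes a plain string in the CSV
restored = datetime.strptime(exported, FMT)   # the import parses it with the SAME format
```

Parsing with a different format string fails or silently yields the wrong date, which is exactly the CSV import symptom described above.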
hi all,
I have a requirement where I need to update data from one program periodically, somewhere; when the data is required, I run another program, get that data from memory and display it.
How can I do this? The two programs run at different times.
1st prog:
EXPORT itab TO MEMORY ID 'XX'.
2nd prog:
IMPORT itab FROM MEMORY ID 'XX'.
These are useful when both programs run at the same time, i.e. the 2nd program is called from the 1st one, but I want the 2nd program to run separately.
How can I do this? Any other alternatives?
Thanks in Advance
Regards,
Chandu

Thanks for your reply.
I have a huge amount of data.
Is there any other way than writing to the application server and then reading it back?
I think it's possible through the SHARED MEMORY concept; using this we can use global memory, but I don't know how to do that.
Any help? If anyone knows, please help me.
Regards,
Chandu
Chandu -
Export and import data to an external accounting software
I have to make a link between SAGE accounting and B1 for a French customer.
I want to export sales and purchase invoices to SAGE
and import customer and supplier payments.
I would like to know what the consequences would be in B1.
Would it be "transparent" for the users?
For example, do I have to make the Year End Closing?
Are there other things that I have to do besides making the settings as if I were using the B1 accounting?
Thanks for your help
Christophe FLAMENT

Hi,
What happens if we don't use accounting functionalities ?
Every sales, purchase and inventory transaction that is executed in Business One automatically generates an accounting document. So, there will be no impact as such on the system.
Is there any problem acting like this? And what kind of problem?
In the very first place, I would never suggest using any other ERP when SAP B1 is implemented, as SAP B1 itself is very robust and can handle almost all the requirements of an SMB.
Also, I am really not sure about the functionality of the SAGE accounting software.
In general, we don't suggest any customer use multiple systems, as it not only causes duplication of work, it may also sometimes result in some data not being recorded. As a result there is every chance that neither system will have a complete set of information.
Do we have to do something in the accounting?
Depends on what kind of transactions you want to process through B1.
In case you're processing sales, purchase and inventory related transactions, you have to process the payments and receipts related to the same. Also, ensure proper monthly (if necessary) & yearly closing.
Assign points if useful
Thanks,
Srikanth -
I have two tables; one is a recreation (and rename) of the other. What is the best/easiest way to transfer the data over to the new table? I am thinking: insert into <new table> (select * from <old table> [where clause]).
Do I need to build a loop?

It always depends on your data, but maybe these solutions will come in handy for you:
First solution:
To insert rows use (note: a trailing "nologging" in the SELECT would only be parsed as a table alias, not an option; NOLOGGING is a table attribute, set beforehand with ALTER TABLE t10 NOLOGGING):
[email protected]> insert /*+ APPEND */ into t10 select * from hr.employees
2 /
107 rows created.
Elapsed: 00:00:00.01
When you need to populate table with new data:
truncate table t10;
Then insert all rows once more:
[email protected]> insert /*+ APPEND */ into t10 select * from hr.employees
2 /
Elapsed: 00:00:00.03
Second solution:
[email protected]> insert into t10 select * from hr.employees where rownum < 50
2 /
49 rows created.
Elapsed: 00:00:00.01
[email protected]> insert into t10 select * from hr.employees where employee_id not in (select employee_id from t10)
2 /
58 rows created.
These are just two very simple ones, for sure not optimal, but you did not specify many details.
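For reference, the INSERT ... SELECT pattern from the question needs no client-side loop at all; a minimal sketch using sqlite3 as a stand-in database:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE old_t (id INTEGER, val TEXT)")
con.executemany("INSERT INTO old_t VALUES (?, ?)", [(1, "a"), (2, "b")])

# The renamed/recreated table: copy everything in one statement --
# the database does the looping, not your code.
con.execute("CREATE TABLE new_t (id INTEGER, val TEXT)")
con.execute("INSERT INTO new_t SELECT * FROM old_t")  # add a WHERE clause if needed

count = con.execute("SELECT COUNT(*) FROM new_t").fetchone()[0]
```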
I hope this will help you somehow
Best Regards
Krystian Zieja / mob -
How to create the Export Data and Import Data using flat file interface
Hi,
Based on the requirement below, please let me know how to export and import data using a flat file interface.
Please provide the steps involved.
BW/BI - Recovery Process for SNP data.
For each SNP InfoProvider,
create:
1) Export Data:
1.a) Create an export data source, InfoPackage, comm structure, etc. necessary to create an ASCII fixed length flat file on the XI
ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
1.b) All fields in each InfoProvider should be exported and included in the flat file.
1.c) A process chain should be created for each InfoProvider with a start event.
1.d) If the file exists on the target drive it should be overwritten.
1.e) The exported data file name should include the InfoProvider technical name.
1.f) Include APO Planning Version, Date of Planning Run, APO Location, Calendar Year/Month, Material and BW Plant as selection criteria.
2) Import Data:
2.a) Create a flat file source system InfoPackage, comm structure, etc. necessary to import ASCII fixed length flat files from the XI
ctnhsappdata\iface\SCPI063\Out folder for each SNP InfoProvider.
2.b) All fields for each InfoProvider should be mapped and imported from the flat file.
2.c) A process chain should be created for each InfoProvider with a start event.
2.d) The file should be archived in the
ctnhsappdata\iface\SCPI063\Archive directory. Each file name should have the date appended in YYYYMMDD format. Each file should be deleted from the \Out directory after it is archived.
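Step 2.d is plain file housekeeping once the load succeeds; here is a hedged Python sketch (the directory names are placeholders for the \Out and \Archive folders above, and the file name is made up):

```python
import shutil
import tempfile
from datetime import date
from pathlib import Path

def archive_file(out_dir: Path, archive_dir: Path, name: str) -> Path:
    """Copy <name> to the archive with YYYYMMDD appended, then delete the original."""
    src = out_dir / name
    dest = archive_dir / f"{src.stem}_{date.today():%Y%m%d}{src.suffix}"
    shutil.copy2(src, dest)
    src.unlink()  # remove from the Out directory only after archiving
    return dest

# Demo with temporary folders standing in for ...\Out and ...\Archive.
base = Path(tempfile.mkdtemp())
out_dir, archive_dir = base / "Out", base / "Archive"
out_dir.mkdir(); archive_dir.mkdir()
(out_dir / "SNP_CUBE1.txt").write_text("exported rows")
archived = archive_file(out_dir, archive_dir, "SNP_CUBE1.txt")
```

Deleting only after the copy succeeds means a failed archive step never loses the export.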
Thanks in advance.
Tyson

Here's some info on working with plists:
http://developer.apple.com/documentation/Cocoa/Conceptual/PropertyLists/Introduction/chapter1_section1.html
They can be edited with any text editor. Xcode provides a graphical editor for them - make sure to use the .plist extension so Xcode will recognize it. -
Using an LDAP server as a data source?
I'm evaluating data services and one of our requirements is to be able to retrieve data from an LDAP server. This isn't for authentication.
We store information about users in an LDAP directory. The workflow I'm testing retrieves a customer number from a DB2 database and then retrieves the customer information in the LDAP directory.
Is there a way to do this without having to write a bunch of code? The "import metadata" menu doesn't list LDAP as one of the data providers.
thanks!

There is no point-and-click (Import Data Source Metadata) way to use an LDAP server as a data source. You have to use the Java Function provided on dev2dev. If you need help with it, please post here.
- Mike