Using Data Import Handler
This question was posted in response to the following article: http://help.adobe.com/en_US/ColdFusion/10.0/Developing/WSe61e35da8d318518-1acb57941353e8b4f85-7ffe.html
This page should mention that the Data Import Handler is not available in ColdFusion 10 Standard (it is available only in the Enterprise, Trial, and Developer editions).
Similar Messages
-
Data Import defaulting Null date to system date
I am trying to import Campaigns using Data Import, and when End Date is NULL it defaults to system date. Is there any way to avoid this default? Even when I don't "map" the End Date, it still defaults it.
Thanks!

This field will always populate itself; what you have to do is establish a sensible date to put in this section. What I have done is create a system default of today() + 14, which means the end date will be roughly 14 days after the record was created. It is then up to the person running the campaign to change it if required.
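The today() + 14 default described above can be sketched outside the CRM (plain Python, purely illustrative — the function name is made up, not part of any Data Import tool):

```python
from datetime import date, timedelta

def default_end_date(end_date=None):
    """Fall back to roughly two weeks out when no end date is supplied."""
    return end_date if end_date is not None else date.today() + timedelta(days=14)

# A NULL (None) end date gets the 14-day default; an explicit date is kept.
print(default_end_date())                  # e.g. today's date + 14 days
print(default_end_date(date(2015, 3, 1)))  # 2015-03-01
```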
-
How to move a selected row data from one grid to another grid using button click handler in flex4
hi friends,
I am building a Flex 4 MXML web application and I am stuck on this concept; please can someone help?
I am using two separate forms, each containing one data grid.
The first data grid has 5 rows, and there is one button outside the data grid labelled MOVE. When I select a row in the first data grid and click the MOVE button, that row should disappear from the current data grid and become visible in the second data grid.
I don't want the drag-and-drop method; I want to do this only with a button click handler.
How do I do this?
Any suggestions or snippets of code are welcome.
Thanks,
B.Venkatesan

Hi,
You can get an idea from the following code and also from the link I am providing.
Code:
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute"
width="613" height="502" viewSourceURL="../files/DataGridExampleCinco.mxml">
<mx:Script>
<![CDATA[
import mx.collections.ArrayCollection;
import mx.binding.utils.BindingUtils;
[Bindable]
private var allGames:ArrayCollection;
[Bindable]
private var selectedGames:ArrayCollection;
private function initDGAllGames():void
{
    allGames = new ArrayCollection();
    allGames.addItem({name: "World of Warcraft",
        creator: "Blizzard", publisher: "Blizzard"});
    allGames.addItem({name: "Halo",
        creator: "Bungie", publisher: "Microsoft"});
    allGames.addItem({name: "Gears of War",
        creator: "Epic", publisher: "Microsoft"});
    allGames.addItem({name: "City of Heroes",
        creator: "Cryptic Studios", publisher: "NCSoft"});
    allGames.addItem({name: "Doom",
        creator: "id Software", publisher: "id Software"});
}

protected function button1_clickHandler(event:MouseEvent):void
{
    BindingUtils.bindProperty(dgSelectedGames, "dataProvider", dgAllGames, "selectedItems");
}
]]>
</mx:Script>
<mx:Label x="11" y="67" text="All our data"/>
<mx:Label x="10" y="353" text="Selected Data"/>
<mx:Form x="144" y="10" height="277">
<mx:DataGrid id="dgAllGames" width="417" height="173"
creationComplete="initDGAllGames()" dataProvider="{allGames}" editable="false">
<mx:columns>
<mx:DataGridColumn headerText="Game Name" dataField="name" width="115"/>
<mx:DataGridColumn headerText="Creator" dataField="creator"/>
<mx:DataGridColumn headerText="Publisher" dataField="publisher"/>
</mx:columns>
</mx:DataGrid>
<mx:FormItem label="Label">
<mx:Button label="Move" click="button1_clickHandler(event)"/>
</mx:FormItem>
</mx:Form>
<mx:Form x="120" y="333">
<mx:DataGrid id="dgSelectedGames" width="417" height="110" >
<mx:columns>
<mx:DataGridColumn headerText="Game Name" dataField="name" width="115"/>
<mx:DataGridColumn headerText="Creator" dataField="creator"/>
<mx:DataGridColumn headerText="Publisher" dataField="publisher"/>
</mx:columns>
</mx:DataGrid>
</mx:Form>
</mx:Application>
Link:
http://social.msdn.microsoft.com/Forums/en-US/winformsdatacontrols/thread/ae9bee8d-e2ac-43c5-9b6d-c799d4abb2a3/
Thanks and Regards,
Vibhuti Gosavi | [email protected] | www.infocepts.com -
Issue found in EHS when using the standard specification data import process
Dear EHS community
Having used EHS classic for a long time, we have detected an issue in the EHS standard import. During maintenance of EHS data, normally via CG02, the system always uses the default "Data origin" specified in customizing when storing records in the EHS tables (e.g. ESTRH, ESTRI etc.). In the standard specification data import process, one can define a different "Data origin". We are using an import file with the default data origin and executed the import. Now a strange effect has been detected (though not always) for the update of identifiers. For the import you must nominate at least one identifier. If the identifier is found, normally no update happens and only the value assignment data is inserted (or updated). If the identifier is not found, it gets inserted at spec level in ESTRI. During the update, the "Data origin" of the identifier already present in the system (which matched the identifier at file level) was changed, but not the identifier as such. Every data record at value assignment level received the default data origin. So far there is no explanation for this behaviour. If the default "Data origin" was "SAP" (as the term), this value was changed to "space". Any explanation of this effect (or any idea regarding it) is appreciated.
C.B.
PS: analysis of the change logs in EHS etc. executed so far clearly indicates that an "Update" happened on the identifier, but only the field SRSID is affected. The EHS import is quite old and therefore very stable.
PPS: I found a thread talking about the import file:
spec import_inheritance data
The example shown there is like:
+BS
+BV $ESTRH
SRSID EH&S
SUBID 000000385000
SUBCAT REAL_SUB
AUTHGRP EHS_PS
+EV
+BV $ESTRI
SRSID EH&S
IDTYPE NAM
IDCAT EHS_OLD
IDENT XY0002
ORD 0001
+EV
+BV SAP_EHS_1013_001
$ESTVA-SRSID EH&S
SAP_EHS_1013_001_VALUE N09.00101280
+EV
If you compare this with the SAP help, normally you will find "SRSID" only at the beginning of the file. Here this field is nominated often, at the level of ESTRH as well as ESTRI.
PPS: e.g. refer to: TCG56 EHS: Data Origin - SAP Table - ABAP

Dear Ralph,
first, thanks for the feedback. Regarding the content provider: we need to check that at a deeper level.
Regarding the import, maybe my explanation was not good enough.
Imagine this case:
You have a specification in the system to which you would like to add e.g. density data. To do so you need at least one identifier, which you must nominate during the import. As long as this identifier is "identical" in the system and in the file, the identifier should not be "changed/affected" etc., and only the additional data should be loaded. This was the process we used. Now we have detected that this seems not to be the real effect: the identifier that is part of the file is "updated" in EH&S. In the example above somebody used this logic:
+BV $ESTRI
SRSID EH&S
IDTYPE NAM
IDCAT EHS_OLD
IDENT XY0002
ORD 0001
+EV
Nearly the same is used in our process. The only difference is that we do not define the "SRSID". The same is true for any other data in the file; SRSID is never specified.
What is happening now:
e.g. the "density" data is added with SRSID "EH&S". This effect is "normal", as by default SRSID should be EH&S (it is defined as such in customizing), and because at the top of the file the ID is likewise "EH&S". In the system we have "XY0002" with SRSID EH&S. With this upload approach, the only difference afterwards is that in the system XY0002 gets "blank" as SRSID (and there is no data origin "blank" defined). Up to today my understanding was clearly: no update should happen on the identifier. This seems not to be the case. Is my understanding here wrong? Or is the SRSID in the load file really mandatory at ESTRI level to avoid this effect? I hope that you can provide some feedback regarding this.
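To make the point above concrete, here is a minimal sketch (a hypothetical helper, not part of EH&S or any SAP tool) that scans a transfer file and flags $ESTRI identifier blocks in which SRSID is not specified, assuming the +BV/+EV block structure shown in the examples:

```python
# Hypothetical check for EH&S-style transfer files: report $ESTRI
# identifier blocks that do not specify SRSID, since (per the thread)
# omitting it appears to blank the identifier's data origin on import.

def estri_blocks_missing_srsid(lines):
    """Return starting line numbers of '+BV $ESTRI' blocks lacking SRSID."""
    missing = []
    in_estri = False
    has_srsid = False
    start = 0
    for no, line in enumerate(lines, 1):
        token = line.strip()
        if token.startswith("+BV"):
            in_estri = token.split()[-1] == "$ESTRI"
            has_srsid = False
            start = no
        elif token == "+EV":
            if in_estri and not has_srsid:
                missing.append(start)
            in_estri = False
        elif in_estri and token.startswith("SRSID"):
            has_srsid = True
    return missing

transfer = """\
+BV $ESTRI
SRSID EH&S
IDTYPE NAM
IDENT XY0002
+EV
+BV $ESTRI
IDTYPE NAM
IDENT XY0003
+EV
""".splitlines()

print(estri_blocks_missing_srsid(transfer))  # [6] -- the XY0003 block
```

Running such a check before the import would at least make visible which identifiers risk the "blank SRSID" effect described above.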
C.B.
PS: referring to: Example: Transfer File for Specifications - Basic Data and Tools (EHS-BD) - SAP Library
The header of the import file should look like:
Comment
+C
Administrative section
Character standard
+SC
ISO-R/3
Identification (database name)
+ID
IUCLID
Format version
+V
2.21
Export date
+D
19960304
Key date for export
+VD
19960304
Set languages for export
+SL
E
Date format
+DF
DD.MM.YYYY
In our case +ID = EH&S (as this is the value in the export file).
In this example this additional one is shown:
Begin table
+BV
$ESTRI
Table field
IDTYPE
NAM
Table field
IDCAT
IUPAC
Table field
IDENT
anisole
Table field
LANGU
E
Table field
OWNID
ID1
Therefore no SRSID is specified. And this is the data in our file (at a high level), and the "only" change is that the identifier gets its SRSID "deleted". -
Use of ODI in Data Import scenario
Hi,
We are contemplating to use ODI for a Data Import scenario. The flow goes something like this:
1. A user-specified flat file or XML, with a mapping between the file columns and the base table columns, serves as the input.
(Note that the file column names could be anything and that is why we use mapping).
2. This file data is stored in a stager table and certain validations are run on this data.
3. This data is then imported into the base tables.
I assume we cannot use ODI for step 1 as the file columns are not fixed and can vary. We need to programmatically interpret the data from the mappings. (Is there a way to do this in ODI?).
If we use ODI for step 3 to import data from stager to base tables:
- If we have a million records to be imported, then how performant is ODI? Do we need to invoke ODI in batches of (a few thousands) to improve performance?
Thanks in advance,
Raghu

Hi Jont,
Thanks for your reply..
Here is an example of the mapping that we use:
Flat File columns:
AccName
AccLoc
Mapping (Specified by the user at run time):
AccName -->(Maps to) Account.Name
AccLoc --> (Maps to) Account.Location
The user would map the file columns to the final target entity fields (like Account.Name) as above.
Since we have to store this data in an intermediate staging table, we have an internal mapping (which is fixed), like
Account.Name -->(maps to) AccStager.Name
Account.Location -->(Maps to) AccStager.Location
where AccStager.Name is the staging table field name.
Thus, by using these two sets of mappings, we store the file data in the staging table.
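The two-step mapping described above can be sketched as a simple composition (plain Python; the column and table names are the illustrative ones from the example, not a real ODI API):

```python
# Compose the user-supplied file->entity mapping with the fixed
# entity->stager mapping to land file columns in the staging table.

user_mapping = {            # specified by the user at run time
    "AccName": "Account.Name",
    "AccLoc": "Account.Location",
}
stager_mapping = {          # fixed internal mapping
    "Account.Name": "AccStager.Name",
    "Account.Location": "AccStager.Location",
}

def to_stager_row(file_row):
    """Map one flat-file record to staging-table column names."""
    return {stager_mapping[user_mapping[col]]: value
            for col, value in file_row.items()}

row = {"AccName": "Acme Corp", "AccLoc": "Boston"}
print(to_stager_row(row))
# {'AccStager.Name': 'Acme Corp', 'AccStager.Location': 'Boston'}
```

Because the first mapping is supplied at run time, only the composed result needs to be fixed, which is what makes the staging-table layout stable even though the file layout varies.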
Hope this is clear...
Thanks,
Raghuveer -
Data import automation using webservices
Hi,
We have a requirement to load data into Oracle On demand monthly from an external application. We want to automate the task using webservices.
Did anyone of you work on such automation? If yes, request you please provide me the details of the same.
Thanks in advance.
regards,
m

Hi,
Automated data import/export to or from Oracle SOD using a web service is very much a common requirement these days, and we at CRMIT should be able to do that for you.
Drop me your requirement to [email protected] so that we can help you take it forward.
Regards,
Deepak H Andeli -
Hi everyone!
My client wants to use the import package from Data Manager to upload his data.
I'm looking for a how-to that explains how I can configure it.
Can somebody help me?
Thanks!!!

Hi,
I'm going to try this last one tomorrow.
Now I have this little problem: if my data file has a header with a different column order, I get a warning and the header line is rejected.
I introduced a mapping of the columns. My transformation file is like the one below, but I still have the same problem.
Any suggestions?
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT= -1
ROUNDAMOUNT=
*MAPPING
CATEGORY=*COL(1)
ACCOUNT=*COL(2)
CECO=*COL(3)
CLIENTES=*COL(4)
CURRENCY=*COL(5)
DATASRC=*COL(6)
ENTITY=*COL(7)
INTCO=*COL(8)
ORGVENTAS=*COL(9)
PRODUCTO=*COL(10)
TIME=*COL(11)
SIGNEDDATA=*COL(12)
*CONVERSION
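Since *COL(n) mappings are positional, a header in a different order will not line up with them. As an illustration outside BPC (a hypothetical pre-processing step, not a Data Manager feature), one could reorder the file's columns to the expected order before running the import:

```python
# Reorder a tab-delimited file's columns to a fixed expected order,
# so that positional *COL(n) mappings line up regardless of the
# header order in the incoming file.

EXPECTED = ["CATEGORY", "ACCOUNT", "CECO", "CLIENTES", "CURRENCY",
            "DATASRC", "ENTITY", "INTCO", "ORGVENTAS", "PRODUCTO",
            "TIME", "SIGNEDDATA"]

def reorder(lines, expected=EXPECTED):
    """Rewrite tab-delimited lines so columns appear in `expected` order."""
    header = lines[0].split("\t")
    index = [header.index(col) for col in expected]  # ValueError if a column is missing
    out = ["\t".join(expected)]
    for line in lines[1:]:
        fields = line.split("\t")
        out.append("\t".join(fields[i] for i in index))
    return out

# Small demo with just two columns swapped:
demo = ["ACCOUNT\tCATEGORY", "4000\tACTUAL"]
print(reorder(demo, expected=["CATEGORY", "ACCOUNT"]))
# ['CATEGORY\tACCOUNT', 'ACTUAL\t4000']
```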
Thanks!
Regards -
Using export/import to migrate data from 8i to 9i
We are trying to migrate all data from an 8i database to a 9i database. We plan to migrate the data using the export/import utility so that we can keep the current 8i database intact. The 8i and 9i databases will reside on the same machine. Our 8i database size is around 300GB.
We plan to follow below steps :
Export data from 8i
Install 9i
Create tablespaces
Create schema and tables
create user (user used for exporting data)
Import data in 9i
Please let me know if the par file below is correct for the export:
BUFFER=560000
COMPRESS=y
CONSISTENT=y
CONSTRAINTS=y
DIRECT=y
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE
TRIGGERS=y
TTS_FULL_CHECK=TRUE
Thanks,
Vinod Bhansali

I recommend you change some parameters and remove others:
BUFFER=560000
COMPRESS=y          -- this will give a better storage structure (it is good)
CONSISTENT=y
CONSTRAINTS=y
DIRECT=n            -- if you set this parameter to yes you can have problems with some objects
FEEDBACK=1000
FILE=dat1.dmp, dat2.dmp, dat3.dmp (more filenames here)
FILESIZE=2048GB
FULL=y
GRANTS=y            -- this value is the default (not necessary)
INDEXES=y
LOG=export.log
OBJECT_CONSISTENT=y -- (start the database in restricted mode and do not set this param)
PARFILE=exp.par
ROWS=y
STATISTICS=ESTIMATE -- this value is the default (not necessary)
TRIGGERS=y          -- this value is the default (not necessary)
TTS_FULL_CHECK=TRUE
You can see which parameters are not needed if you run this command:
[oracle@ozawa oracle]$ exp help=y
Export: Release 9.2.0.1.0 - Production on Sun Dec 28 16:37:37 2003
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
You can let Export prompt you for parameters by entering the EXP
command followed by your username/password:
Example: EXP SCOTT/TIGER
Or, you can control how Export runs by entering the EXP command followed
by various arguments. To specify parameters, you use keywords:
Format: EXP KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: EXP SCOTT/TIGER GRANTS=Y TABLES=(EMP,DEPT,MGR)
or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
USERID must be the first parameter on the command line.
Keyword Description (Default) Keyword Description (Default)
USERID username/password FULL export entire file (N)
BUFFER size of data buffer OWNER list of owner usernames
FILE output files (EXPDAT.DMP) TABLES list of table names
COMPRESS import into one extent (Y) RECORDLENGTH length of IO record
GRANTS export grants (Y) INCTYPE incremental export type
INDEXES export indexes (Y) RECORD track incr. export (Y)
DIRECT direct path (N) TRIGGERS export triggers (Y)
LOG log file of screen output STATISTICS analyze objects (ESTIMATE)
ROWS export data rows (Y) PARFILE parameter filename
CONSISTENT cross-table consistency(N) CONSTRAINTS export constraints (Y)
OBJECT_CONSISTENT transaction set to read only during object export (N)
FEEDBACK display progress every x rows (0)
FILESIZE maximum size of each dump file
FLASHBACK_SCN SCN used to set session snapshot back to
FLASHBACK_TIME time used to get the SCN closest to the specified time
QUERY select clause used to export a subset of a table
RESUMABLE suspend when a space related error is encountered(N)
RESUMABLE_NAME text string used to identify resumable statement
RESUMABLE_TIMEOUT wait time for RESUMABLE
TTS_FULL_CHECK perform full or partial dependency check for TTS
VOLSIZE number of bytes to write to each tape volume
TABLESPACES list of tablespaces to export
TRANSPORT_TABLESPACE export transportable tablespace metadata (N)
TEMPLATE template name which invokes iAS mode export
Export terminated successfully without warnings.
[oracle@ozawa oracle]$
Joel Pérez
Select table when import using Data Pump API
Hi,
Sorry for the trivial question. I export the data using the Data Pump API, in "TABLE" mode.
So all tables will be exported in one .dmp file.
My question is: how do I then import only a few tables using the Data Pump API? How do I define the "TABLES" property as with the command-line interface?
Should I use the DATA_FILTER procedures? If so, how do I do that?
Really thanks in advance
Regards,
Kahlil

Hi,
You should use the METADATA_FILTER procedure for this.
eg:
dbms_datapump.metadata_filter
  (handle1
  ,'NAME_EXPR'
  ,'IN (''TABLE1'', ''TABLE2'')'
  );
Regards
Anurag -
Issues importing/"using" date/datetime in Julia. Is it even present?
I'm trying to use Date or DateTime in a script, which requires "import Dates" or "using Dates". However, it can't find the module. I checked the file list in the package info and it doesn't seem to be present. Could someone confirm that this is the case?
[edit] I can confirm that I was doing everything correctly. I switched to julia-git in the AUR, and it's working fine after I added the path ( push!(LOAD_PATH,"/usr/share/julia/base") ). I'd still like it if someone could confirm that I'm not just looking in the wrong place.
Last edited by nstgc (2015-02-09 00:58:01)

Sorry for the late reply. No, this module is still not included in the installation as far as I can tell as of today. Do know that today I merely checked "/usr/share/julia/base"; however, last time I checked it was nowhere to be seen. I also tried "julia> import Dates".
Also /usr/share/julia/base is not in the load path.
julia> LOAD_PATH
2-element Array{Union(ASCIIString,UTF8String),1}:
"/usr/local/share/julia/site/v0.3"
"/usr/share/julia/site/v0.3"
I don't know if this is a bug or if this version just doesn't include this package. -
Using the Import Test Data Wizard
Using Oracle HTMLDB 1.6.0.00.87
Whenever we try to use the import test data wizard, even using a simple text file such as: "Forname","Surname"
Joe,Bloggs
the file does not import correctly. We are trying to import to a new table and uploading a txt file with the above content. What we get in the set table properties is something like: Column Names : rom_wwv_flow_file_objects
Data Type:VARCHAR2
Format:
Column Length: 30
Upload: Yes
Row 1: Where n
Any ideas? We tried the same test at the UK Oracle user group conference with success. Is there a set up problem on our server?
Cheers
Ty

Problem solved.
Ensure the correct character set is selected when importing.
Can data pump handle export/import of a sequence explicitly?
Is it possible to just export/import a database sequence explicitly using data pump.
I know we can export/import sequence by exporting/importing the user in which it is defined.
However, I am looking for a way to export/import individual sequences without having to export/import the entire user.
Thanks

Hi, you can use the INCLUDE clause in the expdp command; with this option you can specify the objects for the export and import process. You can review the following documentation for more information about this topic:
INCLUDE
Regards. -
Simple storage unit management using 'real' handling unit data
Hi all,
we currently have a WM setup that uses full handling unit management in the warehouse. After some years we have found that the HU-managed warehouse is not the best solution for our business.
Inside WM we no longer want to have the handling units. The assigned storage location is not HU managed.
Now to my question:
We receive from other plants finished goods via inbound delivery. All goods are packed on Handling Units.
When doing the goods receipt, is it possible to use the HU data (VEKP, VEPO) and create the WM storage units according to it? So one handling unit from the inbound delivery will become one storage unit.
thanks in advance for your answers.
Michael

Hi Patrick,
We currently have SAP WM with HU management running, but we want to go back to WM without HU management, only with storage unit management.
When getting an inbound delivery from another plant (in the same client), the goods are batch managed and packed on HUs. I am looking for a way to use the HU data from the inbound delivery to create the WM storage units, so that one HU becomes one WM storage unit.
So I will post to an ERP community.
thanks
Michael -
Hindi font data import by using datapump
Dear All,
I have a dmp file in which some tables contain data in a Hindi font (Mangal).
When I import the dmp file using Data Pump on another system, the Hindi font data shows up as '???????'.
Please suggest how I should import the dmp file so that I get my data in the proper format.

Characterset and Compatibility Issues
Unlike previous tools, the Data Pump uses the characterset of the source database to ensure proper conversion to the target database. There can still be issues with character loss in the conversion process.
Note 227332.1 NLS considerations in Import/Export - Frequently Asked Questions
Note 457526.1 Possible data corruption with Impdp if importing NLS_CHARACTERSET is different from exporting NLS_CHARACTERSET
Note 436240.1 ORA-2375 ORA-12899 ORA-2372 Errors While Datapump Import Done From Single Byte Characterset to Multi Byte Characterset Database
Note 553337.1 Export/Import DataPump Parameter VERSION - Compatibility of Data Pump Between Different Oracle Versions [Video]
Note 864582.1 Examples using Data pump VERSION parameter and its relationship to database compatible parameter -
Importing incoming payments using data transfer workbench
Hi experts
I am using data transfer workbench for incoming payments, but incoming receivables are checks.
I am using three template files:
1.Payments
Recordkey Cardcode Docdate
1 A0001 21102009
2.Payments_check
Recordkey Lineno Checkaccount Checknumber Checksum Duedate
1 0 161016 1234 11500 21102009
3.Payment Invoice
Recordkey Lineno Docentry SumApplied
1 0 616000012 11500
In the DocEntry field I am placing the invoice number. I also tried placing the payment number; in both cases I get the same error: "Invalid document number for RCT2.DocEntry. Application-defined or object-defined error 65171".
What should i do to resolve this error.
Thanks and regards
Rohan

Hi Rohan,
Document number is usually not the same as your DocEntry. Run a simple query to find it out:
SELECT T0.DocEntry FROM OINV T0 WHERE T0.DocNum = '[%0]'
Also the Date format should be YYYYMMDD.
Thanks,
Gordon
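As a quick illustration of the date-format note above (plain Python, unrelated to the DTW templates themselves — the helper name is made up), converting a date into the YYYYMMDD form the import expects:

```python
from datetime import date

def dtw_date(d):
    """Format a date as YYYYMMDD, the form expected in the import templates."""
    return d.strftime("%Y%m%d")

# The thread's sample value 21102009 (DDMMYYYY) would instead be written as:
print(dtw_date(date(2009, 10, 21)))  # 20091021
```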