What is a Flat file adapter?
What is a Planning adapter?
What are all the adapters required to load the data from Excel to Planning application?
I agree with Gary's previous post that users should make some effort to search before posting here. This forum is meant for posting and answering problems and difficulties that developers face and that are NOT normally covered in manuals or references, so it is not for defining a keyword or explaining a process that is covered in the documentation.
Please refer to your documentation, Google your question, or use en.wikipedia.org before posting in this forum.
Thank you.
Similar Messages
-
Hello SAP experts,
Could you please explain the concept of creating flat files? What are they used for, and where do we use them?
Thank you very much. Full points will be awarded.
bj

Hi BJ,
A flat file is a plain text file which usually contains one record per line. Within such a record, the single fields can be separated by delimiters, e.g. commas, or have a fixed length. In the latter case, padding may be needed to achieve this length. Extra formatting may be needed to avoid delimiter collision. There are no structural relationships between the records.
It is used to upload or download data in a specified format.
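To make the two record layouts described above concrete, here is a minimal Python sketch; the sample values are invented for illustration:

```python
import csv
import io

# Delimited flat file: one record per line, fields separated by commas.
delimited = "10001,20081902,US\n10002,20081903,DE\n"
records = list(csv.reader(io.StringIO(delimited)))

# Fixed-length flat file: each field occupies a fixed column range,
# padded (here with spaces) to reach its width.
fixed = "10001 20081902US\n"
fields = (fixed[0:6].strip(), fixed[6:14].strip(), fixed[14:16])
```

In both cases there are no structural relationships between records; each line stands alone.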
Cheers !!!
Imzo -
FTP connection error while using the flat file adapter
While using the file adapter on the receiver end for a proxy-to-file scenario,
I am giving parameters like the server IP address and the port. I don't know which port to give; by default it shows 21. How can I check whether the connection is correct?
Moreover, I get this error:
Message processing failed. Cause: com.sap.aii.af.ra.ms.api.RecoverableException: Error when getting an FTP connection from connection pool: com.sap.aii.af.service.util.concurrent.ResourcePoolException: Unable to create new pooled resource: ConnectException: Connection timed out: connect
Please help me with this. Thanks.

Hi Sridhar,
First check whether the FTP server is started, then check that you can reach it:
go to Run -> cmd and ping the IP address you are using, to see whether you get a response from the FTP server.
Try to log in to the FTP server you have mentioned in the communication channel using the user name and password, to check whether you have permission to log in to the server.
Also check whether the folder you are trying to access grants delete/read/write permission.
Restart the FTP server and try it again.
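The checks above can also be scripted. Here is a hedged Python sketch using the standard ftplib module; the host, credentials, and folder passed in are placeholders, not values from the post:

```python
from ftplib import FTP, error_perm

def check_ftp(host, user, password, folder, port=21, timeout=10):
    """Return True if we can connect, log in, and list the target folder."""
    try:
        ftp = FTP()
        ftp.connect(host, port, timeout=timeout)   # fails fast on a bad host/port
        ftp.login(user, password)                  # verifies the credentials
        ftp.cwd(folder)                            # verifies folder access
        ftp.nlst()                                 # verifies read permission
        ftp.quit()
        return True
    except (OSError, error_perm):
        return False
```

A False result narrows the problem down: a timeout points at the host/port, error_perm at credentials or folder permissions.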
Regards
Sridhar Goli -
Converting Idoc flat file representation to XML
Hi ,
I went through the guide How To Convert Between IDoc and XML in XI 3.0. I'm concerned with the second part of the guide, which covers converting from the flat file representation of an IDoc to XML. Can anyone tell me what other design and configuration objects need to be created for this scenario (message types, interfaces, mapping, etc.)?
Also, which step of the pipeline does the converted XML go to?
The program also expects a filename; what if I want to pass the file name dynamically? Any ideas on this one?
Hope someone replies this time.........:)
Thanks for your help and for improving my knowledge.
Thanks
Advait Gode.

Hi Advait,
Let me give you a small overview on how inbound IDOCs work before answering your question-
The control record is the key to identifying the routing of the IDOC. If you think of IDOCs as normal mail (post), the control record is the envelope. It contains information like who the sender is, who the receiver should be, and what the envelope contains (no different from receiving mail/letters by post).
Then the data records contain the actual data, in our example would be the actual letter. The status records contain the tracking information.
Traditionally SAP's IDOC interface (even before XI comes in picture) has utility programs to post incoming IDOCs in to SAP. One such program is RSEINB00 which basically takes the IDOC file name and the port as input. This program opens the file and posts the contents to the SAP IDOC interface (which is a set of function modules) via the port. The idea is to read the control record and determine the routing and further posting to application. Note that one information in the control record is the message type/idoc type which decides how the data records need to be parsed.
Now in XI scenario, what happens if we receive data as flat file? Normally, we use flat file adapter and in the file adapter we provide information on how to parse the file. But, if the incoming file is flat and in IDOC structure, why do we have to configure the file adapter, when the parsing capability is already available using RSEINB00/Standard IDOC interface.
This is the reason the guide suggests you use RSEINB00. Now, your concern is what to do if you need to provide a dynamic filename. My idea is to write a wrapper program. This would be an ABAP program in your integration engine. This program would determine the file name (based on logic that should be known to you) and then call program RSEINB00 using SUBMIT/RETURN. You would then schedule this ABAP program to run in the background on a fixed schedule.
There are other ways of handling your scenario as well but from limited information from your request, I will stop with this now. Post me if you have any more queries.
KK -
hi,
I have tried GoldenGate with Oracle and non-Oracle databases. Now I am trying the flat file adapter.
What I have done so far:
1. I downloaded the Oracle "GoldenGate Application Adapters 11.1.1.0.0 for JMS and Flat File Media Pack".
2. Kept it on the same machine where the database and the GG manager process exist. Port for the GG mgr process: 7809; flat file: 7816.
3. Following the GG flat file administrator's guide, page 9 --> configuration.
4. Extract process on GG manager process:
edit params FFE711
extract ffe711
userid ggs@bidb, password ggs12345
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
add extract FFE711, EXTTRAILSOURCE ./dirdat/oo
add rmttrail ./dirdat/pp, extract FFE711, megabytes 20
start extract FFE711
view report ffe711
Oracle GoldenGate Capture for Oracle
Version 11.1.1.1 OGGCORE_11.1.1_PLATFORMS_110421.2040
Windows (optimized), Oracle 11g on Apr 22 2011 03:28:23
Copyright (C) 1995, 2011, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-11-07 18:24:19
Operating System Version:
Microsoft Windows XP Professional, on x86
Version 5.1 (Build 2600: Service Pack 2)
Process id: 4628
Description:
** Running with the following parameters **
extract ffe711
userid ggs@bidb, password ********
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
CACHEMGR virtual memory values (may have been adjusted)
CACHEBUFFERSIZE: 64K
CACHESIZE: 1G
CACHEBUFFERSIZE (soft max): 4M
CACHEPAGEOUTSIZE (normal): 4M
PROCESS VM AVAIL FROM OS (min): 1.77G
CACHESIZEMAX (strict force to disk): 1.57G
Database Version:
Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
PL/SQL Release 11.1.0.7.0 - Production
CORE 11.1.0.7.0 Production
TNS for 32-bit Windows: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
Database Language and Character Set:
NLS_LANG environment variable specified has invalid format, default value will be used.
NLS_LANG environment variable not set, using default value AMERICAN_AMERICA.US7ASCII.
NLS_LANGUAGE = "AMERICAN"
NLS_TERRITORY = "AMERICA"
NLS_CHARACTERSET = "AL32UTF8"
Warning: your NLS_LANG setting does not match database server language setting.
Please refer to user manual for more information.
2011-11-07 18:24:25 INFO OGG-01226 Socket buffer size set to 27985 (flush size 27985).
2011-11-07 18:24:25 INFO OGG-01052 No recovery is required for target file E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, at RBA 0 (file not opened).
2011-11-07 18:24:25 INFO OGG-01478 Output file E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote is using format RELEASE 10.4/11.1.
** Run Time Messages **
5. On the flat file GGSCI prompt:
edit params FFR711
extract ffr711
CUSEREXIT E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params "E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\sample-dirprm\ffwriter.properties"
SOURCEDEFS E:\GoldenGate11gMediaPack\V26071-01\dirdef\vikstkFF.def
table ggs.vikstk;
add extract ffr711, exttrailsource ./dirdat/pp
start extract ffr711
view report ffr711
Oracle GoldenGate Capture
Version 11.1.1.0.0 Build 078
Windows (optimized), Generic on Jul 28 2010 19:05:07
Copyright (C) 1995, 2010, Oracle and/or its affiliates. All rights reserved.
Starting at 2011-11-07 18:21:31
Operating System Version:
Microsoft Windows XP Professional, on x86
Version 5.1 (Build 2600: Service Pack 2)
Process id: 5008
Description:
** Running with the following parameters **
extract ffr711
CUSEREXIT E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params "E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\sample-dirprm\ffwriter.properties"
E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\ggs_Windows_x86_Generic_32bit_v11_1_1_0_0_078\extract.exe running with user exit library E:\GoldenGate11gMediaPack\GGFlatFile\V22262-01\flatfilewriter.dll, compatiblity level (2) is current.
SOURCEDEFS E:\GoldenGate11gMediaPack\V26071-01\dirdef\vikstkFF.def
table ggs.vikstk;
CACHEMGR virtual memory values (may have been adjusted)
CACHEBUFFERSIZE: 64K
CACHESIZE: 1G
CACHEBUFFERSIZE (soft max): 4M
CACHEPAGEOUTSIZE (normal): 4M
PROCESS VM AVAIL FROM OS (min): 1.87G
CACHESIZEMAX (strict force to disk): 1.64G
Started Oracle GoldenGate for Flat File
Version 11.1.1.0.0
** Run Time Messages **
Problem I am facing:
I am not sure where to find the generated flat file;
even the reports show there is no data at the manager process.
I was expecting a replicat instead of an extract in the flat file FFR711.prm.
This is what I have done so far; please give me some pointers.
Thanks,
Vikas

Ok, I haven't run your example, but here are some suggestions.
Vikas Panwar wrote:
extract ffe711
userid ggs@bidb, password ggs12345
discardfile E:\GoldenGate11gMediaPack\V26071-01\dirrpt\EXTFF.dsc, purge
rmthost 10.180.182.77, mgrport 7816
rmtfile E:\GoldenGate11gMediaPack\V26071-01\dirdat\ffremote, purge, megabytes 5
ggsci> add extract FFE711, EXTTRAILSOURCE ./dirdat/oo
ggsci> add rmttrail ./dirdat/pp, extract FFE711, megabytes 20
ggsci> start extract FFE711
You of course need data captured from somewhere to test with. You could capture changes directly from a database and write those to a trail, and use that as a source for the flat-file writer; or, if you have existing trail data, you can just use that (I often test with old trails, with known data).
In your example, you are using a data pump that is doing nothing more than pumping trails to a remote host. That's fine, if that's what you want to do. (It's actually quite common in real implementations.) But if you want to actually capture changes from the database, then change "add extract ... extTrailSource" to be "add extract ... tranlog". I'll assume you want to use the simple data pump to send trail data to the remote host. And I will assume that some other database capture process is creating the trail dirdat/oo
Also... with your pump "FFE711", you can create either a local or remote trail, that's fine. But don't use a rmtfile (or extfile). You should create a trail, either a "rmttrail" or "exttrail". The flat-file adapter will read that (binary) trail and generate text files. Trails automatically roll over; the "extfile/rmtfile" do not (but they do have the same internal GG binary log format). (You can use 'maxfiles' to force them to roll over, but that's beside the point.)
Also, <ul>
<li> don't forget your "table" statements... or else no data will be processed!! You can wildcard tables, but not schemata.
<li> there is no reason that anything would be discarded in a pump.
<li> although a matter of choice, I don't see why people use absolute paths for reports and discard files. Full paths to data and def files make sense if they are on the SAN/NAS, but then I'd use symlinks from dirdat to the storage directory (on Unix/Linux)
<li> both windows and unix can use forward "/" slashes. Makes examples platform-independent (another reason for relative paths)
<li> your trails really should be much larger than 5MB for better performance (e.g., 100MB)
<li> you probably should use a source-defs file, instead of a dblogin for metadata. Trail data is by its very nature historical, and using "userid...password" in the prm file inherently gets metadata from "right now". The file-writer doesn't handle DDL changes automatically.
</ul>
So you should have something more like:
Vikas Panwar wrote:
extract ffe711
sourcedefs dirdef/vikstkFF.def
rmthost 10.180.182.77, mgrport 7816
rmttrail dirdat/ff, purge, megabytes 100
table myschema.*;
table myschema2.*;
table ggs.*;

For the file-writer pump:
5. On the flat file GGSCI prompt:
extract ffr711
CUSEREXIT flatfilewriter.dll CUSEREXIT passthru includeupdatebefores, params dirprm\ffwriter.properties
SOURCEDEFS dirdef/vikstkFF.def
table myschema.*;
table ggs.*;
ggsci> add extract ffr711, exttrailsource ./dirdat/pp
ggsci> start extract ffr711
Again, use relative paths when possible (the flatfilewriter.dll is expected to be found in the GG install directory). Put the ffwriter.properties file into dirprm, just as a best-practice. In this file, ffwriter.properties, is where you define your output directory and output files. Again, make sure you have a "table" statement in there for each schema in your trails.
Problem I am facing:
I am not sure where to find the generated flat file;
even the reports show there is no data at the manager process.
I was expecting a replicat instead of an extract in the flat file FFR711.prm.
This is what I have done so far; please give me some pointers.

The generated files are defined in the ffwriter.properties file. Search for the "rootdir" property, e.g.,
goldengate.flatfilewriter.writers=csvwriter
csvwriter.files.formatstring=output_%d_%010n
csvwriter.files.data.rootdir=dirout
csvwriter.files.control.ext=_data.control
csvwriter.files.control.rootdir=dirout
...

The main problems you have are: (1) use rmttrail, not rmtfile, and (2) don't forget the "table" statement, even in a pump.
Also, the flat-file adapter runs in just an "extract" data pump; no "replicat" is ever used. Replicats are inherently tied to a target database, and the file-writer doesn't have any database functionality.
Hope it helps,
-m -
File Content Conversion issue for a Sender File Adapter
Hi all,
I am working on a migration project, and my file structure will be as follows:
<HEADER>
<DATA1>
<DATA2>
<DATA3>
<HEADER>
<ITEM>
<ITEM1>
<ITEM2>
<ITEM3>
<ITEM>
where the corresponding flat file will be as follows:
10001,20081902,US
10,soda,1
30,soda,4
40,soda,5
10002,20081902,US
10,steel,1
30,steel,4
40,steel,5
How can I pick up this file using FCC in the sender flat file adapter without key fields? I can also use fixed lengths.

Sridhar,
You can use a work around like this. Create a generic Data Type something like below.
<Message Type>
<Recordset> - root node
<ROW> - element with occurence 0..unbounded
Now create an outbound MI with the above MT; in the FCC of the sender file adapter, give the Recordset structure as ROW,*
This way , you will read the entire file inside XI as an XML like below
<ROW> header record </ROW>
<ROW> item record 1 </ROW>
<ROW> item record 2 </ROW>
<ROW>....</ROW>
In XI, since your header row and item rows are fixed length, you can parse them in a UDF in your message mapping, or parse the XML using a DOM parser in a Java mapping, and you should be good. Hope this helps.
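As an illustration of the parsing logic such a UDF would need, here is a hedged Python sketch that classifies the generic ROW lines. It assumes the file starts with a header line and that header lines can be recognized by their five-character first field (10001, 10002) versus the two-character item numbers (10, 30, 40); that rule is inferred from the sample data above, not a general FCC rule.

```python
def group_records(rows):
    """Group flat rows into (header_fields, [item_fields]) pairs."""
    documents = []
    for row in rows:
        fields = row.split(",")
        if len(fields[0]) == 5:          # header line, e.g. "10001,20081902,US"
            documents.append((fields, []))
        else:                            # item line, e.g. "10,soda,1"
            documents[-1][1].append(fields)
    return documents

rows = ["10001,20081902,US", "10,soda,1", "30,soda,4",
        "10002,20081902,US", "10,steel,1"]
docs = group_records(rows)
```

In a real XI mapping the same decision would be written in Java against the <ROW> elements.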
~Saravana -
Sender file Adapter without Key field
My file structure will be as follows:
<HEADER>
<DATA1>
<DATA2>
<DATA3>
<HEADER>
<ITEM>
<ITEM1>
<ITEM2>
<ITEM3>
<ITEM>
where the corresponding flat file will be as follows:
10001,20081902,US
10,soda,1
30,soda,4
40,soda,5
10002,20081902,US
10,steel,1
30,steel,4
40,steel,5
How can I pick up this file using FCC in the sender flat file adapter without key fields? I can also use fixed lengths.

Hi,
with a multiple-node structure (like header and item in your case) you need a key field to process this file into the desired structure using FCC.
Alternatively, you can pick up all these records as a single node type and classify them as header/items in your mapping by identifying how each one differs from the others.
~SaNv... -
What are the flat files formats accepted for import and export in mdm?
How MDM recognize the document type/mapping to perform of incoming file (file name, folder) ?
What are the flat file format for export (delimited, structured, XML) ?
Does MDM handle Header in a flat file (import and export) ?
Does MDM handle several line definitions in a flat file (import and export)?

Hi Joseph,
Here are answers to your questions:
<b>Ans 1--></b> As adhappan said previously, you can import data using the Import Manager from the following formats:
Access -- from a Microsoft Access database
Excel -- from an Excel file
ODBC -- generally used to import data from a flat file. By flat file I mean a comma- or tab-separated ".csv" file or a ".txt" file.
Port -- to import data from a port. In MDM, a port actually refers to a directory.
XML -- from an XML file
XML Schema -- used when you import data from a file whose structure you have predefined in the Console using an XSD file.
<b>Ans 2--></b> we specify the file format while connecting to source by mentioning it in the <b>type</b> properties.and mapping is performed in the import manager in the map value tab.
<b>Ans 3--></b>When you import or export a flat file then data in the first line of file is considered as header.
<b>Ans 4 --></b> As i previously told ,MDM will handle header.
<b> Ans 5 --></b>MDM does not handle several line definition of header.
Hope it will help you.<b> Please remark if it really helped you</b>.
Thanks,
<b>Shiv Prashant Dixit</b> -
Hello,
I am new to GoldenGate and we are researching the mechanism of the flat file adapter in GoldenGate.
Do you know if we can apply the transactional changes from an Oracle database to another DBMS by using the flat file adapter?
I mean:
- Is the data extracted to a flat file? If yes, from where can we follow the database changes?
Imagine:
there are inserts, updates, and deletes in the source database.
- Is the transaction order of the operations written to a file? Is the data also written to a file?
- Is it possible for another application to understand the changes and apply them to another DBMS?
Could you please guide me?
Thanks

user12299628 wrote:
I am new to GoldenGate and we are researching the mechanism of the flat file adapter in GoldenGate.

Oracle GoldenGate can "capture" transactions on a source database and write them out to a binary "trail" file. This trail file can be read by the OGG "flat file adapter", which converts this internal binary format into plain text files that can be loaded into another DB, an ETL tool, etc.
See: (latest release, flat-file adapter 11.1.1) http://docs.oracle.com/cd/E18101_01/index.htm
http://www.oracle.com/us/products/middleware/data-integration/goldengate/application-adapters/overview/index.html
Do you know if we can apply the transactional changes from an Oracle database to another DBMS by using the flat file adapter?

Yes, but usually there is no reason to... You can capture transactions from one DB (e.g., Oracle or MySQL or SQL Server or DB2) and just apply them normally to another database, even a different vendor on a different OS (Oracle, MySQL, SQL Server, DB2, Teradata, etc.).
I mean:
- Is the data extracted to a flat file? If yes, from where can we follow the database changes?

Yes, data is captured and written out to flat (text) files. The deltas (inserts/updates/deletes) are in the text files. You can use GG's basic stats and reports to keep track of the number of operations/transactions captured.
Imagine:
there are inserts, updates, and deletes in the source database.
- Is the transaction order of the operations written to a file? Is the data also written to a file?

Yes, and yes.

- Is it possible for another application to understand the changes and apply them to another DBMS?

It's up to you to define the output format written to the file, using properties. For example, you can specify the delimiter, quoting mechanism, escape characters, etc.
Edited by: MikeN on Apr 2, 2012 2:06 PM -
Send Idoc flat file message in JMS receiver adapter
Hello,
I am working on a scenario where we send a DELVRY03 IDoc from ECC to an external system. To the external system, we send the whole IDoc in a flat file structure through a JMS queue. I have used the IDoc-to-flat-file program in my interface mapping and have configured the JMS receiver adapter with just the default module configuration, but I am getting an error in communication channel monitoring for the messages.
Please note that the IDoc XML to flat file structure conversion is already done in the mapping; I need to just pass this flat IDoc structure to the JMS adapter. Hence there is no content conversion in the adapter.
Please give me some input.

Here are the modules in my receiver adapter, and nothing else:
SAP XI JMS Adapter/ConvertMessageToBinary
SAP XI JMS Adapter/SendBinarytoXIJMSService
The error I get is in audit log
Message processing failed. Cause: com.sap.aii.af.ra.ms.api.RecoverableException: No transition found from state: ERROR, on event: process_commence for DFA: CC_JMS_R:ca336a6689f837da8bd3387140fc4447
in turn the message has this error if I open the message
Whitespace is not allowed at this location. Error processing resource 'http://host:port/mdt/messageconten...
and it shows one of the lines from the IDoc flat file.
Any idea is greatly appreciated. Thank you. Thanujja

The difference in what I suggest is that it is way simpler.
Maybe you did not understand this, so I will try to explain it better. It's not the best thing to do, but if the JMS adapter doesn't budge then you can give it a shot.
1. You continue to use the ABAP mapping in your interface mapping to map the IDoc XML to the flat IDoc.
2. Write a Java mapping that takes the output of the ABAP mapping and creates an XML output which would be something like:
<Root>
<Idoc_Flat>
<Data>ABAP Mapping Output</Data>
</Idoc_Flat>
</Root>
3. Now use the simple content conversion in the JMS adapter to convert this to flat file.
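To illustrate step 2, here is a hedged sketch of the wrapper logic in Python (the actual suggestion is a Java mapping inside XI; the element names simply follow the example structure above, and escaping guards against IDoc segments containing &, <, or >):

```python
from xml.sax.saxutils import escape

def wrap_flat_idoc(flat_payload: str) -> str:
    """Wrap a flat IDoc payload in a single-element XML envelope."""
    return ("<Root><Idoc_Flat><Data>"
            + escape(flat_payload)          # make the raw payload XML-safe
            + "</Data></Idoc_Flat></Root>")

xml = wrap_flat_idoc("EDI_DC40 ...\nE1EDL20 ...")
```

With the whole flat payload inside one element, the JMS adapter's content conversion only has to strip the envelope back off.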
Regards
Bhavesh -
Reading a huge flat file through the SOAP adapter
Hi everybody,
In one of our interfaces we need to read a big flat file into XI using the SOAP adapter on the sender side, and we are using a Java mapping to convert it into XML. Before that, I need to split this flat file into multiple files in the first message mapping, and in the second map we have to write a Java mapping to do the flat file conversion to XML. But I got stuck reading this big flat file into XI, as I need to declare some data type to read the entire file. Can anybody tell me how I can do this? Is it possible at all with the SOAP adapter?
Thanks
raj

Hi Vijay,
Thanks for your prompt reply. For certain reasons I am not allowed to use the file adapter; I can use only the JMS adapter or the SOAP adapter. We tried a few scenarios with JMS content conversion, but the scenario I am asking about here is complex at multiple levels, so I can't even use JMS in this case. So we are thinking of reading the whole file using the SOAP adapter, then splitting it into multiple files (as the file can be huge) using a Java mapping, and at the next level we want to use another mapping to do the content conversion. So I have to experiment to see whether this is a feasible solution. Because when you declare on the sender side
<ffdata_MT>
<Recordset>
<ROW> String type
and send the flat file using the SOAP adapter, we get the whole file as part of "ROW" as a string. But inside the Java mapping I need to see whether I can split this in XI, so that I can use the split files in the next mapping for content conversion. I hope I am clear now. I want to know whether this is a feasible solution or not.
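As a hedged sketch of the splitting step (written in Python rather than the Java mapping XI would actually use), one simple approach is to break the payload into chunks of whole lines, so no record is ever cut in half; the chunk size here is an arbitrary placeholder:

```python
def split_flat_file(payload: str, lines_per_chunk: int = 1000):
    """Split a flat payload into chunks, never breaking a line in two."""
    lines = payload.splitlines()
    return ["\n".join(lines[i:i + lines_per_chunk])
            for i in range(0, len(lines), lines_per_chunk)]

chunks = split_flat_file("l1\nl2\nl3\nl4\nl5", lines_per_chunk=2)
```

Each resulting chunk is itself a valid flat file, so the follow-on content-conversion mapping can treat it exactly like the original.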
I would really appreciate it if somebody could give me some ideas on this.
Thanks
raj -
Convert XML to flat file with File adapter
Hi all.
Trying to configure a file adapter according to the following link.
http://help.sap.com/erp2005_ehp_04/helpdata/EN/d2/bab440c97f3716e10000000a155106/frameset.htm
What i want it to do is the following.
I have the following incoming message:
<header>
<h_field1></h_field1>
<h_field2></h_field2>
</header>
<data>
<d_field1></d_field1>
<d_field2></d_field2>
<infotext>
<i_field1></i_field1>
<i_field2></i_field2>
</infotext>
</data>
I want this to become a flat file that looks as follows:
h_field1¤h_field2
d_filed1¤d_field2
i_field1¤i_filed2
Every field in each segment should be concatenated with the ¤ char as separator.
I've done the following:
Content Conversions Parameters
Record Structure: header,data,infotext
header.fieldSeparator ¤
data.fieldSeparator ¤
infotext.fieldSeparator ¤
The adapter just gets into wait mode.
Does anybody know why?
BR
Kalle

Hello Kalle,
As per your source structure, if you perform FCC on it you should have a two-level hierarchy; your RecordSet will be your root node,
and the RecordSetStructure is: header,1,data,1
Since the infotext node is inside data, you cannot refer to this node; it is not appropriate.
<header>
<h_field1/>
<h_field2/>
</header>
<data>
<d_field1/>
<d_field2/>
<infotext>
<i_field1/>
<i_field2/>
< /infotext>
</data>
Change the above structure to something like this:
<header>
<h_field1/>
<h_field2/>
</header>
<data>
<d_field1/>
<d_field2/>
</data>
<infotext>
<i_field1/>
<i_field2/>
< /infotext>
Your RecordSet will be your root node,
and the RecordSetStructure is: header,1,data,1,infotext,1
This resolves your issue.
Regards,
Prasanna -
FTP Adapter: Inbound and Outbound flat files
Having lots of trouble trying to Send messages (via Subscribe event) with my FTP adapter, with this specific error:
Thu May 12 11:39:10 MDT 2005: Bridge { agent=oracle.oai.agent.client.AgentImpl@2f48d2 application=PAGOFTPAPP partition=null active=true #d3ls=1 } cannot handle OAI message of type newFacility.
I am successfully processing my flat file messages into OAI (via Publish event) and they are being properly written to the OAI_HUB_QUEUE. The message gets all the way to the Bridge, which posts the above error. Can anyone tell me how the Bridge attempts to determine what to do with the messages?
In creating the subscribe event I specified my desired D3L file as the application view, before doing the mapping. I then tried creating mappings at both the top (struct via ObjectCopy) and granular (fields via CopyFields) levels, and always get the same error.
Under "Modify Fields" for the subscribe event I confirmed that ota.isDL3=true and ota.d3lPath points to the D3L XML file that I used to specify the application view (and incidentally is the same one used to process the files that I am processing with the Publish event). I'm trying to get up to speed with OAI and my first test is simply to read in a tab-delimited file and turn around and write it back out to the same structure (but of course to a different target directory).
Much thanks for any assistance anyone can offer! Due to some kind of administrative hiccup we seem to have lost our iTAR capability and it's taking forever to get it back (of course at the worst possible time...)
Richard

Richard,
Not sure if you have found a way around this problem yet, but it sounds like you may have a mismatch between iStudio and the D3L XML that's used to define and transform the message.
In the XML, name should be set to your iStudio defined 'event name', object should be set to the 'business object' and the type should be set to related 'App Data Type'.
As an example of a subscribed event:
Subscribe(GLCosts.maintain_costs)
using an ADT called CostRecord, it's a good idea to name your message the same as the type when defining the application view using the Subscribe Wizard, i.e. don't manually change anything.
So your message header in the D3L XML would look like this:
<message name="maintain_costs" type="CostRecord" object="GLCosts">
HTH
Nick -
Filename for flat file using J2EE FILE FTP Receiver adapter
Hi there,
I am struggling to do the following:
I have a J2EE File receiver that sends a file with a specific name to a FTP destination. I define the target filename in my graphical mapping using certain criteria and a incremental number. In my adapter I use variable substitution to select that value in the payload as the filename. Up to this point everything works fine.
The problem is that I convert the payload to a flat structure using xslt before writing it out and because of that the adapter cannot find my value as defined in the variable I use for the name.
Now, obviously if I move the xslt module after the CallAdapter module, the file won't be converted to a flat structure.
Can anyone give me advice on how I can do both the flat conversion <u>and</u> set the specific filename from the payload?
Thanks in advance,
Johan

Hi,
Instead of using variable substitution, use adapter-specific identifiers to set the file name in the mapping.
Ref:/people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
Regards,
Jai Shankar -
What is the difference between a flat file and a legacy system?
Hi everyone,
When we say we are working on a FILE-to-FILE scenario, which file format are we usually working with? And what is the difference between a flat file and a legacy system?
Thanks

Hi,
<i>When we say we are working on a FILE-to-FILE scenario, which file format are we usually working with?</i>
>>> Most of the time it will be a flat file in CSV format, tab-delimited format, or with fixed-length fields.
<i>What is the difference between a flat file and a legacy system?</i>
>>> We cannot differentiate like this.
A flat file may come from any system; it may be a live system or a legacy system.
A legacy system is something old, from the past. If you talk about SAP, then older versions of SAP can be called legacy systems.
So it may be a file system, or any system running an old version, but that does not mean it is not in use.
Regards,
Moorthy