Dynamic File Structure - Urgent
Hi,
We have a requirement to use ODI to develop a POC that will do the following.
1. Create a job to load a target from different file sources having different layouts or data structures.
2. The structure of the file needs to be fed to the job dynamically so the job can perform a dynamic transformation and load into the target.
3. An important requirement is not to change the job every time the input changes. The target will remain the same.
The challenge is how to pass the schema of the file dynamically to the job.
Our understanding is that this can be achieved in DataStage, but the POC is to showcase that Oracle Data Integrator also has this capability.
Any information, docs, or prior POCs will be of great help.
Appreciate your response!
-Basheer
I will leave the implementation choice completely to Basheer, but here I am sharing what I have thought about.
ODI Procedure -
Step 1 - Read the above file and create the CTL file for SQL*Loader
Technology - Jython
import re, os, string

# Read the metadata file that describes the incoming flat file
file_read = open('<%=odiRef.getOption("FILE_PATH")%>/Dynamic_struc.txt', 'r')
line = file_read.readline()
collect = []
delimiter = ''
# Loop through each line of the metadata file
while line:
    # Read the delimiter, e.g. from "{final_delim=end, delim=';'}"
    # (assumes a single-character quoted delimiter at the end of the line)
    if re.search("delim", line):
        delimiter = line[-5:-2]
    # Read the column names from lines such as "AGE_MIN;VARCHAR2(10),"
    if string.find(line, ';') > 0 and string.find(line, ')') > 0:
        value = line[0:string.find(line, ';')]
        collect.append(string.strip(value))
    line = file_read.readline()
column = ','.join(map(string.strip, collect))
output = 'The delimiter of this file is ' + delimiter + '; the columns are ' + column
file_read.close()
# Create a CTL file for loading into the temporary table
file_write = open('<%=odiRef.getOption("FILE_PATH")%>/temp_file.ctl', 'w')
print >> file_write, 'OPTIONS (SKIP=1,ERRORS=0,DIRECT=FALSE) LOAD DATA'
print >> file_write, 'INFILE "<%=odiRef.getOption("FILE_PATH")%>/temp_file.txt"'
print >> file_write, 'BADFILE "<%=odiRef.getOption("FILE_PATH")%>/temp_file.bad"'
print >> file_write, 'DISCARDFILE "<%=odiRef.getOption("FILE_PATH")%>/temp_file.dsc"'
print >> file_write, 'DISCARDMAX 1'
print >> file_write, 'INTO TABLE ODI_TEMP_10G."file_temp"'
print >> file_write, 'FIELDS TERMINATED BY ' + delimiter
print >> file_write, 'TRAILING NULLCOLS'
print >> file_write, '(' + column + ')'
file_write.close()
ODI option - FILE_PATH, i.e. the path of the directory where the file Dynamic_struc.txt resides.
Step 2 - Truncate the temporary table
Technology - Oracle; Schema - the respective schema where the temporary table resides
TRUNCATE TABLE <temporary table>
Step 3 - Call SQLLDR
Command on Target - Technology - Sunopsis API
Command on Source - Technology - Oracle; Schema - the respective schema where the temporary table resides (so that the user id and password for SQLLDR can be fetched from ODI itself)
OdiOSCommand "-OUT_FILE=<%=odiRef.getOption("FILE_PATH")%>/TEMP_FILE.out" "-ERR_FILE=<%=odiRef.getOption("FILE_PATH")%>/TEMP_FILE.err"
sqlldr "control='<%=odiRef.getOption("FILE_PATH")%>/TEMP_FILE.ctl'" "log='<%=odiRef.getOption("FILE_PATH")%>/TEMP_FILE.log'" userid=<%=odiRef.getInfo("SRC_USER_NAME")%>/<%=odiRef.getInfo("SRC_PASS")%>@<%=odiRef.getInfo("SRC_DSERV_NAME")%>
Sample output of the process.
Dynamic_struc.txt - the metadata file:
record
{final_delim=end, delim=';'}
AGE_MIN;VARCHAR2(10),
AGE_MAX;VARCHAR2(10),
AGE_RANGE;VARCHAR2(10),
)
Source data:
AGE_MIN;AGE_MAX;AGE_RANGE
0;19;Less than 20 years
20;29;20-29 years
30;39;30-39 years
40;49;40-49 years
50;59;50-59 years
60;110;60 years or more
CTL file generated:
OPTIONS (SKIP=1,ERRORS=0,DIRECT=FALSE ) LOAD DATA
INFILE "C:/temp_file.txt"
BADFILE "C:/temp_file.bad"
DISCARDFILE "C:/temp_file.dsc"
DISCARDMAX 1
INTO TABLE ODI_TEMP_10G."file_temp"
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(AGE_MIN,AGE_MAX,AGE_RANGE)
I have tested the process by reducing and randomly reordering the columns, and it seems to work, although not for every condition :)
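The parsing step can be checked outside ODI with plain Python. This is a sketch that re-implements the same logic against the sample metadata above, using a regex for the delimiter instead of fixed slicing (slightly more robust, same result):

```python
import re

# Sample Dynamic_struc.txt content from the post
metadata = """record
{final_delim=end, delim=';'}
AGE_MIN;VARCHAR2(10),
AGE_MAX;VARCHAR2(10),
AGE_RANGE;VARCHAR2(10),
)"""

delimiter, columns = "", []
for line in metadata.splitlines():
    # e.g. delim=';' -> capture the quoted single-character delimiter
    m = re.search(r"delim=('.')", line)
    if m:
        delimiter = m.group(1)            # keeps the quotes, as SQL*Loader expects
    # column lines look like NAME;TYPE(len),
    if ";" in line and ")" in line:
        columns.append(line.split(";")[0].strip())

print(delimiter, ",".join(columns))  # ';' AGE_MIN,AGE_MAX,AGE_RANGE
```

The same delimiter and column list feed the FIELDS TERMINATED BY and column-list lines of the generated CTL file.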
I have uploaded the procedure; here is the link where you can download it.
http://cid-d7195c8badc40d0b.office.live.com/embedicon.aspx/odiexperts.com/TRT^_LOADING^_DYNAMIC^_DATA.xml
Please let me know if you need any help. You will need to tweak lots of things according to your environment.
Hope this helps
Similar Messages
-
Dynamic file name from input payload (RFC 2 flat file)
Hi,
I have an RFC to flat file scenario. The output flat file has not an XML structure, it's just a plain text file generated with abap mapping.
In my source interface (RFC), I have a field called <FILENAME>, and I want to use the value of that field to create the target file using a dynamic file name. But if in variable substitution I use payload:ZRFC_NAME,1,FILENAME,1 it doesn't work, because the dynamic variable substitution tries to access the output payload, not the source one...
What can I do?
Hi Marshal,
You can add an extra node to your target structure, like:
FileName - Node
--FileName - Element
Map the FILENAME field of the RFC to the FileName field in your target structure, and use this field's path as the reference in variable substitution.
In the content conversion, add the names and values as below:
FileName.fieldNames -- FileName
FileName.fieldFixedLengths -- 0
FileName.fixedLengthTooShortHandling -- Cut
This way the extra field in your target structure will not be populated in your target text file.
Cheers
Veera -
Xml file in dynamic file name in file receiver adapter
Hi,
I'm doing the dynamic file name in file receiver adapter. I have done as per instructed in /people/jayakrishnan.nair/blog/2005/06/20/dynamic-file-name-using-xi-30-sp12-part--i
All turned out okay. I have got the file name I require. Except that the file format is XML and I need to suppress the filename node occupied by the dynamic file name.
The content conversion mentioned in /people/sravya.talanki2/blog/2005/08/11/solution-to-the-problem-encountered-using-variable-substitution-with-xi-sp12, does not seem to solve my problem. As it is only for file format other than the XML one, because we only do the content conversion if we want to "convert" the format of the content from XML to the other format.
Does anybody have the solution to my problem? Thanks in advance.
Thank you Raj for the direction.
The way to do it is :
1. In ID, advanced tab, put a check on the adapter-specific message attributes - file name.
2. Put a "*" on the file name scheme
3. In IR, create a UDF to set up target file name :
DynamicConfiguration conf = (DynamicConfiguration) container
    .getTransformationParameters()
    .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
// read the current file name (optional), then set the file name for the receiver adapter;
// TargetFileName is the input parameter of the UDF
String SourceFileName = conf.get(key);
conf.put(key, TargetFileName);
return " ";
4. Map the above UDF to the header level of the target structure.
Regards,
Idi -
How to give a dynamic File Name for Receiver File/FTP Adapter.
Hi Experts,
I have one scenario in which we are creating a flat file from an IDOC coming from R/3 and sending it to an FTP location. For this we have configured a receiver FTP adapter with the File Name Scheme "NT.out", and in File Construction Mode I have given "Add Time Stamp".
Therefore while creating the file it is created as NTyyyyMMdd-HHmmss-SSS.out,
whereas my requirement is only to add the time and not the date (NThhmmss.out).
How to do this ?
for your info we are using ABAP Mapping.
Please help me.
Regards,
Umesh
Hi Umesh,
Add one more field to your target structure for your file name and populate that field as per your requirement (e.g. NThhmmss.out). In the receiver communication channel use the variable substitution option, give the reference of the payload, and set the file construction mode to Create.
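Outside XI, the time-only naming pattern the extra payload field should carry can be sketched in plain Python (build_filename is a hypothetical helper, just to make the required format concrete):

```python
from datetime import datetime

def build_filename(prefix="NT", ext=".out", now=None):
    """Build a time-only file name such as NT174357.out (no date part)."""
    now = now or datetime.now()
    return prefix + now.strftime("%H%M%S") + ext

# Fixed timestamp so the result is reproducible
print(build_filename(now=datetime(2007, 5, 8, 17, 43, 57)))  # NT174357.out
```

In the XI scenario this value would be computed in the mapping and written into the extra file-name field that variable substitution reads.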
And refer to the weblogs below for the variable substitution file name scheme:
/people/jayakrishnan.nair/blog/2005/06/20/dynamic-file-name-using-xi-30-sp12-part--i - Dynamic File Name Part 1
/people/jayakrishnan.nair/blog/2005/06/28/dynamic-file-namexslt-mapping-with-java-enhancement-using-xi-30-sp12-part-ii - Dynamic File Name Part 2
Hope this solves your problem.
Cheers
Veera
>>>Reward points, if it is needful -
BPM/File Content and Dynamic File Name
Bit of a double whammy here!
I am running a BPM : R/3 Proxy > XI > BPM > File Adapter (x4)
All working to a point.....
The file I receive splits into two files, then I pass a Count file - number of records to a separate file and then a file that simply says "ready" on completion. Four files are:
Data file, Lookup (values and descripton for legacy), Count file and Ready file.
To top this off, it will be run for 12 separate countries so hope to use the country code to add to the file naming - so generating 48 files (if successful). Data_GB.dat, lookup_GB.dat, Count_GB.dat and ready_GB.dat etc....
The first three, I can do (with maybe a little help!), I have looked at the use of dynamic file naming and think I can do this, however, the count file uses the initial file sent, but the ready file passes across the word "ready" to destination file. How can I utilise the dynamic naming convention if I do not use the original file for this?
Also, the "ready" file is being created but it is empty. It is okay using the File Adapter, but when I convert it, the file is empty (yet still created!). In sxmb_moni it shows correctly, but once the file content conversion happens, it is empty!
Any thoughts?
Thanks Bhavesh.
Time Stamp Status Description
2007-05-08 17:43:57 Success Message successfully received by messaging system. Profile: XI URL: http://host.fqdn.net:55000/MessagingSystem/receive/AFW/XI Credential (User): XIISUSER
2007-05-08 17:43:57 Success Using connection File_http://sap.com/xi/XI/System. Trying to put the message into the receive queue.
2007-05-08 17:43:57 Success Message successfully put into the queue.
2007-05-08 17:43:57 Success The message was successfully retrieved from the receive queue.
2007-05-08 17:43:57 Success The message status set to DLNG.
2007-05-08 17:43:57 Success Delivering to channel: EPIW_FTP_Receiver_EmployeeReady
2007-05-08 17:43:57 Success MP: entering
2007-05-08 17:43:57 Success MP: processing local module localejbs/CallSapAdapter
2007-05-08 17:43:57 Success File adapter receiver: processing started; QoS required: ExactlyOnce
2007-05-08 17:43:57 Success File adapter receiver channel EPIW_FTP_Receiver_EmployeeReady: start processing: party " ", service "XE_DEV_3RD_EPIW_001"
2007-05-08 17:43:57 Success Connect to FTP server "ftp.ftp.ftp.ftp", directory "/ECS/Target"
2007-05-08 17:43:57 Success Write to FTP server "ftp.ftp.ftp.ftp", directory "/ECS/Target", file "xi_epiw_ready20070508-174357-635.dat"
2007-05-08 17:43:57 Success Transfer: "BIN" mode, size 156 bytes, character encoding -
2007-05-08 17:43:57 Success Start converting XML document content to plain text
2007-05-08 17:43:57 Success File processing complete
2007-05-08 17:43:57 Success MP: leaving
2007-05-08 17:43:57 Success The message was successfully delivered to the application using connection File_http://sap.com/xi/XI/System.
2007-05-08 17:43:57 Success The message status set to DLVD.
<?xml version="1.0" encoding="UTF-8" ?>
- <ns0:EPIWReadyFile xmlns:ns0="urn:com.xerox.esap.hr.epiwextract">
<recordReady>READY</recordReady>
</ns0:EPIWReadyFile>
Transfer: Binary
File Type : Text
File Content Conversion:
Record Set Structure: Detail
Detail.fieldSeparator ,
Detail.endSeparator 'nl'
It may be something simple...
Hopefully you can help : )
Thanks
Barry -
How to Create Hierarchy From Flat file Structure
Hi Gurus,
There is a scenario for me regarding the Hierarchy.
Required hierarchy structure - Region --> Director --> Manager --> Sales id
I have flat file which gives the info like user id , sales id , manager id, director id.
But the transaction data Flat file has structure with sales id, region id, sales amt, sales qty.
Note : Region id is another Master Data.
How can I create the hierarchy from the first flat file, which does not have the region info, when that info is only available in the transaction data flat file?
Is there any way we can create the hierarchy based on the first flat file structure, which contains more than 100,000 records?
Try to Suggest me in this regard .
This is urgent.
Regards,
Mano
Hi Mano,
1. Defining the source system from which to load data
Choose the source system tree: File → Create.
2. Defining the InfoSource for which you want to load data
Optional: choose InfoSource Tree → Root (InfoSources) → Create Application Components.
Choose InfoSource Tree → Your Application Component → Other Functions → Create InfoSource 3.x → Direct Update.
Choose an InfoObject from the proposal list, and specify a name and a description.
3. Assigning the source system to the InfoSource
Choose InfoSource Tree → Your Application Component → Your InfoSource → Assign Source System. The transfer structure maintenance screen appears.
The system automatically generates DataSources for the three different data types to which you can load data.
○ Attributes
○ Texts
○ Hierarchies (if the InfoObject has access to hierarchies)
The system automatically generates the transfer structure, the transfer rules, and the communication structure (for attributes and texts).
4. Maintaining the transfer structure / transfer rules
Select the DataSource for uploading hierarchies.
IDoc transfer method: The system automatically generates a proposal for the DataSource and the transfer structure. This consists of an entry for the InfoObject for which hierarchies are loaded. With this transfer method, the structure is converted to the structure of the PSA during loading, which affects performance.
PSA transfer method: The transfer methods and the communication structure are also generated here.
5. Maintaining the hierarchy
Choose Hierarchy Maintenance, and specify a technical name and a description of the hierarchy
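Separately, the missing region can be attached before the hierarchy load by joining the two flat files on sales id. A minimal Python sketch under assumed column layouts (all file contents and names below are hypothetical):

```python
import csv
import io

# Hypothetical master file: user id, sales id, manager id, director id
master = "U1,S1,M1,D1\nU2,S2,M1,D1\n"
# Hypothetical transaction file: sales id, region id, sales amt, sales qty
trans = "S1,R-EAST,100,5\nS2,R-WEST,200,7\n"

# Look up region by sales id from the transaction file
region_by_sales = {row[0]: row[1] for row in csv.reader(io.StringIO(trans))}

# Attach the region to each master record, yielding Region > Director > Manager > Sales id
hierarchy = []
for user, sales, mgr, director in csv.reader(io.StringIO(master)):
    hierarchy.append((region_by_sales.get(sales), director, mgr, sales))

print(hierarchy)
```

The joined output can then be loaded as the hierarchy DataSource, since every sales id now carries its region.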
Hope this helps
Regards
Karthik
Assign points if Helpful -
Strange problem in Dynamic File Name . XI behaving strangely
My o/p is coming like this
<?xml version="1.0" encoding="UTF-8" ?>
- <ns0:MT_PurchaseOrderChange xmlns:ns0="http://E2open.com/xi/IntercompanySCM_6.0/POChange">
- <recordset>
- <data>
<PoHeaderDomainName>broker_domain</PoHeaderDomainName>
<PoHeaderOrgName>broker_org</PoHeaderOrgName>
<PoHeaderPoState>$null</PoHeaderPoState>
<PoHeaderStateChangeDate>$null</PoHeaderStateChangeDate>
<PoHeaderPoNumber>4500000026</PoHeaderPoNumber>
<PoHeaderPoCreationDate>$null</PoHeaderPoCreationDate>
<PoHeaderLastModifiedDate>$null</PoHeaderLastModifiedDate>
<PoHeaderModelSubType>Discrete_Order</PoHeaderModelSubType>
<PoHeaderSupplierName>0000352119</PoHeaderSupplierName>
<PoHeaderSupplierDescription>$null</PoHeaderSupplierDescription>
<PoHeaderCustomerName>SSP_CUSTOMER</PoHeaderCustomerName>
<PoHeaderCustomerDescription>$null</PoHeaderCustomerDescription>
<PoHeaderCustomerMessage>$null</PoHeaderCustomerMessage>
<PoHeaderSupplierMessage>$null</PoHeaderSupplierMessage>
<PoHeaderBillToName>$null</PoHeaderBillToName>
<PoHeaderBillToAddressDescriptor>$null</PoHeaderBillToAddressDescriptor>
<PoHeaderBillToAddressAddress1>$null</PoHeaderBillToAddressAddress1>
<PoHeaderBillToAddressAddress2>$null</PoHeaderBillToAddressAddress2>
<PoHeaderBillToAddressCity>$null</PoHeaderBillToAddressCity>
<PoHeaderBillToAddressCountry>$null</PoHeaderBillToAddressCountry>
<PoHeaderBillToAddressCounty>$null</PoHeaderBillToAddressCounty>
<PoHeaderBillToAddressState>$null</PoHeaderBillToAddressState>
<PoHeaderBillToAddressZip>$null</PoHeaderBillToAddressZip>
<PoHeaderBuyerCode>$null</PoHeaderBuyerCode>
<PoHeaderFreight>$null</PoHeaderFreight>
<PoHeaderTerms>0001</PoHeaderTerms>
<PoHeaderOrderPriority>$null</PoHeaderOrderPriority>
<PoHeaderCommunicationMode>$null</PoHeaderCommunicationMode>
<PoHeaderAgreementStartDate>$null</PoHeaderAgreementStartDate>
<PoHeaderAgreementEndDate>$null</PoHeaderAgreementEndDate>
<UDFPoHeaderRevisionNumber>$null</UDFPoHeaderRevisionNumber>
<UDFPoHeaderERPPOCreationDate>01Sep2006000000</UDFPoHeaderERPPOCreationDate>
<UDFPoHeaderheaderUDF1>$null</UDFPoHeaderheaderUDF1>
<UDFPoHeaderheaderUDF2>$null</UDFPoHeaderheaderUDF2>
<UDFPoHeaderheaderUDF3>$null</UDFPoHeaderheaderUDF3>
<UDFPoHeaderheaderUDF4>$null</UDFPoHeaderheaderUDF4>
<UDFPoHeaderheaderUDF5>$null</UDFPoHeaderheaderUDF5>
<PoLineItemPoLineItemId>00020</PoLineItemPoLineItemId>
<PoLineItemCustomerItemName>E107434516</PoLineItemCustomerItemName>
<PoLineItemCustomerItemDesc>L1357 TOROID INDUCTOR</PoLineItemCustomerItemDesc>
<PoLineItemCustomerDomainName>SSP_CUSTOMER_domain</PoLineItemCustomerDomainName>
<PoLineItemSupplierItemName>E107434516</PoLineItemSupplierItemName>
<PoLineItemSupplierItemDesc>L1357 TOROID INDUCTOR</PoLineItemSupplierItemDesc>
<PoLineItemSupplierDomainName>0000352119_domain</PoLineItemSupplierDomainName>
<PoLineItemUnitPrice>720</PoLineItemUnitPrice>
<PoLineItemBasisOfUnitPrice>$null</PoLineItemBasisOfUnitPrice>
<PoLineItemCurrency>EUR</PoLineItemCurrency>
<PoLineItemUnitOfMeasure>TNE</PoLineItemUnitOfMeasure>
<PoLineItemLineItemState>$null</PoLineItemLineItemState>
<PoLineItemStateChangeDate>$null</PoLineItemStateChangeDate>
<PoLineItemLastModifiedDate>$null</PoLineItemLastModifiedDate>
<UDFPoLineSupplierItemName>E107434516</UDFPoLineSupplierItemName>
<UDFPoLinelineUDF1>$null</UDFPoLinelineUDF1>
<UDFPoLinelineUDF2>$null</UDFPoLinelineUDF2>
<UDFPoLinelineUDF3>$null</UDFPoLinelineUDF3>
<UDFPoLinelineUDF4>$null</UDFPoLinelineUDF4>
<UDFPoLinelineUDF5>$null</UDFPoLinelineUDF5>
<PoScheduleId>1</PoScheduleId>
<PoScheduleLastAction>Insert_Or_Modify</PoScheduleLastAction>
<PoScheduleScheduleState>$null</PoScheduleScheduleState>
<PoScheduleStateChangeDate>$null</PoScheduleStateChangeDate>
<PoScheduleRequestQuantity>9.000</PoScheduleRequestQuantity>
<PoScheduleRequestDate>16Sep2006000000</PoScheduleRequestDate>
<PoScheduleRequestShipmentDate>16Sep2006000000</PoScheduleRequestShipmentDate>
<PoScheduleOriginalRequestQuantity>$null</PoScheduleOriginalRequestQuantity>
<PoScheduleOriginalRequestDate>$null</PoScheduleOriginalRequestDate>
<PoScheduleCarrier>$null</PoScheduleCarrier>
<PoScheduleCarrierMode>$null</PoScheduleCarrierMode>
<PoScheduleCarrierAccountNumber>$null</PoScheduleCarrierAccountNumber>
<PoScheduleCustomerSiteName>5302</PoScheduleCustomerSiteName>
<PoScheduleShipToAddressDescriptor>$null</PoScheduleShipToAddressDescriptor>
<PoScheduleShipToAddressAddress1>$null</PoScheduleShipToAddressAddress1>
<PoScheduleShipToAddressAddress2>$null</PoScheduleShipToAddressAddress2>
<PoScheduleShipToAddressCity>$null</PoScheduleShipToAddressCity>
<PoScheduleShipToAddressCountry>$null</PoScheduleShipToAddressCountry>
<PoScheduleShipToAddressCounty>$null</PoScheduleShipToAddressCounty>
<PoScheduleShipToAddressState>$null</PoScheduleShipToAddressState>
<PoScheduleShipToAddressZip>$null</PoScheduleShipToAddressZip>
<PoScheduleLastModifiedDate>$null</PoScheduleLastModifiedDate>
<PoScheduleCustomerMessage>$null</PoScheduleCustomerMessage>
<PoScheduleSupplierMessage>$null</PoScheduleSupplierMessage>
<PoScheduleRefdPoCustomerName>$null</PoScheduleRefdPoCustomerName>
<PoScheduleRefdPoS>$null</PoScheduleRefdPoS>
<PoScheduleRefdPoModelSubType>$null</PoScheduleRefdPoModelSubType>
<PoScheduleRefdPoNumber>$null</PoScheduleRefdPoNumber>
<PoScheduleRefdPoLineItemId>$null</PoScheduleRefdPoLineItemId>
<PoScheduleRefdPoScheduleId>$null</PoScheduleRefdPoScheduleId>
<UDFPoSchedulescheduleUDF1>$null</UDFPoSchedulescheduleUDF1>
<UDFPoSchedulescheduleUDF2>$null</UDFPoSchedulescheduleUDF2>
<UDFPoSchedulescheduleUDF3>$null</UDFPoSchedulescheduleUDF3>
<UDFPoSchedulescheduleUDF4>$null</UDFPoSchedulescheduleUDF4>
<UDFPoSchedulescheduleUDF5>$null</UDFPoSchedulescheduleUDF5>
<PoPromiseScheduleId>1</PoPromiseScheduleId>
<PoPromiseScheduleAddressDescriptor>$null</PoPromiseScheduleAddressDescriptor>
<PoPromiseScheduleAddress1>$null</PoPromiseScheduleAddress1>
<PoPromiseScheduleAddress2>$null</PoPromiseScheduleAddress2>
<PoPromiseScheduleAddressCity>$null</PoPromiseScheduleAddressCity>
<PoPromiseScheduleAddressCountry>$null</PoPromiseScheduleAddressCountry>
<PoPromiseScheduleAddressCounty>$null</PoPromiseScheduleAddressCounty>
<PoPromiseScheduleAddressState>$null</PoPromiseScheduleAddressState>
<PoPromiseScheduleAddressZip>$null</PoPromiseScheduleAddressZip>
<PoPromiseScheduleQuantity>9.000</PoPromiseScheduleQuantity>
<PoPromiseScheduleDate>16Sep2006000000</PoPromiseScheduleDate>
<PoPromiseScheduleShipmentDate>$null</PoPromiseScheduleShipmentDate>
<UDFPoPromiseScheduleUDF1>$null</UDFPoPromiseScheduleUDF1>
<UDFPoPromiseScheduleUDF2>$null</UDFPoPromiseScheduleUDF2>
<UDFPoPromiseScheduleUDF3>$null</UDFPoPromiseScheduleUDF3>
<UDFPoPromiseScheduleUDF4>$null</UDFPoPromiseScheduleUDF4>
<UDFPoPromiseScheduleUDF5>$null</UDFPoPromiseScheduleUDF5>
</data>
- <FileNameNode>
<FileName>111111111_222222223_purchase-orders_20060901065147_</FileName>
</FileNameNode>
</recordset>
</ns0:MT_PurchaseOrderChange>
I have specifed
Payload:MT_PurchaseOrderChange,1,FileNameNode,1,FileName,1
in variable substitution
I am on SP 14 and dynamic file names are working for another scenario,
but in this scenario I am getting the error:
Receiver Adapter v2405 for Party '', Service 'com_E2open_qas':
Configured at 2006-09-01 18:42:32 GMT+05:30
History:
- 2006-09-01 19:11:48 GMT+05:30: Message processing failed: Error during variable substitution: com.sap.aii.adapter.file.varsubst.VariableDataSourceException: The following variable was not found in the message payload: amit
- 2006-09-01 19:11:48 GMT+05:30: Processing started
- 2006-09-01 19:06:47 GMT+05:30: Message processing failed: Error during variable substitution: com.sap.aii.adapter.file.varsubst.VariableDataSourceException: The following variable was not found in the message payload: amit
- 2006-09-01 19:06:47 GMT+05:30: Processing started
- 2006-09-01 19:01:47 GMT+05:30: Message processing failed: Error during variable substitution: com.sap.aii.adapter.file.varsubst.VariableDataSourceException: The following variable was not found in the message payload: amit
amit is the variable I have given for substitution.
Hi,
if you have SP14, stop using variable substitution :)
You can do it much more easily with dynamic configuration:
/people/william.li/blog/2006/04/18/dynamic-configuration-of-some-communication-channel-parameters-using-message-mapping
so you can easily set it for file adapter to set the file name
you can also have a look at my weblog:
/people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
Regards,
michal
<a href="/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions"><b>XI / PI FAQ - Frequently Asked Questions</b></a> -
Using Adobe Bridge file structure with iPhoto (latest version)
I use Adobe Bridge and have all my pics in named folders in Pictures/PICS/Folder Names. Inside the PICS folder is the iPhoto Library (only a few pics in it). Is there any way I can use the file structure I have set up with Bridge and iPhoto (latest) simultaneously? I really don't want to import (copy) all my pics into iPhoto because I am pretty sure I will end up with two versions of each. I haven't been able to manage pics manually the way I like to in older versions of iPhoto.
Here's some info to help you setup Photoshop for use with iPhoto:
Using Photoshop or Photoshop Elements as Your Editor of Choice in iPhoto.
1 - select Photoshop or Photoshop Elements as your editor of choice in iPhoto's General Preferences section under the "Edit photo:" menu.
2 - double click on the thumbnail in iPhoto to open it in Photoshop. When you're finished editing, click on the Save button. If you immediately get the JPEG Options window, make your selection (Baseline standard seems to be the most compatible JPEG format) and click on the OK button. You're done.
3 - however, if you get the navigation window
that indicates that PS wants to save it as a PS formatted file. You'll need to either select JPEG from the menu and save (top image) or click on the desktop in the Navigation window (bottom image) and save it to the desktop for importing as a new photo.
This method will let iPhoto know that the photo has been edited and will update the thumbnail file to reflect the edit.
NOTE: With Photoshop Elements the Saving File preferences should be configured as shown:
I also suggest that Maximize PSD File Compatibility be set to Always. In PSE's General preference pane, set the Color Picker to Apple as shown:
Note 1: screenshots are from PSE 10
Note: to switch between iPhoto and PS or PSE as the editor of choice Control (right)-click on the thumbnail and select either Edit in iPhoto or Edit in External Editor from the contextual menu. If you use iPhoto to edit more than PSE re-select iPhoto in the iPhoto General preference pane. Then iPhoto will be the default editor and you can use the contextual menu to select PSE for your editor when desired.
OT -
Welcome to the Novell Dynamic File Services Forum
Welcome to the new Novell Dynamic File Services forum. You can find information
about this product here: http://www.novell.com/products/dynamic-file-services/
This forum is for discussions about this product.
Hi,
I'd like to use the HTTP dynamic streaming functionality in my own video player. So far my investigations have led me to:
1) Try to see how OSMF has implemented the classes (although I read the code, I can't get any of it to compile, so I have abandoned this thread)
2) Looked at earlier tutorials highlighting the use of the RTMP dynamic classes (http://www.adobe.com/devnet/flashmediaserver/articles/dynstream_advanced_pt1.html); although this alludes to the newer HTTP features, there is no example, and what there is uses Flash as an environment
3) Browsed the forums.
We use Flex to build our player. Is there a "from scratch" example of how to code an HTTP dynamic player?
Thanks, -
How to join 5 different tables using SQL to make it to a flat file structure
I am trying to load five different tables into one flat file structure table without a cartesian product.
I have five different tables Jobplan, Jobtask(JT), Joblabor(JL), Jobmaterial(JM) and Jpsequence(JS) and the target table as has all the five tables as one table.
The data i have here is something like this.
jobplan = 1record
jobtask = 5 records
joblabor = 2 records
jobmaterial = 1 record
jpsequence = 3 records
The output has to be like this.
JPNUM DESCRIPTION LOCATION JT_JPNUM JT_TASK JL_JPNUM JL_labor JM_JPNUM JM_MATERIAL JS_JPNUM JS_SEQUENCE
1001 Test Jobplan USA NULL NULL NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 10 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 20 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 30 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 40 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 50 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA NULL NULL 1001 Sam NULL NULL NULL NULL
1001 Test Jobplan USA NULL NULL 1001 Mike NULL NULL NULL NULL
1001 Test Jobplan USA NULL NULL NULL NULL 1001 Hammer NULL NULL
1001 Test Jobplan USA NULL NULL NULL NULL NULL NULL 1001 1
1001 Test Jobplan USA NULL NULL NULL NULL NULL NULL 1001 2
1001 Test Jobplan USA NULL NULL NULL NULL NULL NULL 1001 3
Please help me out with this issue.
Thanks,
Siva
Edited by: 931144 on Apr 30, 2012 11:35 AM
Hope the below helps you:
CREATE TABLE JOBPLAN
( JPNUM NUMBER,
  DESCRIPTION VARCHAR2(100)
);
INSERT INTO JOBPLAN VALUES(1001,'Test Jobplan');
CREATE TABLE JOBTASK
( LOCATION VARCHAR2(10),
  JT_JPNUM NUMBER,
  JT_TASK NUMBER
);
INSERT INTO JOBTASK VALUES('USA',1001,10);
INSERT INTO JOBTASK VALUES('USA',1001,20);
INSERT INTO JOBTASK VALUES('USA',1001,30);
INSERT INTO JOBTASK VALUES('USA',1001,40);
INSERT INTO JOBTASK VALUES('USA',1001,50);
CREATE TABLE JOBLABOR
( JL_JPNUM NUMBER,
  JL_LABOR VARCHAR2(10)
);
INSERT INTO JOBLABOR VALUES(1001,'Sam');
INSERT INTO JOBLABOR VALUES(1001,'Mike');
CREATE TABLE JOBMATERIAL
( JM_JPNUM NUMBER,
  JM_MATERIAL VARCHAR2(10)
);
INSERT INTO JOBMATERIAL VALUES(1001,'Hammer');
CREATE TABLE JOBSEQUENCE
( JS_JPNUM NUMBER,
  JS_SEQUENCE NUMBER
);
INSERT INTO JOBSEQUENCE VALUES(1001,1);
INSERT INTO JOBSEQUENCE VALUES(1001,2);
INSERT INTO JOBSEQUENCE VALUES(1001,3);
-- JOBPLAN has a single row here, so the comma joins do not multiply rows;
-- with several job plans, add join conditions such as WHERE JP.JPNUM = JT.JT_JPNUM.
SELECT JP.JPNUM AS JPNUM ,
JP.DESCRIPTION AS DESCRIPTION ,
NULL AS LOCATION ,
NULL AS JT_JPNUM ,
NULL AS JT_TASK ,
NULL AS JL_JPNUM ,
NULL AS JL_labor ,
NULL AS JM_JPNUM ,
NULL AS JM_MATERIAL ,
NULL AS JS_JPNUM ,
NULL AS JS_SEQUENCE
FROM JOBPLAN JP
UNION ALL
SELECT JP.JPNUM AS JPNUM ,
JP.DESCRIPTION AS DESCRIPTION ,
JT.LOCATION AS LOCATION ,
JT.JT_JPNUM AS JT_JPNUM ,
JT.JT_TASK AS JT_TASK ,
NULL AS JL_JPNUM ,
NULL AS JL_labor ,
NULL AS JM_JPNUM ,
NULL AS JM_MATERIAL ,
NULL AS JS_JPNUM ,
NULL AS JS_SEQUENCE
FROM JOBPLAN JP, JOBTASK JT
UNION ALL
SELECT JP.JPNUM AS JPNUM ,
JP.DESCRIPTION AS DESCRIPTION ,
NULL AS LOCATION ,
NULL AS JT_JPNUM ,
NULL AS JT_TASK ,
JL.JL_JPNUM AS JL_JPNUM ,
JL.JL_labor AS JL_labor ,
NULL AS JM_JPNUM ,
NULL AS JM_MATERIAL ,
NULL AS JS_JPNUM ,
NULL AS JS_SEQUENCE
FROM JOBPLAN JP, JOBLABOR JL
UNION ALL
SELECT JP.JPNUM AS JPNUM ,
JP.DESCRIPTION AS DESCRIPTION ,
NULL AS LOCATION ,
NULL AS JT_JPNUM ,
NULL AS JT_TASK ,
NULL AS JL_JPNUM ,
NULL AS JL_labor ,
JM.JM_JPNUM AS JM_JPNUM ,
JM.JM_MATERIAL AS JM_MATERIAL ,
NULL AS JS_JPNUM ,
NULL AS JS_SEQUENCE
FROM JOBPLAN JP, JOBMATERIAL JM
UNION ALL
SELECT JP.JPNUM AS JPNUM ,
JP.DESCRIPTION AS DESCRIPTION ,
NULL AS LOCATION ,
NULL AS JT_JPNUM ,
NULL AS JT_TASK ,
NULL AS JL_JPNUM ,
NULL AS JL_labor ,
NULL AS JM_JPNUM ,
NULL AS JM_MATERIAL ,
JS.JS_JPNUM AS JS_JPNUM ,
JS.JS_SEQUENCE AS JS_SEQUENCE
FROM JOBPLAN JP, JOBSEQUENCE JS;
JPNUM DESCRIPTION LOCATION JT_JPNUM JT_TASK JL_JPNUM JL_LABOR JM_JPNUM JM_MATERIA JS_JPNUM JS_SEQUENCE
1001 Test Jobplan NULL NULL NULL NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 10 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 20 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 30 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 40 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan USA 1001 50 NULL NULL NULL NULL NULL NULL
1001 Test Jobplan NULL NULL NULL 1001 Sam NULL NULL NULL NULL
1001 Test Jobplan NULL NULL NULL 1001 Mike NULL NULL NULL NULL
1001 Test Jobplan NULL NULL NULL NULL NULL 1001 Hammer NULL NULL
1001 Test Jobplan NULL NULL NULL NULL NULL NULL NULL 1001 1
1001 Test Jobplan NULL NULL NULL NULL NULL NULL NULL 1001 2
1001 Test Jobplan NULL NULL NULL NULL NULL NULL NULL 1001 3
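The same flattening can be reproduced end to end with SQLite (a sketch: Oracle NUMBER/VARCHAR2 types are mapped to INTEGER/TEXT, and explicit join conditions are used so the query stays correct even when JOBPLAN holds several plans):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Rebuild the thread's tables and sample data with SQLite types
con.executescript("""
CREATE TABLE JOBPLAN (JPNUM INTEGER, DESCRIPTION TEXT);
INSERT INTO JOBPLAN VALUES (1001, 'Test Jobplan');
CREATE TABLE JOBTASK (LOCATION TEXT, JT_JPNUM INTEGER, JT_TASK INTEGER);
INSERT INTO JOBTASK VALUES ('USA',1001,10),('USA',1001,20),('USA',1001,30),
                           ('USA',1001,40),('USA',1001,50);
CREATE TABLE JOBLABOR (JL_JPNUM INTEGER, JL_LABOR TEXT);
INSERT INTO JOBLABOR VALUES (1001,'Sam'),(1001,'Mike');
CREATE TABLE JOBMATERIAL (JM_JPNUM INTEGER, JM_MATERIAL TEXT);
INSERT INTO JOBMATERIAL VALUES (1001,'Hammer');
CREATE TABLE JOBSEQUENCE (JS_JPNUM INTEGER, JS_SEQUENCE INTEGER);
INSERT INTO JOBSEQUENCE VALUES (1001,1),(1001,2),(1001,3);
""")

# One UNION ALL branch per child table, padding the other branches with NULLs
rows = con.execute("""
SELECT JPNUM, DESCRIPTION, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL
  FROM JOBPLAN
UNION ALL
SELECT JP.JPNUM, JP.DESCRIPTION, JT.LOCATION, JT.JT_JPNUM, JT.JT_TASK,
       NULL, NULL, NULL, NULL, NULL, NULL
  FROM JOBPLAN JP JOIN JOBTASK JT ON JP.JPNUM = JT.JT_JPNUM
UNION ALL
SELECT JP.JPNUM, JP.DESCRIPTION, NULL, NULL, NULL, JL.JL_JPNUM, JL.JL_LABOR,
       NULL, NULL, NULL, NULL
  FROM JOBPLAN JP JOIN JOBLABOR JL ON JP.JPNUM = JL.JL_JPNUM
UNION ALL
SELECT JP.JPNUM, JP.DESCRIPTION, NULL, NULL, NULL, NULL, NULL,
       JM.JM_JPNUM, JM.JM_MATERIAL, NULL, NULL
  FROM JOBPLAN JP JOIN JOBMATERIAL JM ON JP.JPNUM = JM.JM_JPNUM
UNION ALL
SELECT JP.JPNUM, JP.DESCRIPTION, NULL, NULL, NULL, NULL, NULL, NULL, NULL,
       JS.JS_JPNUM, JS.JS_SEQUENCE
  FROM JOBPLAN JP JOIN JOBSEQUENCE JS ON JP.JPNUM = JS.JS_JPNUM
""").fetchall()

print(len(rows))  # 1 + 5 + 2 + 1 + 3 = 12 flattened rows
```

Each branch contributes one row per child record, which is exactly the "one table, NULL-padded" shape the thread asks for, with no cartesian product.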
Moving a Folder from the iTunes Folder to My Own File Structure
I'm trying to move a folder from the default iTunes folder to a folder on another drive where I am managing the file structure. This allows me to file music as I would like to on the disk an makes things much easier to find when I need to.
Recently I've noticed that when I do this I lose some of the song metadata, such as the image and rating, which is kind of annoying. Is there any way to do this and not lose metadata?
If the external drive is in NTFS format (which is how most new drives are shipped), the Mac can't write to it. Reformat it with Disk Utility, putting it into Mac OS Extended format.
-
Imported par file to NWDS, jar file missing. Where find par file structure?
Hi All,
We want to implement an idle session timeout on the portal. We found a wiki here on the subject. In order to implement the solution we need to make adjustments to com.sap.portal.navigation.masthead.par.bak. When we import this par file into NetWeaver Developer Studio, the libraries and jar files are missing. So obviously it then errors in the portal, as files are missing.
So I could add them manually, but I don't know what the file structure of this par file is, so I don't know where to add what. Does anyone know?
Also, when I export the file after manually adding the missing files, will it export all the files, or will it behave the same way as the import and leave out the additional files?
If anyone could help, I'd really appreciate it,
Kind regards,
Liz.Hi Bala,
I thought of that just before you responded and it worked!! Thanks for the response..
Regards,
Liz. -
Regarding creation of dynamic deep structure
Hi all,
I have to construct the dynamic deep structure.
for example,
With the help of information about address and t_table, I have to create the deep structure at runtime.
In one case I need to create the deep structure with address alone, and in another case I need to create the deep structure with address and t_table.
DATA: BEGIN OF address,
        address TYPE string,
      END OF address.
DATA: BEGIN OF t_tab,
        name(10) TYPE c,
        age      TYPE i,
        address  LIKE address,
      END OF t_tab,
      t_table LIKE TABLE OF t_tab.
* Case 1: deep structure with address and t_table
DATA: BEGIN OF wa_vig,
        details1 LIKE address,
        details2 LIKE t_table,
      END OF wa_vig.
* Case 2: deep structure with address alone
DATA: BEGIN OF wa_vig,
        details1 LIKE address,
      END OF wa_vig.
Please guide me on how I should achieve this with the help of dynamic declaration.
regards
Vignesh
Hi All,
I will be happy if I can achieve this using the RTTS concept.
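For anyone landing here later, the RTTS approach can be sketched roughly as below. This is a minimal sketch, not a tested solution: the component names (ADDRESS, DETAILS1, DETAILS2) mirror the declarations in the question, and all variable names are illustrative.

```abap
* Build the line type { address TYPE string } at runtime with RTTS,
* then compose a deep structure from it. Add the DETAILS2 table
* component only in the case where t_table is needed.
DATA: lt_comp     TYPE cl_abap_structdescr=>component_table,
      ls_comp     TYPE abap_componentdescr,
      lo_struct   TYPE REF TO cl_abap_structdescr,
      lo_table    TYPE REF TO cl_abap_tabledescr,
      lo_deep     TYPE REF TO cl_abap_structdescr,
      lt_deepcomp TYPE cl_abap_structdescr=>component_table,
      lr_data     TYPE REF TO data.

* Inner structure: one component ADDRESS of type STRING
ls_comp-name  = 'ADDRESS'.
ls_comp-type ?= cl_abap_typedescr=>describe_by_name( 'STRING' ).
APPEND ls_comp TO lt_comp.
lo_struct = cl_abap_structdescr=>create( lt_comp ).

* DETAILS1: the address structure (present in both cases)
ls_comp-name = 'DETAILS1'.
ls_comp-type = lo_struct.
APPEND ls_comp TO lt_deepcomp.

* DETAILS2: a table type, appended only for the second case
lo_table = cl_abap_tabledescr=>create( p_line_type = lo_struct ).
ls_comp-name = 'DETAILS2'.
ls_comp-type = lo_table.
APPEND ls_comp TO lt_deepcomp.

* Create the deep structure type and a data object from it
lo_deep = cl_abap_structdescr=>create( lt_deepcomp ).
CREATE DATA lr_data TYPE HANDLE lo_deep.
```

The same pattern extends to the full t_tab line type (name, age, address): build its component table the same way before calling cl_abap_tabledescr=>create.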
Regards
Vignesh -
Need help understanding Time Capsule file structure and how to get it back
I have the original Time Capsule on which I backup both my Mac Pro and my wife’s Macbook. When Snow Leopard came out, I successfully used the ‘Restore from Time Machine’ feature on my Mac Pro so I know it has worked. However, my wife’s MacBook hard drive died the other day and I was trying to do the ‘Restore from Time Machine’ and all it would find was a backup from April (when I put a new larger drive in her machine). Time Machine would not find any backup files newer than April. She stated that she had seen the Time Machine backup notices regularly and as recently as the day the hard drive died (Nov. 23) so I figured that I should have no problem. Here is what I have found in my troubleshooting and what leads to my questions below.
This is the file structure I found: (note that our ID’s are ‘Denise’ and ‘John’)
*Time Capsule* (the drive as listed in my Finder window sidebar under ‘shared’)
>Folder called ‘Time Capsule’ (when logged in as either ‘Denise’ or ‘John’)
>>Denise Sparsebundle
>>>Backup of Denise’s iBook (mounted image)
>>>>Folder called ‘Backups.backupdb’
>>>>>Folder called ‘Denise’s iBook
>>>>>>Single folder with old April backup (not the right files)
>>John Sparsebundle
>>>Backup of John’s Mac Pro (mounted image)
>>>>Folder called ‘Backups.backupdb’
>>>>>Folder called ‘John’s Mac Pro’
>>>>>>Folders containing all my backup files
>Folder Called ‘Denise’ (if logged as ‘Denise’)
>>Denise’s Sparsebundle (a disk image)
>>>Backup of Denise iBook (the mounted image. Name from old machine)
>>>>Backups.Backupdb
>>>>>Denise’s iBook (Contains the backup I need)
>Folder Called ‘John’ (if logged in as ‘John’)
>> (empty)
For some reason, my wife’s backup files are stored within a folder located at the top level of the Time Capsule drive called ‘Denise’, however, mine are within a folder called ‘Time Capsule’ which is at the same level as the ‘Denise’ folder.
For some reason, when trying to use Time Machine to recover, it bypasses the top level ‘Denise’ folder which contains the correct files and goes into the ‘Time Capsule’ folder and finds the outdated backup files.
I would assume that both my backup files and my wife’s should be at the same level of the Time Capsule.
I was eventually able to use Migration Assistant to recover the files after installing a fresh OS and mounting the correct Sparsebundle image.
So, my question, how do I get this fixed so that if I have to recover a drive with Time Capsule, it will find the correct files.
Sorry for the long post and thanks in advance for any help.
John Ormsby wrote:
What I was trying to determine is why different backups for one particular machine are located at different file structure levels on my Time Capsule, and why the most recent one for that machine is at a different level than for my other machine. Also, what is the correct level that both machines' backups should be at?
well John, first can you clarify if you are on 10.4.8 as your profile suggests ? if so, i'm wondering how you can use TM at all because TM was introduced with Leo. if you're not on 10.4.8, please update your profile. if you could, please also indicate details of your machine such as available RAM, processor, etc.
second, what OS is your wife's machine running ?
third, i frankly don't know too much about TM's file structure or if there indeed is such a thing as the correct one. however, i do know that it is best to leave TM to do its thing (as long as it's working). FWIW, though off-topic, you may want to have a look at this read for further information.
last but not least, i see TM backups to my TC only as a part of my backup strategy. the backbone of my strategy is to create bootable clone of my startup disk(s) using e.g. Carbon Copy Cloner on a regular basis. while TM is capable of doing a full system restore, i don't trust it as much as a working clone. i use TM to retrieve a file i accidentally trashed a week ago but you cannot boot from a TM backup if your startup disk goes belly up.
If I have missed this information in either the FAQs or the Troubleshooting article, I would greatly appreciate being pointed to it .
i expect you didn't miss anything and i'm sorry you didn't find any help there. perhaps if Pondini (author of the FAQ and troubleshooting user tips) reads this thread, you will get the necessary advice on the file structure and more besides.
good luck to you ! -
AVCHD File Structure From MAC to PC
Hello,
I know this is sort of off topic but I need the expertise of this forum....
I am the videographer (mac user) for a speaking group and have to film speeches 5-7 minutes in length each week. The problem is how to get the videos to each person without processing on my Mac and uploading to youtube so they can see them.
I am using a Canon HF10 and wonder if I could pop out the memory card from the camera, put it in a card reader attached to a person's PC, and transfer the files that way. I know that I would copy the entire file structure to their computer. I assume the PC person would have to have something that could work with the AVCHD files, but I don't know what is on a PC computer these days. The other dilemma is that the speakers HAVE ABSOLUTELY NO COMPUTER skills.
Anyone have any ideas how to get the video to a viewable state without going through FCE.
thanks,
Al
I agree with Martin and might add that I was in a similar project late last year with several folks who certainly weren't as computer savvy as myself, and YouTube turned out to be the solution. That way I didn't have to deal with the perennial "I couldn't open it" comments you WILL receive. Everyone was able to view YouTube.
You might try uploading to Vimeo - I think you can actually upload .MTS files directly to their site, if you want to just post your raw footage. You could just get the files in the Finder directly from the file structure on the HF10, copy them to your Mac and upload from there. (I haven't tried this, but Vimeo's site says they accept .MTS files, which is what the HF10 videos will be, in the "STREAM" subfolder)
The downside is that their free account maxes out at 500MB of uploads per week. That may be enough for you but I don't know. If you pay, the limit is quite a bit higher.