Skipping fields during content conversion

Is it possible to skip fields during content conversion in the file sender adapter?
The case is, the CSV file contains 300+ fields and I only need 10. Therefore I have created a data type with those 10 fields.
I was hoping to skip the fields I do not need during content conversion.
If this is not possible, I see no option other than creating a data type with 300+ fields by hand, which will take all day!
Hopefully there is a solution, since I could not find what I needed on SAP Help or SDN.
Thanks in advance for your time!

Hi M,
Is it possible to skip fields during content conversion in the file sender adapter?
No, this is not possible during FCC.
The case is, the CSV file contains 300+ fields and I only need 10. Therefore I have created a data type with those 10 fields.
Hopefully there is a solution, since I could not find what I needed on SAP Help or SDN.
Well, you can give it a shot. There are freeware tools available on the internet that can transform a CSV file into an XSD. You can then import that XSD structure into the ESR.
The only requirement is that your source CSV file must have a header row (with all the field names in it).
I have done this with another integration tool, so I have no idea how it will work with PI, but it's worth a try.
If that does not work out, you have no option but to create the complete DT manually or to use an adapter module (not recommended for such cases).
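The CSV-header-to-XSD idea can be sketched in a few lines. This is a Python illustration only, under the assumption that the first line of the file holds all the field names; the function name, element names and semicolon delimiter are invented for the example, and a real ESR import may need adjustments:

```python
import csv
import io

def csv_header_to_xsd(csv_text, delimiter=";", root="Recordset", row="Row"):
    """Build a minimal XSD from the header line of a CSV: one optional
    xsd:string element per column, wrapped in a repeating row element.
    Column names are assumed to be valid XML element names."""
    header = next(csv.reader(io.StringIO(csv_text), delimiter=delimiter))
    fields = "\n".join(
        '              <xsd:element name="%s" type="xsd:string" minOccurs="0"/>'
        % name.strip() for name in header
    )
    return """<?xml version="1.0" encoding="UTF-8"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="%s">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="%s" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:sequence>
%s
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>""" % (root, row, fields)

print(csv_header_to_xsd("KUNNR;NAME1;ORT01"))
```

With 300+ columns this saves the manual data-type work: import the generated XSD as an external definition and map only the 10 fields you need.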
Regards,
Neetesh

Similar Messages

  • #550 4.4.7 QUEUE.Expired; message expired ## With LastError "A storage transient failure has occurred during content conversion." In submission Queue. (Exchange 2013)

    Greetings,
    We seem to be having a problem with some users who are attempting to send e-mails from within the organisation to an external domain. Not all users are affected, and not all outgoing e-mails have this issue.
    Some e-mails get stuck in the submission queue. This is the error message in Last Error : "A storage transient failure has occurred during content conversion."
    Days later, the internal user who sent the message gets a #550 4.4.7 QUEUE.Expired; message expired ## NDR.
    We did have some initial configuration issues, but these were fixed more than a week ago :
    - The external FQDN during EHLO was set to the wrong address, now pointing to the correct one.
    - The SPF record was updated with the new IP address.
    Here is some additional information on the issue :
    - Not on any blacklists - checked using dnsbl.info
    - Telnet to remote servers works from exchange server, connections are accepted and can send mail.
    - Outbound SMTP test ran using Microsoft Remote Connectivity Analyser : Passed with both External (Static) and Smarthost IP.
    - This seems to happen only with emails that have an attachment and that are forwarded, but only for the affected users.
    - If content from these e-mails is manually copied over to a new email, email is sent to destination without problem.
    Configuration information :
    - Exchange 2013 running on Windows 2012 Datacenter with all latest updates.
    - Outgoing e-mail is sent via smarthost. Only one outbound transport rule is active.
    - Using internal DNS server.
    - There is only one mailbox database.
    Thank you for taking the time to read this!

    On Wed, 16 Jan 2013 15:31:14 +0000, Ipigi wrote:
    >Sorry, I often get some terms mixed up when I explain things, as our users use the French version of Outlook.
    >
    >E-mails are not transferred, but forwarded manually from their Outlook. The message format in Outlook is set to HTML, not Rich Text, when they forward the e-mail.
    Do they forward the message as an attachment?
    >When forwarded internally, this is in the internet headers :
    >
    >Content-Type: application/ms-tnef; name="winmail.dat" Content-Transfer-Encoding: binary
    Within your organization I'm pretty sure that messages will use TNEF.
    What does the message contain at the external recipient's side?
    >It really seems to me that Exchange is not converting RTF to Plain Text. The first link you provided states in its final paragraph that Exchange should be doing this conversion.
    If you can, try creating a mail-enabled Contact for one of the external recipients and set the message format on that.
    >If I disabled TNEF as that link suggests, offending messages will get stuck in the submission queue again.
    >
    >I thank you for your help so far. This is not an issue I've had with any previous installations/migrations of Exchange that I have done.
    >
    >Please let me know if you need any additional information.
    Have you tried UNsetting TNEF on the remote domain?
    Set-RemoteDomain Default -TNEFEnabled $null
    That should leave it up to the client to determine the format. It's probably not what you're after, but see if it makes a difference in the format.
    Rich Matheisen
    MCSE+I, Exchange MVP

  • A storage transient failure has occurred during content conversion

    Hi,
    I have Exchange 2010 SP3 installed on Windows server 2012.
    External invites are not processing, and return with Source Context "StorageTransientFailureDuringContentConversion".
    This is causing a great problem in our environment...
    Any suggestions?
    Att
    Paulo Nunes

    What version of Exchange 2010 SP3 do you have?  RU4 for Exchange 2010 SP3 has been released -
    KB2905616 .
    RU3 is supposed to fix this particular issue, RU4 includes that fix as well. 
    Previous support forum question on this -
    http://social.technet.microsoft.com/Forums/en-US/fd7ef80e-f80b-47ed-883b-a34511c6233c/a-storage-transient-failure-has-occurred-during-content-conversion?forum=exchangesvrsecuremessaginglegacy
    JAUCG - Please remember to mark replies as helpful if they were, or as answered if I provided a solution.

  • Problem during Content Conversion

    Hi People,
    I have a content conversion scenario, where my input is a flat file. There are certain values which are blank in my fields. However, in my resultant XML, I need to view these blank spaces enclosed within the respective XML tags.
    But I do not get these blank spaces; I just get an empty, self-closing tag like this
    <Batch_no />
    instead of <Batch_no>    </Batch_no>
    What do I do to get these spaces in my XML?
    regards,
    Prashanth

    Hi Prash,
    If the blanks are already in your flat file, you can use the conversion parameter <RecordSetName>.fieldContentFormatting and set it to "nothing". Note that the blanks won't be shown in SXMB_MONI (they are removed by the display tool), but they are nevertheless there!
    For details please check note 821267.
    Best regards
    Joachim
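    The trim-versus-nothing behavior of fieldContentFormatting can be pictured with a tiny sketch (plain Python for illustration only; this is not the adapter's actual implementation):

    ```python
    def format_field(value, mode="trim"):
        """Sketch of FCC fieldContentFormatting: 'trim' strips leading and
        trailing blanks (the default), 'nothing' leaves the value untouched."""
        return value.strip() if mode == "trim" else value

    print(repr(format_field("    ")))             # default 'trim' empties the field
    print(repr(format_field("    ", "nothing")))  # 'nothing' keeps the blanks
    ```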

  • Populate the header-level fields in Content Conversion in the receiver file adapter

    Hi All,
    I have a requirement with a table structure: I want to populate both the header level and the item level at the receiver side. I am using Content Conversion at the receiver side.
    The data looks like this
    at header level
    Country      City       Name   Age
      India           bom       kk        19
      Sweden    Stock     GG       20
    For this type of file, how can I create the data type, in what way should I use content conversion, and what parameters do I have to give in Content Conversion?
    Regards

    Hi Sudip,
    The field labels I want at the top (header) of each column in the Excel file are:
    Country City Name Age
    Below these fields, only the values of these fields will come, like this:
    India bom kk 19
    Sweden Stock GG 20
    For this, do you mean to say I have to declare only the header?
    Create a Data Type with sub-element ITEMS (just an example),
    then Country, Name, Age or whatever fields you want to add to the header of the output file.
    Once data type is ready now time to configure Receiver CC:
    Use following Content Conversion Parameters:
    Recordset Structure: ITEMS
    ITEMS.addHeaderLine: 1
    ITEMS.fieldSeparator: (space)
    ITEMS.endSeparator: 'nl'
    and my data will come like this in file
    Country    City            Name    Age
    India          Banglore   jjjjj          24
    England     london      kkkk      88
    Can you please confirm this?
    Regards
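    For reference, the effect of ITEMS.addHeaderLine = 1 with a space fieldSeparator and 'nl' endSeparator can be sketched roughly in Python (the function name and record layout below are illustrative, not adapter code):

    ```python
    def write_with_header(records, field_names, sep=" "):
        """Mimic receiver FCC with ITEMS.addHeaderLine = 1: first a header
        line built from the field names, then one line per ITEMS record,
        fields joined by the configured fieldSeparator."""
        lines = [sep.join(field_names)]
        for rec in records:
            lines.append(sep.join(rec.get(name, "") for name in field_names))
        return "\n".join(lines) + "\n"  # 'nl' endSeparator after each record

    rows = [
        {"Country": "India", "City": "bom", "Name": "kk", "Age": "19"},
        {"Country": "Sweden", "City": "Stock", "Name": "GG", "Age": "20"},
    ]
    print(write_with_header(rows, ["Country", "City", "Name", "Age"]))
    ```

    Note that a single space separator will not produce aligned columns; for column alignment you would need fieldFixedLengths instead.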

  • Error during Content conversion

    Hi,
    The scenario is file to file, inbound and async. The encrypted data comes from a third party; we decrypt it and do the content conversion. Earlier there was only one node, but now the requirement is that they will also send a header, so there are two nodes with children. Earlier, with only one node with children, the data was processed successfully. Now that I have added the header, the header fields are not getting values. The header has fields like Number_of_records, sbm_system_date, etc., but the header fields are being replaced by the fields of the other node. I have attached the screenshots for the same; let me know if you need any more information. The payload is like below
    <?xml version="1.0" encoding="utf-8" ?>
    - <ns:MT_BillingData_SBM xmlns:ns="http://pspcl.com/xi/SBM/IF0043_BillingData_100"> 
    - <BillingData_SBM> 
    <SUB_DIVISION_CODE>11</SUB_DIVISION_CODE> 
    <MRU>16/08/2014</MRU> 
    <Connected_Pole_INI_Number>11:26:09</Connected_Pole_INI_Number> 
    <NEIGHBOR_METER_NO>I0027G14_14082014112726.enc</NEIGHBOR_METER_NO> 
    <STREET_NAME>1</STREET_NAME> 
    <INSTALLATION>I0027G14</INSTALLATION> 
    </BillingData_SBM>
    - <BillingData_SBM> 
    <SUB_DIVISION_CODE>4272</SUB_DIVISION_CODE> 
    <MRU>MR60BM</MRU> 
    <Connected_Pole_INI_Number>J-DP06-FL12-031-004_G-1-P07</Connected_Pole_INI_Number> 
    <NEIGHBOR_METER_NO /> 
    <STREET_NAME /> 
    <INSTALLATION>5000057583</INSTALLATION> 
    <MR_DOC_NO>200000001829984</MR_DOC_NO> 
    <SCHEDULED_MRDATE>25-04-2014</SCHEDULED_MRDATE> 
    <METER_NUMBER>10089749</METER_NUMBER> 
    <MANUFACTURER_SR_NO>3376181</MANUFACTURER_SR_NO> 
    <MANUFACTURER_NAME>Capital Meters</MANUFACTURER_NAME> 
    <CONTRACT_ACCOUNT_NUMBER>100106763</CONTRACT_ACCOUNT_NUMBER> 
    <CONSUMPTION_KWH>16430.000</CONSUMPTION_KWH> 
    <CONSUMPTION_KWAH>18256.000</CONSUMPTION_KWAH> 

    Hi,
    Thanks for the reply. I have used the parameters in FCC which I have posted above. Are they correct?
    I am not getting the desired result; there may be an issue with the input file or the parameters.
    I am getting the result below, in which the header values are not coming through.
    <?xml version="1.0" encoding="utf-8" ?>
    - <ns:MT_BillingData_SBM xmlns:ns="http://pspcl.com/xi/SBM/IF0043_BillingData_100"> 
    - <BillingData_SBM> 
    <SUB_DIVISION_CODE>4272</SUB_DIVISION_CODE>  
    <MRU>MR60BM</MRU>  
    <Connected_Pole_INI_Number>J-DP06-FL12-031-004_G-1-P07</Connected_Pole_INI_Number>  
    <NEIGHBOR_METER_NO />  
    <STREET_NAME />  
    <INSTALLATION>5000057583</INSTALLATION>  
    <MR_DOC_NO>200000001829984</MR_DOC_NO>  
    <SCHEDULED_MRDATE>25-04-2014</SCHEDULED_MRDATE>  
    <METER_NUMBER>10089749</METER_NUMBER>  
    <MANUFACTURER_SR_NO>3376181</MANUFACTURER_SR_NO>  
    <MANUFACTURER_NAME>Capital Meters</MANUFACTURER_NAME>  
    <CONTRACT_ACCOUNT_NUMBER>100106763</CONTRACT_ACCOUNT_NUMBER>  
    <CONSUMPTION_KWH>16430.000</CONSUMPTION_KWH>  
    <CONSUMPTION_KWAH>18256.000</CONSUMPTION_KWAH>  
    <CONSUMPTION_KVA>0.000</CONSUMPTION_KVA>  
    <CUR_METER_READING_KWH>0.000</CUR_METER_READING_KWH>  
    <CUR_METER_READING_KVA>0.000</CUR_METER_READING_KVA>  
    <CUR_METER_READING_KVAH>0.000</CUR_METER_READING_KVAH>  
    <CUR_METER_READING_DATE>13-08-2014</CUR_METER_READING_DATE>  
    <CUR_METER_READING_TIME>182427</CUR_METER_READING_TIME>  
    <CUR_METER_READER_NOTE>D</CUR_METER_READER_NOTE>  
    <PRV_METER_READING_KWH>19170.000</PRV_METER_READING_KWH>  
    <PRV_METER_READING_KVA>0.000</PRV_METER_READING_KVA>  
    <PRV_METER_READING_KWAH>0.000</PRV_METER_READING_KWAH>  
    <PRV_METER_READING_DATE>30-06-2011</PRV_METER_READING_DATE>  
    <PRV_METER_READING_TIME />  
    <PRV_METER_READER_NOTE />  
    <OCTROI_FLAG>Y</OCTROI_FLAG>  
    <SOP>95465.00</SOP>  
    <ED>7636.92</ED>  
    <OCTROI>1643.00</OCTROI>  
    <DSSF>4773.08</DSSF>  
    <SURCHARGE_LEIVED>0.00</SURCHARGE_LEIVED>  
    <SERVICE_RENT>0.00</SERVICE_RENT>  
    <METER_RENT>292.00</METER_RENT>  
    <SERVICE_CHARGE>0.00</SERVICE_CHARGE>  
    <MONTHLY_MIN_CHARGES>11638.20</MONTHLY_MIN_CHARGES>  
    <PF_SURCHARGE>0.00</PF_SURCHARGE>  
    <PF_INCENTIVE>0.00</PF_INCENTIVE>  
    <DEMAND_CHARGES>0.00</DEMAND_CHARGES>  
    <FIXEDCHARGES>0.00</FIXEDCHARGES>  
    <VOLTAGE_SURCHARGE>0.00</VOLTAGE_SURCHARGE>  
    <PEAKLOAD_EXEMPTION_CHARGES>0.00</PEAKLOAD_EXEMPTION_CHARGES>  
    <SUNDRY_CHARGES>0.00</SUNDRY_CHARGES>  
    <MISCELLANEOUS_CHARGES>0.00</MISCELLANEOUS_CHARGES>  
    <FUEL_ADJUSTMENT>0.00</FUEL_ADJUSTMENT>  
    <BILL_NUMBER>I002714H131824001</BILL_NUMBER>  
    <NO_OF_DAYS_BILLED>1141</NO_OF_DAYS_BILLED>  
    <BILL_CYCLE>2</BILL_CYCLE>  
    <BILL_DATE>28-08-2014</BILL_DATE>  
    <DUE_DATE>26-08-2014</DUE_DATE>  
    <BILL_TYPE>2</BILL_TYPE>  
    <PAYMENT_AMOUNT>0.00</PAYMENT_AMOUNT>  
    <PAYMENT_MODE />  
    <CHECK_NO />  
    <BANK_NAME />  
    <PAYMENT_ID />  
    <IFSC_CODE />  
    <MICRCODE />  
    <PAYMENT_DATE />  
    <PAYMENT_REMARK />  
    <TOT_BILLAMOUNT>109970.00</TOT_BILLAMOUNT>  
    <SBM_NUMBER>I0027G14</SBM_NUMBER>  
    <METER_READER_NAME>USER1</METER_READER_NAME>  
    <INHOUSE_OUTSOURCED_SBM>PSPCL</INHOUSE_OUTSOURCED_SBM>  
    <TransfromerCode />  
    <MCB_RENT>156.00</MCB_RENT>  
    <LPSC>9591.00</LPSC>  
    <TOT_AMT_DUE_DATE>119561.00</TOT_AMT_DUE_DATE>  
    <TOT_SOP_ED_OCT>109518.00</TOT_SOP_ED_OCT>  
    <KeyField>P</KeyField>  
    </BillingData_SBM>
    - <BillingData_SBM> 
    <SUB_DIVISION_CODE>4272</SUB_DIVISION_CODE>  
    <MRU>MR60BM</MRU>  
    <Connected_Pole_INI_Number>J-DP06-FL12-031-004_G-1-P06</Connected_Pole_INI_Number>  
    <NEIGHBOR_METER_NO />  
    <STREET_NAME />  
    <INSTALLATION>5000057703</INSTALLATION>  
    <MR_DOC_NO>200000001830038</MR_DOC_NO>  
    <SCHEDULED_MRDATE>25-04-2014</SCHEDULED_MRDATE>  
    <METER_NUMBER>10089869</METER_NUMBER>  
    <MANUFACTURER_SR_NO>8</MANUFACTURER_SR_NO>  
    <MANUFACTURER_NAME>Saraf Industries</MANUFACTURER_NAME>  
    <CONTRACT_ACCOUNT_NUMBER>100106883</CONTRACT_ACCOUNT_NUMBER>  
    <CONSUMPTION_KWH>5429.000</CONSUMPTION_KWH>  
    <CONSUMPTION_KWAH>6032.000</CONSUMPTION_KWAH>  
    <CONSUMPTION_KVA>0.000</CONSUMPTION_KVA> 

  • Last field not shown in MONI if it is empty during content conversion (urgent)

    Hi All,
            I have my input structure like
       <RECORDSET>
          <EMPLOYEE>
             <FIRSTNAME>
             <LASTNAME>
             <PHONENUMBER>
          </EMPLOYEE>
       </RECORDSET>
           I have given my sender Conversion parameters like
       Employee.fieldSeparator=*
       Employee.endSeparator='nl'
       Employee.fieldNames=FIRSTNAME,LASTNAME,PHONENUMBER
    It is working fine. But if the value is empty for any of the fields, it shows up as an empty tag in MONI; however, for the last field, i.e. PHONENUMBER, if the value is empty the tag is not shown in MONI at all. Please help me with this issue. It is very urgent.

    Hi Dinakar,
    I haven't tried it, but you can check these parameters and give them a try:
    NameA.missingLastFields
    If the inbound structure has fewer fields than specified in the configuration, the XML outbound structure is created as follows:
    ○ ignore
    The outbound structure only contains the fields in the inbound structure.
    ○ add
    The outbound structure contains all fields from the configuration; the fields missing in the inbound structure are empty.
    ○ error
    Conversion is terminated due to the incomplete inbound structure. An error message is displayed.
    NameA.additionalLastFields
    If the inbound structure has more fields than specified in the configuration, the XML outbound structure is created as follows:
    ○ ignore
    The outbound structure only contains the fields in the inbound structure.
    ○ error
    Conversion is terminated due to the incomplete inbound structure. An error message is displayed.
    The default value is ignore. If you have defined the NameA.fieldFixedLengths parameter, the default value is error.
    NameA.lastFieldsOptional (obsolete)
    You use this parameter to specify whether the last fields can be omitted (YES) or not (NO) in a comma-separated structure.
    If you do not make an entry, the default value is NO.
    Check for details:
    http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/frameset.htm
    Sachin

  • Sender file adapter dropping last column during content conversion

    I am trying to process a flat file with pipe-delimited data, but when the last column of the file is empty the file adapter ignores the column, causing issues with the subsequent mapping program.  For example, if the file contains the following data..
    1||three|
    ... the converted content produced is...
       <column1>1</column1>
       <column2/>
       <column3>three</column3>
    My mapping expects that <column4/> also be delivered in order to function properly.  The fields are all defined in record.fieldNames, and if there is any data present after the third pipe it is assigned correctly to the column4 element.  I have also experimented with setting missingLastFields to "add", and tried explicitly specifying endFieldSeparator as 'nl', with no success.
    Is there any way to control this behavior in the communication channel, or is my only option to account for it within the mapping, by using the mapWithDefault function for every field that appears at the end of a record?

    Nataliya,
    Ensuring that the element is populated during the mapping appears to be the only way to account for this.  Therefore, whenever mapping the last column of a record set, I just made sure to use the mapWithDefault function in case the last field of the record is empty.  It's a little extra manual effort, but it appears to be working fine so far.  I was hoping for a better answer myself.
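    The workaround amounts to treating missing trailing fields as empty. Illustratively (a Python sketch, not adapter code; the field names come from the example above):

    ```python
    def parse_record(line, field_names, sep="|"):
        """Split one delimited record and pad missing trailing values with
        empty strings, so every configured field name gets a value; the
        same effect the poster expected from missingLastFields = 'add'."""
        values = line.rstrip("\r\n").split(sep)
        values += [""] * (len(field_names) - len(values))
        return dict(zip(field_names, values))

    rec = parse_record("1||three", ["column1", "column2", "column3", "column4"])
    print(rec)  # column4 is present but empty
    ```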

  • Skipped tables during UNICODE conversion

    Hello everybody.
    We are doing a Unicode conversion on a 701 EHP4 system on iSeries.
    The export has finished successfully, but we have detected that some tables were skipped. In Export_monitor.log there are comments like this one:
    INFO: 2010-04-15 01:30:56 com.sap.inst.migmon.LoadTask processPackage
    Unloading of 'AGKO' export package from database will be skipped.
    Task file '/usr/sap/SAPinst/NW04S/EXP/AGKO.TSK' is empty and contains no tasks.
    Does anyone know why these tables were skipped?
    Do we have to repeat the whole process?
    Thank you in advance.
    Best regards.
    Rubén Garcí

    Hello
    The problem was related to the Inplace method.
    Regards
    Ruben

  • Key field in File Content Conversion

    Must the key field be unique in FCC?

    Hi,
    Check this to get more clarity about the key field usage:
    Content Conversion (The Key Field Problem)
    Regards
    Seshagiri

  • Ignore fields in File Content Conversion

    Hi all,
    I need to read the first 9 fields in content conversion from a text file and ignore all the others. The problem is I don't know the exact number of fields (it varies from file to file).
    I tried using the following in my content conversion:
    item.fieldSeparator = ;
    item.fieldNames = "Names of item fields"
    item.lastFieldsOptional = NO
    ignoreRecordsetName = true
    Now it works with files with 9 fields, but not with more than 9...
    I tried item.lastFieldsOptional = YES as well, but it didn't help...
    Anyone any suggestions?
    Thanks in advance

    The problem is I don't know exactly the number of fields (it varies from file to file).
    >>>> Then how do you write your FCC? The mandatory parameters fieldFixedLengths or fieldSeparator, together with fieldNames, expect a defined set of values, not a dynamic one. Maybe you can read the whole record row as one field and then extract the fields you need.
    item.lastFieldsOptional
    >>>
    This only lets you omit trailing fields:
    xml.lastFieldsOptional=YES|NO
    This parameter specifies whether the last fields can be omitted (YES) or not (NO) in a CSV structure. If you do not make a specification, the default value is NO.

  • File Content Conversion

    Hi All...
    I am facing a problem with File Content Conversion in the sender file adapter.
    In my scenario, the file contains 3 rows (nodes) of data, but the structure of the source actually has 4 nodes. If I do the file content conversion for 4 nodes, I get an error stating that the 3rd row is missing...
    Here I have mentioned a key field value for each row, like ROU, HDR, KLP, CTR.
    In the file, the KLP row is missing... how can I handle that?
    Thanks in advance
    shakif

    If your file will have only those 3 key field values, why do you add the 4th key field (I mean the extra row) in the FCC? What happens if you skip it in content conversion?
    In the source structure, change the occurrence of this extra node to 0...

  • Content Conversion issue for header record

    Hi,
    We have a very urgent question on an issue here with one of our XI objects. 
    This is an inbound interface from an external system into R/3 & BW.  The inbound file has a header record (with about 8 fields) and detail records (about 900 fields per detail record). Data going into R/3 & BW has no header records; everything goes in as detail records. One field from the header of this source file should be passed to the target structure at the detail level. Also, we are NOT using BPM.
    Can someone help us how we could define the file content conversion parameters for File adapter.
    Thanks in advance ......
    Prashant

    I'm so sorry, I wasn't subscribed to this thread and I didn't realize there were responses.
    If you have a message type made up of a Header with 1 occurrence and a Detail with 1 to unbounded occurrences, you'd want to do the following in content conversion:
    Document Name - your message type
    Document Namespace - your message type namespace
    Recordset Structure - Header,1,Detail,*
    Recordset Sequence - Ascending
    Then you'll need to set some of the parameters, depending on the layout of your incoming file.
    As for the problem of having hundreds of fields, I'm less sure about that.
    Would it be possible to break your detail data type down into smaller data types, each with fewer fields? You'd still have to maintain every field in content conversion, but at least they'd be in separate parameters, instead of all 900 in one tiny box.
    Here's a very rough example of what I mean:
    If you have 900 fields, instead of making one Detail data type, you could make 9 data types (Detail1, Detail2, Detail3, Detail4, Detail5, Detail6, Detail7, Detail8, Detail9), each with 100 fields in them (or more data types with fewer fields each).
    Setting up the file content conversion would be more complex in this scenario, so it might be a toss-up whether breaking it up this way is worth configuring quite a few more parameters.
    For example, you'd have to declare your recordset structure like Header,1,Detail1,*,Detail2,*,Detail3,* etc., and you'd have to make sure to set the .endSeparator to '0' for all of the first 8 details, so the conversion would recognize that they were all on one line.
    I hope this helps a little bit.

  • MIME content conversion failed error while processing "550 5.6.0" NDR using EWS API

    While trying to process a journal report having a "550 5.6.0" NDR with the following content using the EWS API:
    Delivery has failed to these recipients or groups:
    [email protected] ([email protected])
    The email system had a problem processing this message. It won't try to deliver this message again.
    [email protected] ([email protected])
    The email system had a problem processing this message. It won't try to deliver this message again.
    [email protected] ([email protected])
    The email system had a problem processing this message. It won't try to deliver this message again.
    Diagnostic information for administrators:
    Generating server: ALMPR02MB001.namprd05.prod.outlook.com
    [email protected]
    Remote Server returned '550 5.6.0 M2MCVT.StorageError; storage error in content conversion'
    [email protected]
    Remote Server returned '550 5.6.0 M2MCVT.StorageError; storage error in content conversion'
    [email protected]
    Remote Server returned '550 5.6.0 M2MCVT.StorageError; storage error in content conversion'
    Original message headers:
    Received: from ALMPR02MB001.namprd05.prod.outlook.com ((11.255.110.102)) by
    ALMPR02MB001.namprd05.prod.outlook.com ((11.255.110.102)) with
    ShadowRedundancy id 15.0.851.11; Fri, 24 Jan 2014 12:20:42 +0000
    Received: from AN2PR05MB011.namprd05.prod.outlook.com (10.255.202.146) by
    ALMPR02MB001.namprd05.prod.outlook.com (11.255.110.102) with Microsoft SMTP
    Server (TLS) id 15.0.851.11; Wed, 22 Jan 2014 19:25:20 +0000
    Received: from AN1PR05MB018.namprd05.prod.outlook.com ([159.254.10.28]) by
    AN1PR05MB018.namprd05.prod.outlook.com ([159.254.10.28]) with mapi id
    15.00.0851.011; Wed, 22 Jan 2014 19:25:19 +0000
    Content-Type: application/ms-tnef; name="winmail.dat"
    Content-Transfer-Encoding: binary
    From: "Aron,Shakton"
    To: "[email protected]" ,
    "[email protected]" , "[email protected]"
    Subject: Updated: Drive # 3
    Thread-Topic: Updated: Drive # 3
    Thread-Index: AQHPF6evINDh6QBmQ0OJyeaK0OyWzQ==
    Date: Wed, 22 Jan 2014 19:25:18 +0000
    Message-ID: <[email protected]ok.com>
    Accept-Language: en-US
    Content-Language: en-US
    X-MS-Has-Attach: yes
    X-MS-TNEF-Correlator: <[email protected]ok.com>
    MIME-Version: 1.0
    X-Originating-IP: [::]
    Return-Path: [email protected]
    X-Forefront-PRVS: 01018CB5B3
    X-Forefront-Antispam-Report:
    SFV:NSPM;SFS:(10019001)(6009001)(199002)(189002)(377454003)(2656002)(81816001)(81686001)(54316002)(49866001)(63696002)(65816001)(16799955002)(74876001)(47976001)(77982001)(81342001)(79102001)(94316002)(76576001)(56776001)(47736001)(50986001)(85852003)(54356001)(77096001)(74316001)(53806001)(69226001)(80976001)(4396001)(51856001)(83322001)(93136001)(85306002)(46102001)(19580395003)(74662001)(15975445006)(74706001)(15202345003)(76786001)(59766001)(83072002)(81542001)(76176001)(76796001)(87936001)(87266001)(92566001)(2201001)(47446002)(93516002)(33646001)(90146001)(31966008)(56816005)(74366001)(86362001)(24736002)(3826001);DIR:OUT;SFP:1102;SCL:1;SRVR:ALMPR02MB001;H:AN2PR05MB011.namprd05.prod.outlook.com;CLIP:::;FPR:;RD:InfoNoRecords;A:0;MX:1;LANG:en;*
    I am getting the following error.
    ERROR Message: MIME content conversion failed.
    Stack Trace : at Microsoft.Exchange.WebServices.Data.ServiceResponse.InternalThrowIfNecessary()
    at Microsoft.Exchange.WebServices.Data.ExchangeService.InternalGetAttachments(IEnumerable`1 attachments, Nullable`1 bodyType, IEnumerable`1 additionalProperties, ServiceErrorHandling errorHandling)
    at Microsoft.Exchange.WebServices.Data.ExchangeService.GetAttachment(Attachment attachment, Nullable`1 bodyType, IEnumerable`1 additionalProperties)
    Has anyone faced this issue? How did you get past it?
    Regards

    Mokchhya,
    I responded to your StackOverflow post, but I'll respond here as well.
    Are you using Exchange Server 2010 SP3 RU2? If not, that might fix the issue. Another poster ran into a similar error and they were also sending an email with an attachment:
    http://social.technet.microsoft.com/Forums/en-US/fd7ef80e-f80b-47ed-883b-a34511c6233c/a-storage-transient-failure-has-occurred-during-content-conversion?forum=exchangesvrsecuremessaginglegacy.
    The support page related to the fix is here:
    http://support.microsoft.com/kb/2863310.
    -Mimi

  • Skip field in file content conversion (file adapter)

    Hi guys,
    I have a log file and want to convert it into XML; that works fine.
    The problem is: can I skip some fields that I do not need in the XML with file content conversion?
    The fields I want to skip are the same in every record of the log file.
    Thanks.

    Ralf,
    You can only ignore lines at the beginning of the file, using the Document Offset option.
    If you have the unwanted fields at the end of the record and you are on SPS12 (PI 7.0), you can use the option additionalLastFields in the FCC parameters; have a look at the blog /people/sukumar.natarajan/blog/2007/06/12/content-conversion-in-sender-file-adapter--2-new-useful-parameters
    It is easier to ignore those fields in the mapping.
    Also, you can try to remove those fields using a shell script, which can be called from the file adapter.
    Regards,
    Jai Shankar
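    As a rough illustration of that pre-processing idea, a script could keep only the wanted columns before the file reaches content conversion. This is a Python sketch under the assumption that the file has a header row; the function, column names and semicolon delimiter are made up for the example:

    ```python
    import csv
    import io

    def keep_columns(csv_text, wanted, delimiter=";"):
        """Rewrite a wide CSV so that only the columns named in `wanted`
        (looked up in the header row) survive; run this before the file
        reaches the adapter's content conversion."""
        reader = csv.reader(io.StringIO(csv_text), delimiter=delimiter)
        header = next(reader)
        idx = [header.index(name) for name in wanted]  # ValueError if a name is missing
        out = io.StringIO()
        writer = csv.writer(out, delimiter=delimiter, lineterminator="\n")
        writer.writerow(wanted)
        for row in reader:
            writer.writerow([row[i] for i in idx])
        return out.getvalue()

    print(keep_columns("A;B;C;D\n1;2;3;4\n5;6;7;8", ["A", "C"]))
    ```

    The FCC data type then only needs to describe the columns that remain.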
