Convert custom outbound IDoc to CSV file

Hi
I need to generate a flat file that will be sent to a non-SAP partner. A requirement is that the messages go via the IDoc system, which makes it possible to resend them if necessary.
So there are 2 stages:
1) Create the IDoc. I have done this.
2) Read the IDoc and convert it to a flat file.
Stage 1
=====
I have defined a custom IDoc type, partner profile etc. and developed a function module to create the outbound IDocs. The FM is triggered by a message in the NAST table. This works OK.
So I now have custom outbound IDocs in the database with status 30 (ready for dispatch), visible in WE05.
Stage 2
======
Now I want to convert these custom IDocs into simple flat files (CSV-like).
How do I go about making this happen? I guess I need to develop another function module, but I don't know how to tie this FM into the system. My questions are:
How will the FM be triggered?
What is the interface of the FM?
Will I need to configure a second partner profile for stage 2 (which is fine)?
Peter
Note 1 - there is no EDI converter available, so I must solve this in ABAP.
Note 2 - I do not want to create the flat file at the same time as I create the IDoc. That would defeat the purpose of going via IDocs.

No, EDI is only one method of transferring data.  There's no need for an EDI subsystem in this case.
To the OP: since it seems that you have decided you must use IDoc processing in this case, I would go ahead and use a file port (or an XML port). You will drop an IDoc-formatted flat file into the specified location (for each IDoc, based on your logical file definition). For you, this will merely be a temporary holding location. You can then use a simple program to read the file, re-format it, send it, and then move the original file to a 'processed' location. Of course, you could also just throw away the generated files and instead read the IDocs directly with the standard functions available and (re)generate/send the files as required.
It's often done this way with 3rd-party EDI providers, where the company in question is required to use EDI by its customers but does not have the required systems in place.
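As a rough illustration of that last option (reading the IDocs directly with the standard functions and generating the file from them), here is a minimal ABAP sketch that could, for example, run as a periodic batch job. The message type ZMYMSG, the segment Z1MYSEG and its fields, and the target path are placeholders rather than anything from this thread; the segment is assumed to be purely character-like, and updating the IDoc status after a successful write is left out.

" Rough sketch only - all Z names and the path below are placeholders.
DATA: lt_docnum  TYPE STANDARD TABLE OF edi_docnum,
      lv_docnum  TYPE edi_docnum,
      ls_control TYPE edidc,
      lt_data    TYPE STANDARD TABLE OF edidd,
      ls_data    TYPE edidd,
      ls_seg     TYPE z1myseg,                " DDIC structure of the custom segment
      lv_line    TYPE string,
      lv_file    TYPE string VALUE '/tmp/partner_out.csv'.

" All outbound IDocs of the custom message type that are ready for dispatch
SELECT docnum FROM edidc
  INTO TABLE lt_docnum
  WHERE mestyp = 'ZMYMSG'
    AND direct = '1'
    AND status = '30'.

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

LOOP AT lt_docnum INTO lv_docnum.
  CLEAR lt_data.
  CALL FUNCTION 'IDOC_READ_COMPLETELY'
    EXPORTING
      document_number         = lv_docnum
    IMPORTING
      idoc_control            = ls_control
    TABLES
      int_edidd               = lt_data
    EXCEPTIONS
      document_not_exist      = 1
      document_number_invalid = 2
      OTHERS                  = 3.
  CHECK sy-subrc = 0.

  " One CSV line per occurrence of the custom segment
  LOOP AT lt_data INTO ls_data WHERE segnam = 'Z1MYSEG'.
    ls_seg = ls_data-sdata.                   " map the flat SDATA to the segment structure
    CONCATENATE ls_seg-field1 ls_seg-field2 ls_seg-field3
                INTO lv_line SEPARATED BY ','.
    TRANSFER lv_line TO lv_file.
  ENDLOOP.
ENDLOOP.

CLOSE DATASET lv_file.

Whether and how you then set the processed IDocs to a 'dispatched' status (and write an error status when the file cannot be written) is a design decision; keeping the status records accurate is what makes the resend scenario via WE05 work.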

Similar Messages

  • Steps for creating a custom outbound idoc

    Hi,
    Can you please list each and every step for the creation of a custom outbound IDoc?
    Thanks
    Kiran Prasad.

    Friend, before posting your query, try using the search tab in the forum; you can often find the answer to your query there. If not, continue with the posting.
    Follow this link for a step-by-step tutorial with screenshots:
    http://www.****************/Tutorials/ALE/ALEMainPage.htm
    1. Create an IDOC Type.
    The next step is to create an IDOC type by associating the extension type that you created with the Basic IDOC type. This is a simple process:
    • From transaction WE30 or WEDI go to Development -> IDOC Types.
    • Enter ZDEBMASZ for Object Name.
    • Click on IDOC Type.
    • Click on Create.
    • Enter DEBMAS02 for Basic IDOC type.
    • Enter ZDEBMASX for extension type.
    • Enter a description.
    • Press Enter.
    • You will see a display of the composite IDOC type with all segments, including Z1SADRX (see Figure 3).
    It is possible to associate only one extension type with a Basic IDOC type for a given IDOC type. However, you can have multiple new segments in an extension type.
    2. Link IDOC Type to Message Type.
    The next step is to link the new IDOC type to its corresponding message type. This is important, because this relationship is referenced in the partner profile parameters where you specify the message type and IDOC type to be used for that particular representative system. To link the message type:
    • Use transaction WE82, or from WE30 go to Environment -> IDOC Type / Message Type, or from WEDI go to Development -> IDOC Type -> Environment -> IDOC Type / Message Type.
    • Click on Display <-> Change.
    • Click on New Entries.
    • Enter DEBMAS for message type.
    • Enter DEBMAS02 for Basic IDOC type.
    • Enter ZDEBMASX for extension type.
    • Enter your SAP R/3 release number for Release.
    • Save.
    This data is stored on the EDIMSG table and is accessed by several ALE processes to relate the message type to the IDOC type.
    3. Check the IDOC Type.
    Before checking the IDOC type for consistency, it is important to perform another step that releases the extension type to the IDOC type:
    • From WEDI go to Development -> IDOC Types -> Extras -> Release Type, or from transaction WE30 go to Extras -> Release Type.
    • For the Object Name ZDEBMASX and radio button Extension Type, click Yes.
    • The extension type has now been "released."
    You can't edit the extension type once it's released. To cancel the release for further editing or deactivation, go to WE30 -> Extras -> Cancel release. The final step in the IDOC extension process is checking the validity of the IDOC type:
    • From transaction WE30 or WEDI go to Development -> IDOC types.
    • Enter ZDEBMASX for Object name.
    • Click on Extension Type.
    • From the Development Object menu select Check.
    • Repeat the operation for IDOC type ZDEBMASZ.

  • Need to create new users in Office 365 with custom attributes from a csv file

    I am exporting users from an Active Directory environment and then deleting them from AD. They are alumni and will no longer be in AD.
    I have a csv file with the following fields that I need to use to create new alumni mailboxes in Office 365. I need the CustomAttributes because my Dynamic Distribution Groups use them. I am fairly new to PowerShell and have been unable to get this to work. I suspect I may have to split it into two parts, but am not sure how to proceed. Any assistance would be appreciated. I was directed here from the Office 365 community.
    Import-Csv -Path c:\CSVfiles\CreateAlumni.csv | ForEach-Object {
       New-MsolUser -FirstName $_.FirstName -LastName $_.LastName `
          -UserPrincipalName $_.UserPrincipalName `
          -DisplayName "$($_.FirstName) $($_.LastName)" `
          -Password $_.Password `
          -CustomAttribute1 $_.CustomAttribute1 `
          -CustomAttribute3 $_.CustomAttribute3 `
          -CustomAttribute10 $_.CustomAttribute10 `
          -CustomAttribute11 $_.CustomAttribute11 `
          -CustomAttribute12 $_.CustomAttribute12 `
          -LicenseAssignment 'domaincom:EXCHANGESTANDARD_ALUMNI' `
          -UsageLocation US
    }

    Ok, it wasn't stopping after 2 iterations. What I was seeing was 2 failures. The first was the Get-Mailbox command and the second was when it tried to assign attributes. For some reason it is not looping when it fails. It just goes on and tries to assign
    the Custom Attributes. I added writes in to tell me what was happening.
    ### Check if mailbox is provisioned yet
    Write-Host "Checking if mailbox is provisioned yet..." -ForegroundColor Yellow
    $found = $false
    $count = 0
    Do {
        try {
            Get-Mailbox -Identity $_.UserName -ErrorAction Stop
            $found = $true
            Write-Output 'Mailbox found. Details:'
            Get-Mailbox -Identity $_.UserName
        } catch {
            Write-Output 'Sleeping'
            $count++
            Start-Sleep -Seconds 5
            If ($count -ge 12) {
                Write-Output 'Mailbox not found. Quitting.'
                $found = $true
            }
        }
    } Until ($found)
    Write-Host "Adding Custom Attributes to User" -ForegroundColor Yellow
    Set-Mailbox -Identity $_.UserName -CustomAttribute1 $_.CustomAttribute1 -CustomAttribute3 $_.CustomAttribute3 -CustomAttribute10 $_.CustomAttribute10 -CustomAttribute11 $_.CustomAttribute11 -CustomAttribute12 $_.CustomAttribute12
    Write-Host "User has been Provisioned in Office 365!" -ForegroundColor Yellow
    Checking if mailbox is provisioned yet...
    The operation couldn't be performed because object 'Joe.Cool2003' couldn't be found on 'CO1PR07A002DC01.NAMPR07A002.prod.outlook.com'.
        + CategoryInfo          : NotSpecified: (:) [Get-Mailbox], ManagementObjectNotFoundException
        + FullyQualifiedErrorId : [Server=CO1PR07MB125,RequestId=e1aabda1-01e4-4f68-984e-e20be0975242,TimeStamp=5/22/2014 4:23:59 AM] [FailureCategory=Cmdlet-ManagementObjectNotFoundException] 2788FB48,Microsoft.Exchange.Management.RecipientTasks.GetMailbox
        + PSComputerName        : pod51038psh.outlook.com
    Mailbox found. Details:
    The operation couldn't be performed because object 'Joe.Cool2003' couldn't be found on 'CO1PR07A002DC01.NAMPR07A002.prod.outlook.com'.
        + CategoryInfo          : NotSpecified: (:) [Get-Mailbox], ManagementObjectNotFoundException
        + FullyQualifiedErrorId : [Server=CO1PR07MB125,RequestId=16a8a2bc-333a-455c-8504-e0b99c44c334,TimeStamp=5/22/2014 4:24:00 AM] [FailureCategory=Cmdlet-ManagementObjectNotFoundException] 2788FB48,Microsoft.Exchange.Management.RecipientTasks.GetMailbox
        + PSComputerName        : pod51038psh.outlook.com
    Adding Custom Attributes to User
    The operation couldn't be performed because object 'Joe.Cool2003' couldn't be found on 'CO1PR07A002DC01.NAMPR07A002.prod.outlook.com'.
        + CategoryInfo          : NotSpecified: (:) [Set-Mailbox], ManagementObjectNotFoundException
        + FullyQualifiedErrorId : [Server=CO1PR07MB125,RequestId=8319d220-b9dd-492f-8182-5083cf56e58b,TimeStamp=5/22/2014 4:24:00 AM] [FailureCategory=Cmdlet-ManagementObjectNotFoundException] C7844A24,Microsoft.Exchange.Management.RecipientTasks.SetMailbox
        + PSComputerName        : pod51038psh.outlook.com
    User has been Provisioned in Office 365!
    Of course the user has been provisioned, but the CustomAttributes have not been assigned. :(

  • IDoc to csv file with required fields

    All,
    I have a source IDoc going to XI.  For example it contains source fields SF1 (required), SF2 (required), SF3(optional) and SF4(optional).
    I want to produce a target file using file content conversion.  I want to produce target fields TF1(required), TF2(required), TF3(optional), and TF4(optional).  In my mapping SF1 maps to TF1, SF2 maps to TF2, etc...
    I want to produce a comma separated file, I am using file content conversions and I am using NameA.fieldSeparator (using a comma)  in my file content conversion parameters.  I have no problem when all 4 source fields are populated.  In this case if the value of my source fields are: ABC 123 XYZ and 789 then I get a flat file with the result:
    ABC,123,XYZ,789
    The problem is when my optional fields are blank I currently get the following in my csv file:
    ABC,123
    When instead I want:
    ABC,123,,
    I know I've seen threads relating to this issue but I haven't had any success locating them.  Any insight is appreciated.

    Hi Shaun,
    Source : SF1 (required), SF2 (required), SF3(optional) and SF4(optional).
    Target: TF1(required), TF2(required), TF3(optional), and TF4(optional)
    For the optional fields in the mapping: if the source optional field is not present, assign a blank constant to the target field. You will then get the output below, because the target element will exist with a blank value and FCC will process it.
    ABC,123,,
    Otherwise, try NameA.fieldFixedLengths in FCC.
    Regards,
    Prasanna

  • Mapping IDOC to CSV file - Missing blank field values in CSV file

    Hello:
    I am mapping an IDOC to a .csv file, using File Content Conversion.  I specify comma as the value for the parameter 'fieldSeparator'.  I get a .csv file, but blank field values are missing in the .csv file.
    For example, if the .csv file format is <field1>,<field2>,<field3>
    and if <field2> is blank, then the .csv file contains
    <field1>,<field3>

    Hi Bac,
    As long as the XML that goes to your File Receiver Channel contains the elements, even if they don't contain values, it should work fine.
    For example:
    <field1>data</field1>
    <field2></field2>
    <field3>data2</field3>
    I suspect that your XML looks like the following:
    <field1>data</field1>
    <field3>data2</field3>
    If this is the case you can update your map. You shouldn't need to put a space, just make sure the element gets created.
    Thanks,
    Jesse

  • IDOC to CSV file getting created in Target without any data

    Dear All,
    Scenario:IDOC to CSV
    Scenario: IDoc to CSV.
    We have developed the ID part and reused the IR part. The interface is successful end to end in both the ABAP and Java stacks, and the file has been created on the target, but without any data.
    I think we need to make changes in Content Conversion.
    Below are the details we are using in Content Conversion.
    Name                           Value
    ListPrice.fieldSeparator       ,
    ListPrice.ProcessFieldNames    fromConfiguration
    ListPrice.fieldNames           CompanyID,SalesOrg,ProductID,ValidFrom,ValidTo,UOM,ListPrice,RecordType,LineID,UpdatedDate
    Can you please give me the parameters we need to use other than the above, and the proper reason why the target file has no data?
    Note: We are re-using the IR part and we have given the field names in Content Conversion in the correct order and with the proper case.
    Thanks and regards,
    Manikandan

    Hi Abhishek and Rahul,
    I am sorry, I gave the sender payload earlier. Please find the receiver payload below.
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:SendListPriceToHub xmlns:ns0="http://tempuri.org/"><ns0:XMLData><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20002648</ns0:ProductID><ns0:ValidFrom>2009-11-12T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>111.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22158</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>100000001</ns0:ProductID><ns0:ValidFrom>2009-11-23T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>2199-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>1000.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22363</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>100000003</ns0:ProductID><ns0:ValidFrom>2009-12-01T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>230.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22572</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20002901</ns0:ProductID><ns0:ValidFrom>2000-12-04T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>2009-11-30T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>20.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22673</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20002647</ns0:ProductID><ns0:ValidFrom>2009-12-04T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>90.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22674</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>100000007</ns0:ProductID><ns0:ValidFrom>2009-12-01T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>2010-01-19T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>900.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22715</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>100000010</ns0:ProductID><ns0:ValidFrom>2009-12-11T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>14.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22831</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20002655</ns0:ProductID><ns0:ValidFrom>2009-12-16T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>350.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>22985</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesO
rg><ns0:ProductID>20002901</ns0:ProductID><ns0:ValidFrom>2009-12-01T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>80.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>23196</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20003027</ns0:ProductID><ns0:ValidFrom>2010-01-04T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>120.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>23309</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20003028</ns0:ProductID><ns0:ValidFrom>2010-01-06T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>12.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>23428</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20002924</ns0:ProductID><ns0:ValidFrom>2010-01-07T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>96.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>23469</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>20091001</ns0:ProductID><ns0:ValidFrom>2010-01-14T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>123.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>23647</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice><ns0:ListPrice><ns0:CompanyID>1525</ns0:CompanyID><ns0:SalesOrg>A001</ns0:SalesOrg><ns0:ProductID>100000007</ns0:ProductID><ns0:ValidFrom>2010-01-20T00:00:0010:00</ns0:ValidFrom><ns0:ValidTo>9999-12-31T00:00:0010:00</ns0:ValidTo><ns0:UOM>PC</ns0:UOM><ns0:ListPrice>10.00</ns0:ListPrice><ns0:RecordType>U</ns0:RecordType><ns0:LineID>23681</ns0:LineID><ns0:UpdatedDate>0000-00-00T00:00:0000:00</ns0:UpdatedDate></ns0:ListPrice></ns0:XMLData></ns0:SendListPriceToHub>

  • How to convert a row value in CSV file to column value

    Hi
    We have a requirement where we have to turn a row-level value into a column value.
    for Example:
    The source file will be in the format below:
    Component Name: 101
    Batch #     100% Mod     200% Mod     300% Mod     400% Mod     Tensile     Elongation     Thickness
    8584-17     498          1211         1986         2601         3133        523            0.088
    The output format has to be:
    Batch #     100% Mod     200% Mod     300% Mod     400% Mod     Tensile     Elongation     Thickness     Component Name
    8584-17     498          1211         1986         2601         3133        523            0.088         101
    How can we achieve this using shell/Perl script

    .

  • Cannot upload csv file of customer data on sample application...

    Hi I am learning about APEX through the Oracle hosted site and I've been trying the data upload on the customers table.
    I downloaded the sample data from the customer table into a csv file using the APEX download feature. I then just left the first row, which contains the column names, and deleted the other rows in the csv, except for one where I changed the data.
    When I try to upload the csv file back, using the sample app with comma as separator and the first row as column names, I get the error "Do not Load"
    in the column names, although row 1 of the data displays correctly under each column.
    Here's the Copy and Paste text:
    "CUSTOMER_ID","CUST_FIRST_NAME","CUST_LAST_NAME","CUST_STREET_ADDRESS1","CUST_STREET_ADDRESS2","CUST_CITY","CUST_STATE",
    "CUST_POSTAL_CODE","PHONE_NUMBER1","PHONE_NUMBER2","CREDIT_LIMIT","CUST_EMAIL"
    "9","Rahul","Surname","46 Somewhere","Some Town","MyTown","MA","230","(230) 434-2443","(230) 733-4344","200","[email protected]"
    Can anyone help please?

    Hi again
    I did use a comma. I did it again just to be sure.
    Using the Data Workshop->Text Data and a csv file I was able to load the data without any problem.
    Using the same csv, but from the sample application, same problem. Pasting into the Data Field, same problem.
    Also I was trying to load about 20 records into another table from an Excel file, by copying the range and pasting it into the Data Field of the Data Workshop.
    The rows displayed correctly in the Data Field but on loading I got "No Data Found".
    Saving the Excel data to a csv and again using the Data Workshop > Text Data > csv file etc., no problem at all, the data was loaded instantly.
    The safest bet is to use the Data Workshop and a csv file, but that's only for developers.
    Probably a simple error somewhere, but if (that's a big if) anyone could paste the data I posted earlier into demo_customers and see if they have the same error,
    I would be grateful. Having end users upload their Excel-based data is a central part of my planned site. Here it is again:
    "CUSTOMER_ID","CUST_FIRST_NAME","CUST_LAST_NAME","CUST_STREET_ADDRESS1","CUST_STREET_ADDRESS2","CUST_CITY","CUST_STATE",
    "CUST_POSTAL_CODE","PHONE_NUMBER1","PHONE_NUMBER2","CREDIT_LIMIT","CUST_EMAIL"
    "9","Rahul","Surname","46 Somewhere","Some Town","MyTown","MA","230","(230) 434-2443","(230) 733-4344","200","[email protected]".
    Regards

  • Duplicate Outbound IDocs getting Triggered at the same time.

    Hi Folks,
    I have created a custom outbound IDoc and done all the required configuration (WE20, WE30, WE31, WE81, WE82, WE41, WE57, etc.).
    Also I have written my code to populate the segments and then call MASTER_IDOC_DISTRIBUTE in the custom function module which is assigned to the process code, and have done the related configuration (WE57 etc.) for it as well.
    Now when I trigger my output type from transaction VL74, I notice two entries in WE02, resulting in two IDocs generated for one HU.
    The difference between the two IDocs is that the first one is generated in error with status 29, while the second one is successful with status 03.
    The data records containing the segments have the same values for both IDoc numbers.
    Is the configuration the issue here, or is there a problem in the code of the custom FM?
    Please help.

    Hi Anil,
    Appreciate your quick response.
    In my custom FM I have populated an internal table it_master_idoc_data with two records for two segments, containing the segment name in the SEGNAM field and the segment data in the SDATA field.
    Please let me know whether the data in this internal table of structure EDIDD is sufficient for passing to the MASTER_IDOC_DISTRIBUTE FM, or do I need to populate any other field of EDIDD?
    Also I am exporting a structure master_idoc_control containing values in these five fields:
    MESTYP = Z message type, IDOCTP = Z IDoc type, and the partner details in the RCVPOR, RCVPRN and RCVPRT fields.
    I am not populating the internal table communication_idoc_control while calling FM MASTER_IDOC_DISTRIBUTE from my custom function module (a minimal sketch of such a call is shown after this post).
    After executing the MASTER_IDOC_DISTRIBUTE FM, when control comes back to my custom FM there is one record in the communication_idoc_control internal table, with an IDoc number in the DOCNUM field and status 29 (error). Now when we check in WE02 there are two IDocs generated after this transaction's execution.
    The first one is the error IDoc that FM MASTER_IDOC_DISTRIBUTE returns, and the second one is successful with status 03.
    Please help....
    Thanks,
    Pravesh
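    For reference, here is a minimal, hedged sketch of a MASTER_IDOC_DISTRIBUTE call of the kind discussed in this thread. The message type ZMSGTYP, IDoc type ZIDOCTYP, segment Z1SEG01 and the receiver partner/port values are placeholders, not values from this thread, and error handling is kept to a bare minimum.

    " Sketch only - all Z names and partner values below are placeholders.
    DATA: ls_control TYPE edidc,
          lt_control TYPE STANDARD TABLE OF edidc,
          lt_data    TYPE STANDARD TABLE OF edidd,
          ls_data    TYPE edidd.

    ls_control-mestyp = 'ZMSGTYP'.      " message type
    ls_control-idoctp = 'ZIDOCTYP'.     " basic IDoc type
    ls_control-rcvprt = 'LS'.           " receiver partner type
    ls_control-rcvprn = 'PARTNER01'.    " receiver partner number
    ls_control-rcvpor = 'PORT01'.       " receiver port

    ls_data-segnam = 'Z1SEG01'.
    ls_data-sdata  = 'flat segment data goes here'.
    APPEND ls_data TO lt_data.

    CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
      EXPORTING
        master_idoc_control            = ls_control
      TABLES
        communication_idoc_control     = lt_control
        master_idoc_data               = lt_data
      EXCEPTIONS
        error_in_idoc_control          = 1
        error_writing_idoc_status      = 2
        error_in_idoc_data             = 3
        sending_logical_system_unknown = 4
        OTHERS                         = 5.
    IF sy-subrc = 0.
      COMMIT WORK.                      " the IDoc is typically only processed further after the commit
    ENDIF.

    This sketch cannot tell whether the duplicate in this thread comes from the configuration or from the surrounding code; it only shows one common minimal way to fill EDIDC/EDIDD for such a call, with communication_idoc_control returning the created IDoc number(s).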

  • Read from csv file and plot particular columns

    Hello,
    I'm a new user of LabVIEW and here it comes... my first major problem.
    Maybe this has been discussed before. I did a search to solve my problem first, but I couldn't find anything helpful, so I've decided to post a new message.
    So here is my problem:
    I'm working in a small semiconductor lab where different types of nitrides are grown using a proprietary reactor. The goal is to read the collected csv files from each growth in LabVIEW and plot the acquired data in appropriate graphs.
    I have a bunch of csv files and I have to make a Labview program to read them.
    The first part of my project, I've decided, is displaying the csv file (growth log file) in LabVIEW (which I think works fine).
    The second one is to be able to plot particular columns from the recipe in graphs in Labview (that one actually gives me a lot of trouble):
    1. Timestamp vs Temperature /columns B and D/
    2. Timestamp vs Gas flow /columns L to S/
    3. Timestamp vs Pressure /columns E,K,T,U,V/
    I've got one more problem. How can I convert the timestamp shown in the csv file to a human-readable date in LabVIEW? This actually is a big problem, because the timestamp is my x axis and I want to know at what time a particular process took place, and I also want to be able to see the converted timestamp when displaying the csv file in the first place. I've read a lot about timestamps in Excel and timestamps in LabVIEW, but I'm still confused about how to convert it in my case.
    I don't have problems displaying the csv file in LabVIEW. My problems are with the timestamp and the graphs.
    Sorry for my awful English.  I hope you can understand my problems since English is not my mother language. 
    Please find the attached files.
    If you have any ideas or suggestions I`ll be more than happy to discuss them.
    Thank you in advance.
    Have a nice day! 
    Attachments:
    growth log.csv ‏298 KB
    Read from growth log.vi ‏33 KB

    Hello again,
    I'm having problems with converting the first column in the file Growth Log.csv attached above.
    I have code converting the Excel timestamp to time, and I'm using Index Array, trying to grab a particular column out of it, but the attached file is read in as strings, so I guess I have to redo it as an array, but I don't know how. Would you help me with this one?
    Attachments:
    Xl Timestamp to Time.vi ‏21 KB

  • OUTBOUND IDOCS PROCESSING ON ALE INTERFACE

    Hi experts...
    I am working on ALE IDocs.
    I have a requirement where I have to convert my outgoing IDocs into flat files on my local system.
    1) How do I create a file port and specify the location (directory)?
    2) How do I assign this file port to the sending system?
    3) Where will the IDocs be stored? Can I check the location using AL11?
    Can anyone send the procedure for the above?
    Also can anyone send me documents on "SAP to LEGACY system Interfacing"
    to my mail id      [email protected]
    Thanks and Regards

    Hi Ashok,
    Please check this link.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/2a1dd5d3-0801-0010-ed8d-bd797ed922cb
    open it as PDF.
    I think it will answer all your queries.
    Don't forget to award points if you get the answer. :-)
    Regards,
    Vinay
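    As a rough additional illustration (not taken from the linked document): if you just need the content of a single IDoc as a flat file on your local PC, you can also read its data records from table EDID4 and download them with cl_gui_frontend_services, independently of the file port. The IDoc number and the local path below are placeholders.

    " Sketch only - IDoc number and local path are placeholders.
    DATA: lt_edid4 TYPE STANDARD TABLE OF edid4,
          ls_edid4 TYPE edid4,
          lt_lines TYPE STANDARD TABLE OF string,
          lv_line  TYPE string,
          lv_file  TYPE string VALUE 'C:\temp\idoc_flat.txt'.

    SELECT * FROM edid4
      INTO TABLE lt_edid4
      WHERE docnum = '0000000000123456'
      ORDER BY PRIMARY KEY.

    LOOP AT lt_edid4 INTO ls_edid4.
      " one line per data record: segment name plus its flat content
      CONCATENATE ls_edid4-segnam ls_edid4-sdata INTO lv_line SEPARATED BY ','.
      APPEND lv_line TO lt_lines.
    ENDLOOP.

    CALL METHOD cl_gui_frontend_services=>gui_download
      EXPORTING
        filename = lv_file
      CHANGING
        data_tab = lt_lines
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      MESSAGE 'Download failed' TYPE 'I'.
    ENDIF.

    The file port approach described in the linked document is still the cleaner option when the files have to be produced automatically on the server side.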

  • How to read columns of a Microsoft Excel .csv file with LabVIEW?

    This should be simple.  Any ideas on how to read an Excel file with csv extension using LabVIEW?
    Thanks!

    Use the Read From Spreadsheet File VI, which you can find on the File I/O palette.
    "Read From Spreadsheet File
    Reads a specified number of lines or rows from a numeric text file beginning at a specified character offset and converts the data to a 2D, single-precision array of numbers. You optionally can transpose the array. The VI opens the file before reading from it and closes it afterwards.
    You can use this VI to read a spreadsheet file saved in text format. This VI calls the Spreadsheet String to Array function to convert the data."
    The CSV file is essentially a text file which uses commas as delimiters in the file's structure.

  • I am trying to export a Numbers spreadsheet to a csv file, but it does not put the commas in

    I am trying to export a Numbers spreadsheet to a csv file, but it does not put the commas in.  I want to use it with an HTML table generator tool, but the tool is looking for commas.   The Export to CSV exports it as a spreadsheet with all the formatting removed, and no commas.
    Here is the html table tool:
    http://www.textfixer.com/html/csv-convert-table.php

    Numbers '09 creates CSV files with comma-separated values if and only if your system is using a decimal period.
    If the system is using a decimal comma, the CSV files are created using a semicolon as the separator.
    Yvan KOENIG (VALLAURIS, France)  dimanche 11 décembre 2011 11:11:25
    iMac 21”5, i7, 2.8 GHz, 12 Gbytes, 1 Tbytes, mac OS X 10.6.8 and 10.7.2
    My iDisk is : <http://public.me.com/koenigyvan>
    Please : Search for questions similar to your own before submitting them to the community

  • Reload Idoc from archive file

    Hi,
    I know the program RSEXARCL, which will reload IDocs from an archive file.
    Is there any SAP program to reload a particular outbound IDoc from an archive file by giving the IDoc number on the selection screen?
    After reloading it to the database, we need to send this IDoc to an external system.
    Kindly help.
    Thanks,
    Anil

    Hi,
    Please check the below link:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/17c435df-0901-0010-72a7-b9dcd271213d?QuickLink=index&overridelayout=true&5003637692468
    Regards,
    Rajesh

  • Read .csv file to XI and convert to IDOC

    Hi XI  Gurus,
    We have a requirement at work where we need to import .csv files into XI, which XI then converts to XML and posts as an IDoc to SAP. I have tried creating this in XI with the File Adapter with File Content Conversion.
    I provided everything required for File Content Conversion, but my .csv file is still not being picked up and read by XI. I am not sure why this is happening; can anyone please help me with this?
    Also please do let me know if anyone has a tutorial related to this scenario.
    Thanks,
    Mayuresh.

    Hi All,
    I tried checking the error via the Runtime Workbench (RWB), going to Communication Channel Monitoring. It shows that there are some errors in the Communication Channel, but when I go inside it does not open an error window for me containing the error details.
    Just wanted to explain my file structure and the settings i have done in the 'File Content Conversion' of the 'File Adapter' so that you guys can tell me where i am wrong:
    Message Type: VENDOR_MT
    VENDOR_MT
        VendorNumber
        LastName
        SearchTerm
        Currency
      Address
          Street
          City
          Zip
          Country
    I am using IDOC receiver at the other end (CREMAS.CREMAS03).
    CSV File Structure: (Sample)
    34567, Reed, Jon, AUD, Smith Street, Melbourne, 3066, AU
    The settings that i have used in the File Adapter (Content Conversion tab) are:
    RecordSet Structure: VendorNumber, LastName, SearchTerm, Currency, Street, City, Zip, Country
    RecordSet Sequence: Ascending
    ignoreRecordSetName: true
    Vendor_MT.fieldNames: VendorNumber, LastName, SearchTerm, Currency
    Vendor_MT.keyFieldValue: 01
    Vendor_MT.fieldSeperator: ','
    Vendor_MT.endSeperator: nl
    Address.fieldNames: Street, City, Zip, Country
    Address.keyFieldValue: 02
    Address.fieldSeperator: ','
    Address.endSeperator: nl
    Please do let me know if the communication channel settings I have made are correct. Also let me know if I am missing some major setting.
    Thanks,
    Mayuresh.
