Splitting the IDoc data into 2 different files based on the content.

Hi Experts,
My scenario is SAP - XI - File. A particular IDoc, when triggered, has a Segment 4 with a field KONDART. This Segment 4 is optional in nature and can appear n number of times. Initially KONDART carried the value ABCD and created a file called PLU, containing 14 fields.
Now KONDART can carry ABCD as well as many other values.
The current requirement is: if the value of KONDART is ABCD, EFGH, etc., then it creates the PLU file as usual, but the moment the KONDART value is XYZ, it should create the PLU file as well as a new file called MRP, carrying only 4 fields.
I understand that I need to make a new DT, MT and MI. What I can't make out is this: if I call both the PLU and the MRP structure in the Message Mapping and map them accordingly, how do I make it conditional based on the content of KONDART?
Should I go for message splitting, where in the second message type the structure gets mapping data only if the KONDART value is XYZ?
Is my understanding OK?
I think I can check this in Message Mapping itself, calling 2 message types and putting a createIf kind of logic for the KONDART = XYZ condition.
Please correct me / comment.
Regards,
Arnab Mondal.

Hi,
If I have understood your problem correctly, then the solution can be like this:
First of all, create two separate message mappings as well as interface mappings (based on your different data types).
In ID, just do a conditional "Interface Determination", so whenever your condition(s) are true the respective interface mapping(s) will be triggered, and of course create two communication channels (CC) respectively.
Now, when you do the "Receiver Determination", assign the correct CC to the corresponding interface.
Regards,
Sarvesh
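
If you instead stay with the message-mapping approach described in the question (both the PLU and the MRP structure in one mapping, with a createIf-style condition on KONDART), the condition can come from a small user-defined function. The following is only a sketch under the usual XI graphical-mapping UDF conventions; the function name is illustrative:

// Sketch of a simple value UDF: feed KONDART into this function and pass the
// result to the standard createIf function on the MRP root node, so the MRP
// structure is created only when KONDART carries the value XYZ.
public String createMrp(String kondart, Container container) {
    return "XYZ".equals(kondart) ? "true" : "false";
}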

Similar Messages

  • Splitting the single record into multiple records based on validity

    Hi Gurus,
    I am a BI consultant with limited ABAP knowledge, so may I request your help on this ABAP task?
    I am working on the HR module, which is integrated with SAP BI. The reports are executed based on calendar month, and the requirement is to split a single record into multiple records based on the validity of the record. The HR data comes with a date-from and a date-to.
    Below is the logic:
    Check whether the start and end date of the record are in the same month and year.
    If yes, nothing changes.
    If no, create multiple records (see the sketch after the reply below):
    1st record: original start date of the record till the end of that month
    Following record: 1st of the next month till the last day of that month
    ...
    Last record: 1st of the month till the original end date
    All fields keep the same values; only the date-from and date-to fields change.
    Can anyone please provide me sample code to proceed with my task?
    Thanks and Regards,
    Venkat

    Hi,
    Using Rule group we can split it.
    Using Rule Group in SAP-BI  Part - 1
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/using%20rule%20group%20in%20sap-bi%20%20part%20-%201.pdf
    Thanks
    Reddy
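
    The question asks for ABAP, which is not reproduced here; as a language-neutral illustration of the month-splitting rule described above, here is a sketch in Java (class name, field names and dates are placeholders):

    import java.time.LocalDate;
    import java.time.YearMonth;
    import java.util.ArrayList;
    import java.util.List;

    public class ValiditySplitter {

        /** One validity slice; all other fields of the HR record stay unchanged. */
        record Slice(LocalDate dateFrom, LocalDate dateTo) {}

        /** Split [dateFrom, dateTo] into one slice per calendar month. */
        static List<Slice> splitByMonth(LocalDate dateFrom, LocalDate dateTo) {
            List<Slice> slices = new ArrayList<>();
            LocalDate start = dateFrom;
            while (YearMonth.from(start).isBefore(YearMonth.from(dateTo))) {
                LocalDate endOfMonth = YearMonth.from(start).atEndOfMonth();
                slices.add(new Slice(start, endOfMonth));   // current month's slice
                start = endOfMonth.plusDays(1);             // 1st of the next month
            }
            slices.add(new Slice(start, dateTo));           // last slice, up to the original end date
            return slices;
        }

        public static void main(String[] args) {
            // 15.01.2023 to 10.03.2023 -> three records: 15.01-31.01, 01.02-28.02, 01.03-10.03
            splitByMonth(LocalDate.of(2023, 1, 15), LocalDate.of(2023, 3, 10))
                    .forEach(System.out::println);
        }
    }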

  • How to copy the table data into an OpenOffice/Excel file

    Hello All ,
    I have a form which contains several pages. Each page contains some tables. How can we copy each table's data into OpenOffice/Excel files?
    I mean there will be several Excel files, each file corresponding to a particular table in the form. I am using LiveCycle Designer ES 2 on Windows XP.
    Thanks
    Bibhu.

    Hi Bibhu,
    Here is a screenshot of exporting the form data to XML:
    I have not imported XML into Excel, but I know that Excel 2007 has tools for this in the Developer tab. I don't have Excel 2007 on this machine, so I can't get screenshots. However, using Acrobat you can merge the XML file into a spreadsheet, using the same menu Forms > Manage Form Data > Merge Data Files into Spreadsheet.
    In addition, you can set up a data connection to the spreadsheet. However, this will only work in a Windows environment and you need to share the OLEDB data connection with all users. Also, if you Reader-enable the form with Acrobat, the data connection will not work.
    So on balance, if you want to get data to a spreadsheet, I would recommend exporting XML.
    Hope that helps,
    Niall

  • Split the incoming data into multiple grouped output records

    I have three fields in the source: Student ID, Student Name and Student Marks. I need to map the details in the destination by grouping the data on the basis of the marks obtained. Each time there is a new mark, the corresponding name and student
    ID are saved under a new mark group that is created. How do I go about it when there are n number of new marks?

    For your scenario I used the below XML as input:
    <ns0:Students xmlns:ns0="http://BTSTempProj.StudentDetailsIn">
      <Student>
        <StudentID>StudentID_0</StudentID>
        <StudnetName>StudnetName_0</StudnetName>
        <StudentMarks>10</StudentMarks>
      </Student>
      <Student>
        <StudentID>StudentID_0</StudentID>
        <StudnetName>StudnetName_0</StudnetName>
        <StudentMarks>20</StudentMarks>
      </Student>
      <Student>
        <StudentID>StudentID_0</StudentID>
        <StudnetName>StudnetName_0</StudnetName>
        <StudentMarks>10</StudentMarks>
      </Student> 
      <Student>
        <StudentID>StudentID_0</StudentID>
        <StudnetName>StudnetName_0</StudnetName>
        <StudentMarks>10</StudentMarks>
      </Student>
      <Student>
        <StudentID>StudentID_0</StudentID>
        <StudnetName>StudnetName_0</StudnetName>
        <StudentMarks>20</StudentMarks>
      </Student>
      <Student>
        <StudentID>StudentID_0</StudentID>
        <StudnetName>StudnetName_0</StudnetName>
        <StudentMarks>30</StudentMarks>
      </Student>   
    </ns0:Students>
    And here is the output; hope this is what you are looking for:
    <ns0:Students xmlns:ns0="http://BTSTempProj.StudentDetailsOut">
      <StudentMarks>10</StudentMarks>
      <Student>
        <StudentName>StudnetName_0</StudentName>
        <StudentID>StudentID_0</StudentID>
        <StudentName>StudnetName_0</StudentName>
        <StudentID>StudentID_0</StudentID>
        <StudentName>StudnetName_0</StudentName>
        <StudentID>StudentID_0</StudentID>
      </Student>
      <StudentMarks>20</StudentMarks>
      <Student>
        <StudentName>StudnetName_0</StudentName>
        <StudentID>StudentID_0</StudentID>
        <StudentName>StudnetName_0</StudentName>
        <StudentID>StudentID_0</StudentID>
      </Student>
      <StudentMarks>30</StudentMarks>
      <Student>
        <StudentName>StudnetName_0</StudentName>
        <StudentID>StudentID_0</StudentID>
      </Student>
    </ns0:Students>
    Please find below the XSLT you can use; it is basically based on the Muenchian grouping suggested by Ashwin:
    <?xml version="1.0" encoding="UTF-16"?>
    <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt" xmlns:var="http://schemas.microsoft.com/BizTalk/2003/var" exclude-result-prefixes="msxsl var s0" version="1.0"
    xmlns:ns0="http://BTSTempProj.StudentDetailsOut" xmlns:s0="http://BTSTempProj.StudentDetailsIn">
      <xsl:output omit-xml-declaration="yes" method="xml" version="1.0" />
      <xsl:template match="/">
        <xsl:apply-templates select="/s0:Students" />
      </xsl:template>
      <xsl:key name="groups" match="Student" use="StudentMarks"/>
      <xsl:template match="/s0:Students">
        <ns0:Students>
          <xsl:for-each select="Student[generate-id(.)=generate-id(key('groups',StudentMarks))]">
            <xsl:sort select="StudentMarks" order="ascending"/>
            <StudentMarks>
              <xsl:value-of select="StudentMarks/text()"/>
            </StudentMarks>    
            <Student>    
              <xsl:for-each select="key('groups',StudentMarks)">
                <StudentName>
                  <xsl:value-of select="StudnetName/text()"/>
                </StudentName>
                <StudentID>
                  <xsl:value-of select="StudentID/text()"/>
                </StudentID> 
                </xsl:for-each>
            </Student>
          </xsl:for-each>        
        </ns0:Students>
      </xsl:template>
    </xsl:stylesheet>
    Regards, Amit More

  • How to split the IDocs based on document number change without BPM

    Hi all,
    Thanks for giving the response.
    Scenario: File to IDoc.
    Problem 1: How to split the IDocs based on a document number change in the source file, without BPM. My file contains document numbers like
    20000092
    20000092
    20000092
    50000050
    50000050
    50000065
    I want 3 IDocs in the target system, i.e. 1 for 20000092, 20000092, 20000092
                                              2 for 50000050, 50000050
                                              3 for 50000065
    By using external definitions I am getting 6 IDocs instead of 3.
    Problem 2: Are there any changes/modifications in the Directory when we are using external definitions?
    Could you please provide me the step-by-step process (Repository/Directory) for using external definitions.
    Thanks in advance.
    Regards,
    KP

    Hi,
    For this there is no need of BPM.
    You can think of the IDoc bundling concept to achieve this; you just need to use an external definition to change the IDoc occurrence:
    /people/michal.krawczyk2/blog/2005/12/04/xi-idoc-bundling--the-trick-with-the-occurance-change
    To achieve one IDoc per document number, you can write a small user-defined function in the mapping; with context handling you can achieve this (see the sketch below).
    For this, e.g.:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/6bd6f69a-0701-0010-a88b-adbb6ee89b34
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/877c0d53-0801-0010-3bb0-e38d5ecd352c
    Regards,
    Moorthy
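
    As an illustration of the UDF-with-context-handling idea above, a queue-style user-defined function over the document number field can emit each value once per change, and that output can then drive the target IDoc node so that consecutive equal document numbers fall into the same IDoc. This is only a sketch; the exact UDF signature depends on your XI/PI release:

    // Sketch of a queue UDF: the input is all document numbers of the file in one
    // context; the output contains each document number only once per change, so
    // the target IDoc node is created once per group (3 times for the sample data).
    public void oncePerDocNumber(String[] docNumbers, ResultList result, Container container) {
        String previous = null;
        for (String current : docNumbers) {
            if (!current.equals(previous)) {
                result.addValue(current);
            }
            previous = current;
        }
    }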

  • Trying to save data with a Write To Measurement File VI using an NI PXI-1042Q in real-time mode; it is not working, but when I run it without deploying it to the PXI it saves to the file

    Hi, I am trying to save data with a Write To Measurement File VI using an NI PXI-1042Q and a DAQ NI PXI-6229 in real-time mode, but it is not working; however, when I run it without deploying it to the PXI it saves to the file. Please find my VI attached.
    Attachments:
    PWMs.vi ‏130 KB

    The other problem is that the DAQmx channel only works in real-time mode, not in a stand-alone VI, using LabVIEW 8.2 and Real-Time 8.2.

  • How to download the script data into a PDF file

    How to download the script data into a PDF file?
    I have one option to download the script data to a PDF file: the RSTXPDFT4 program.
    My doubt is how to use this program, or whether there is any function module to download the script data to a PDF file.
    Thanks and regards,
    Sri.

    Hi Sri Sai,
    I know one method to convert the SAPscript output to a PDF file:
    First generate a spool request for the required SAPscript.
    Then go to transaction SP01 and copy the generated spool request number.
    Now execute the SAP report RSTXPDFT4.
    Enter the copied spool request number and the target directory into the parameters.
    Execute the report.
    The required PDF file will be generated in the target directory.
    I hope it will help you out.
    Please refer this simple program:
    http://www.sapdevelopment.co.uk/reporting/rep_spooltopdf.htm
    Reward points if found helpful....
    Cheers,
    Eshwar.

  • RE: How to export the table data into a PDF file in ADF

    Hi Experts,
    I am using Jdeveloper 11.1.2.3.0
    I have created an employee VO and dragged and dropped it as a table on a page. Now I need to export the table data into a PDF file.
    So please give me some suggestions regarding this scenario.
    With Regards,
    satish

    Hi Guys,
    Any more answers for this question?
    Please find my jsff below:
    <?xml version='1.0' encoding='UTF-8'?>
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1" xmlns:af="http://xmlns.oracle.com/adf/faces/rich"
              xmlns:f="http://java.sun.com/jsf/core" xmlns:report="http://www.adfwithejb.blogspot.com">
      <af:panelGroupLayout layout="vertical" id="pgl2">
          <af:query id="qryId1" headerText="Service Tariff Mapping Details" disclosed="true"
                    value="#{bindings.findByTarifValidFromQuery.queryDescriptor}"
                    model="#{bindings.findByTarifValidFromQuery.queryModel}"
                    queryListener="#{reportWiseInvoiceBean.genericQueryListener}"
                    queryOperationListener="#{bindings.findByTarifValidFromQuery.processQueryOperation}"
                    resultComponentId="pc1::t2">
         <f:attribute name="queryExpression" value="bindings.findByTarifValidFromQuery.processQuery"/>
                          </af:query>
        <af:panelCollection id="pc1" styleClass="AFStretchWidth">
          <f:facet name="menus"/>
          <f:facet name="toolbar">
              <af:toolbar id="t1">
                 <af:menuBar id="pt_m1">
                <report:reportDeclarative ButtonName="ExportToExcel" ReportName="ServiceTariffMappingDetails"
                                          ReportType="PDF" TableId=":::pc1:t2" id="rd1" Pagination="true"/>
                <af:commandButton text="excel" id="cb1" binding="#{exportToExcelBean.exportID}">
                <af:setActionListener from="pt1:pgl1:pgl2:pc1:t2" to="#{viewScope['exporter.exportedId']}"/>
                <af:setActionListener from="border:1px solid #cccccc" to="#{viewScope['exporter.thStyle']}"/>
                <af:setActionListener from="border:1px solid #cccccc" to="#{viewScope['exporter.tdStyle']}"/>
                <af:fileDownloadActionListener method="#{exportToExcelBean.exportToExcel}" filename="Service TariffMapping.xls"
                                                   contentType="text/excel;charset=UTF-8;"/>
                </af:commandButton>
                <af:commandMenuItem id="pt_cmi133" icon="/images/common/Excel-icon.png"
                                                shortDesc="ExportToExcel"
                                >
                                <af:exportCollectionActionListener exportedId="t2" type="excelHTML"
                                                                   title="Service Tariff Mapping"
                                                                   filename="Service Tariff Mapping.xls"/>
                            </af:commandMenuItem></af:menuBar>
              </af:toolbar>
          </f:facet>
          <f:facet name="statusbar"/>
          <af:table value="#{bindings.ServiceTariffMappingDtlsRVO1.collectionModel}" var="row"
                    rows="#{bindings.ServiceTariffMappingDtlsRVO1.rangeSize}"
                    emptyText="#{bindings.ServiceTariffMappingDtlsRVO1.viewable ? 'No data to display.' : 'Access Denied.'}"
                    fetchSize="#{bindings.ServiceTariffMappingDtlsRVO1.rangeSize}" rowBandingInterval="0"
                    filterModel="#{bindings.findByTarifValidFromQuery.queryDescriptor}"
                    queryListener="#{bindings.findByTarifValidFromQuery.processQuery}" filterVisible="true" varStatus="vs"
                    id="t2" columnStretching="last" binding="#{ServiceTariffMappBean.testTable}">
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.label}"
                       id="c1">
              <af:inputText value="#{row.bindings.NormalTariffCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NormalTariffCode.tooltip}" id="it1">
                <f:validator binding="#{row.bindings.NormalTariffCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.label}"
                       id="c2">
              <af:inputText value="#{row.bindings.ServiceCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceCode.tooltip}" id="it2">
                <f:validator binding="#{row.bindings.ServiceCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.label}" id="c3">
              <f:facet name="filter">
                <af:inputDate value="#{vs.filterCriteria.TrfVldFrm}" id="id1">
                  <af:convertDateTime pattern="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.format}"/>
                </af:inputDate>
              </f:facet>
              <af:inputDate value="#{row.bindings.TrfVldFrm.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.displayWidth}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.tooltip}" id="id2">
                <f:validator binding="#{row.bindings.TrfVldFrm.validator}"/>
                <af:convertDateTime pattern="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfVldFrm.format}"/>
              </af:inputDate>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.label}"
                       id="c4">
              <af:inputText value="#{row.bindings.ServiceDesc.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ServiceDesc.tooltip}" id="it3">
                <f:validator binding="#{row.bindings.ServiceDesc.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.label}" id="c5">
              <af:inputText value="#{row.bindings.OtTrfCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtTrfCode.tooltip}" id="it4">
                <f:validator binding="#{row.bindings.OtTrfCode.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.label}" id="c6">
              <af:inputText value="#{row.bindings.OtUnitRate.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.OtUnitRate.tooltip}" id="it5">
                <f:validator binding="#{row.bindings.OtUnitRate.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.label}" id="c7">
              <af:inputText value="#{row.bindings.NtUnitRate.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.NtUnitRate.tooltip}" id="it6">
                <f:validator binding="#{row.bindings.NtUnitRate.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.label}" id="c8">
              <af:inputText value="#{row.bindings.TrfGrt.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.TrfGrt.tooltip}" id="it7">
                <f:validator binding="#{row.bindings.TrfGrt.validator}"/>
              </af:inputText>
            </af:column>
            <af:column sortProperty="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.name}" filterable="true"
                       sortable="true" headerText="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.label}"
                       id="c9">
              <af:inputText value="#{row.bindings.ChargePartyCode.inputValue}"
                            label="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.label}"
                            required="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.mandatory}"
                            columns="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.displayWidth}"
                            maximumLength="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.precision}"
                            shortDesc="#{bindings.ServiceTariffMappingDtlsRVO1.hints.ChargePartyCode.tooltip}" id="it8">
                <f:validator binding="#{row.bindings.ChargePartyCode.validator}"/>
              </af:inputText>
            </af:column>
          </af:table>
        </af:panelCollection>
      </af:panelGroupLayout>
    </jsp:root>

  • To get the table data into a file in pl/sql developer

    Hi
    I have a table with 90k rows, and I want to get the table data into a file such as Excel...
    I executed the select statement and it shows only 5k rows because of a buffer problem.
    Can you please help me with an alternative way to get the table data into a file,
    using cursors or something like that?

    Really? excel for 90K rows :)
    face/desk/floor/"Hi and sorry neighbours below, it's me again"
    Err, I see your point, thanks Dang, I completely missed the 90k recs part, I must be getting old or modern ;)
    @Ramanjaneyulu,
    >
    can u please help me any alternative way to get the table data into a file.
    >
    You can always dump a query to a file (see also the sketch below), check these:
    http://tkyte.blogspot.com/2009/10/httpasktomoraclecomtkyteflat.html
    How to create Excel file (scroll down when necessary)
    http://forums.oracle.com/forums/search.jspa?threadID=&q=export+to+excel&objID=f137&dateRange=all&userID=&numResults=15&rankBy=10001
    Depending on your database version ( the result of: select * from v$version; ) you might have some other options.
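
    For completeness, dumping a query to a flat file can also be scripted outside of PL/SQL Developer so that no client buffer limit applies. A minimal JDBC sketch (connection URL, credentials and table name are placeholders, and the CSV handling is deliberately naive):

    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.ResultSetMetaData;
    import java.sql.Statement;

    public class DumpToCsv {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
                 Statement st = con.createStatement();
                 PrintWriter out = new PrintWriter("my_table.csv")) {
                st.setFetchSize(1000);                        // stream rows instead of buffering all 90k
                try (ResultSet rs = st.executeQuery("SELECT * FROM my_table")) {
                    ResultSetMetaData md = rs.getMetaData();
                    while (rs.next()) {
                        StringBuilder line = new StringBuilder();
                        for (int c = 1; c <= md.getColumnCount(); c++) {
                            if (c > 1) line.append(',');
                            line.append(rs.getString(c));     // no quoting/escaping of values
                        }
                        out.println(line);
                    }
                }
            }
        }
    }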

  • Split the IDoc into multiple IDocs if the IDoc has more than 500 records

    Hi All,
    I developed an outbound IDoc in which we are facing an issue.
    There is a limitation on the maximum IDoc size it can handle.
    If the number of records is more than 500, split the IDoc into multiple IDocs; e.g. if it has 1300 records, the result would be two IDocs with 500 records each, and the last one would have 300 records.
    How can I achieve this?
    Regards
    Jai

    Hi,
    1) First you need to know which message type/IDoc type you are triggering.
    2) Get the corresponding process code from the partner profiles (WE20/WE41).
    3) Then look for a proper user exit in the related processing FM.
    4) Write logic to split the IDoc accordingly (the chunking rule itself is sketched below).
    If no proper user exit is available, then copy the standard processing FM and redo all the ALE-related configuration.
    Catch hold of any ABAP expert in your team to do all this.
    Suresh
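
    For what it's worth, the actual split has to be implemented in the ABAP user exit or a copied processing FM, but the chunking rule itself is simple. A language-neutral sketch in Java (names are placeholders):

    import java.util.ArrayList;
    import java.util.List;

    public class IdocChunker {

        /** Split the record list into groups of at most maxPerIdoc records. */
        static <T> List<List<T>> chunk(List<T> records, int maxPerIdoc) {
            List<List<T>> idocs = new ArrayList<>();
            for (int i = 0; i < records.size(); i += maxPerIdoc) {
                idocs.add(records.subList(i, Math.min(i + maxPerIdoc, records.size())));
            }
            return idocs;   // 1300 records with maxPerIdoc = 500 -> chunks of 500, 500, 300
        }
    }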

  • java.io.NotSerializableException when overwriting the JTable data into a .txt file

    Hi everyone,
    This is my first time getting help from the Sun forums.
    I get java.io.NotSerializableException: java.lang.reflect.Constructor when overwriting the JTable data into a .txt file.
    At the beginning, the code runs successfully and the JTable is shown with the data that was previously saved in studio1.txt,
    but after I edit the data in the JTable and click the save button, the error appears and I cannot save the JTable with the latest data.
    After this error, the code can't be run again and I have to copy studio1.txt back again to let the code run one more time.
    I hope I can get a solution here; it would be very useful for me.
    The following is my code; some of it I created with the NetBeans GUI builder.
    But I don't know how to attach my .txt file to this forum.
    Does anyone need the .txt file?
    This is the code that I suspect contains the error:
    private void jButton1ActionPerformed(java.awt.event.ActionEvent evt) {
        String filename = "studio1.txt";
        try {
            FileOutputStream fos = new FileOutputStream(new File(filename));
            ObjectOutputStream oos = new ObjectOutputStream(fos);
            oos.writeObject(jTable2);   // serializing the whole JTable component fails here
            oos.close();
        } catch (IOException e) {
            System.out.println("Problem creating table file: " + e);
            return;
        }
        System.out.println("JTable correctly saved to file " + filename);
    }
    the full code will be at the next msg
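
    As a side note on the exception itself: serializing the whole JTable drags its listeners along (hence the java.lang.reflect.Constructor in the stack trace). A common alternative is to serialize only the table's data. A rough sketch, assuming the table uses a DefaultTableModel (class and method names are illustrative):

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import javax.swing.JTable;
    import javax.swing.table.DefaultTableModel;

    class TableDataSaver {
        // Write the table's data (a serializable Vector of row Vectors) instead of
        // the Swing component itself.
        static void saveTableData(JTable table, String filename) {
            DefaultTableModel model = (DefaultTableModel) table.getModel();
            try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(filename))) {
                oos.writeObject(model.getDataVector());
            } catch (IOException e) {
                System.out.println("Problem creating table file: " + e);
            }
        }
    }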

    This is part 1 of the code.
    This is the full code; I have commented out (/*...*/) some of it to make it easier to read.
    package gui;
    import javax.swing.*;
    import java.io.*;
    public class timetables extends javax.swing.JFrame {
        public timetables() {
            initComponents();
        @SuppressWarnings("unchecked")
        private void initComponents() {
            jDialog1 = new javax.swing.JDialog();
            buttonGroup1 = new javax.swing.ButtonGroup();
            buttonGroup2 = new javax.swing.ButtonGroup();
            buttonGroup3 = new javax.swing.ButtonGroup();
            buttonGroup4 = new javax.swing.ButtonGroup();
            jTextField1 = new javax.swing.JTextField();
            jLayeredPane1 = new javax.swing.JLayeredPane();
            jLabel6 = new javax.swing.JLabel();
            jTabbedPane1 = new javax.swing.JTabbedPane();
            jScrollPane3 = new javax.swing.JScrollPane();
            jTable2 = new javax.swing.JTable();
            jScrollPane4 = new javax.swing.JScrollPane();
            jTable3 = new javax.swing.JTable();
            jButton1 = new javax.swing.JButton();
            jButton2 = new javax.swing.JButton();
    /*       org.jdesktop.layout.GroupLayout jDialog1Layout = new org.jdesktop.layout.GroupLayout(jDialog1.getContentPane());
            jDialog1.getContentPane().setLayout(jDialog1Layout);
            jDialog1Layout.setHorizontalGroup(
                jDialog1Layout.createParallelGroup(org.jdesktop.layout.GroupLayout.LEADING)
                .add(0, 400, Short.MAX_VALUE)
            jDialog1Layout.setVerticalGroup(
                jDialog1Layout.createParallelGroup(org.jdesktop.layout.GroupLayout.LEADING)
                .add(0, 300, Short.MAX_VALUE)
            setDefaultCloseOperation(javax.swing.WindowConstants.EXIT_ON_CLOSE);
            jLayeredPane1.add(jLabel6, javax.swing.JLayeredPane.DEFAULT_LAYER);
            String filename1 = "studio1.txt";
            try {
                   ObjectInputStream ois = new ObjectInputStream(new FileInputStream(filename1));
                   jTable2 = (JTable) ois.readObject();
                   System.out.println("reading for " + filename1);
              catch(Exception e) {
                   System.out.println("Problem reading back table from file: " + filename1);
                   return;
            try {
                   ObjectInputStream ois = new ObjectInputStream(new FileInputStream(filename1));
                   jTable3 = (JTable) ois.readObject();
                   System.out.println("reading for " + filename1);
              catch(Exception e) {
                   System.out.println("Problem reading back table from file: " + filename1);
                   return;
            jTable2.setRowHeight(20);
            jTable3.setRowHeight(20);
            jScrollPane3.setViewportView(jTable2);
            jScrollPane4.setViewportView(jTable3);
            jTable2.getColumnModel().getColumn(4).setResizable(false);
            jTable3.getColumnModel().getColumn(4).setResizable(false);
            jTabbedPane1.addTab("STUDIO 1", jScrollPane3);
            jTabbedPane1.addTab("STUDIO 2", jScrollPane4);
            jTextField1.setText("again n again");
            jLabel6.setText("jLabel5");
            jLabel6.setBounds(0, 0, -1, -1);
            jButton2.setText("jButton2");
            jButton1.setText("jButton1");
            jButton1.addActionListener(new java.awt.event.ActionListener() {
                public void actionPerformed(java.awt.event.ActionEvent evt) {
                    jButton1ActionPerformed(evt);
          

  • Split the incoming records into partitions

    Gurus,
    I have a table with around 17,000 records each day to get processed through Informatica. All these records are distinct in every attribute, and the status of the records will be 'PENDING_CLEAR'.
    My requirement is to read these records into 8 partitions (if possible, equal partitions) so that I can run 8 partitions in Informatica.
    I have the below options available in Informatica:
    1. Pass-through
    2. Key range and
    3. Database partitioning.
    Options 2 and 3 are not possible, since the database is not partitioned and I will not be able to provide the key ranges, since the primary key on the table is an auto-increment number.
    With pass-through I will be reading the same 17,000 records across all the pipelines, and it is a challenge to handle that in Informatica.
    Any suggestions for partitioning the records using SQL?
    Thanks a lot for your time.

    >
    Yes. But the way informatica has got options to read and process the record it is complicating more. I am trying to sort the issue related to informatica.
    >
    And that reinforces what I said above: you are likely 'misusing' Informatica if you have that sort of problem.
    A middle-tier application like Informatica does not provide a 'one size fits all' solution without making (in some cases serious) performance trade-offs.
    Such tools function best when they act as the 'middle man' among multiple databases. Need to do some basic data manipulation and source data from both Oracle and DB2? A tool can access both DBs with separate connections and help you merge or filter the data. The tool can also assist in performing some basic data transformations.
    Need to sort data and apply complex business rules? That work should be done in the database.
    The primary goal, when working with multiple database sources or targets, is to do AS MUCH work as possible in the DB; that is what the DB is designed for.
    Get the data INTO the DB as quickly and efficiently as possible and then do the work there. Mid-tier tools can be very effective at that. They can source data from multiple systems, do basic data cleansing and filtering to reject/fix/report 'dirty' data and they can consolidate multiple data streams to feed ONE target stream.
    Tools such as informatica use a proprietary language and syntax. DBs like Oracle and DB2 base their syntax on well-established ANSI standards. There is NO comparison.
    Versions of Informatica that I worked with didn't even allow you to access the code very easily. The code for individual business rules was locked into the objects it was attached to (some flexibility using maps and maplets). A developer has to 'open' the object in order to get access to the actual code that was being used.
    The PL/SQL code used in DBs is readily accessible and easy to access.
    The first question I would ask about your issue is: can this work (processing each record) be done in the database? If the answer is yes, then that is most likely where the work should be done. The 'tool' should then be used to do as much 'preprocessing' of the data as possible and then get the cleansed data into the database (into staging tables) as quickly as possible.
    If you insist on going the way you are headed you will need to add code to 'chunk' the data.
    See my reply in this thread for a description of how to use the NTILE function to do that (a minimal sketch of the idea also appears at the end of this reply):
    Re: 10g: parallel pipelined table func - distributing DISTINCT data sets
    And for more discussion of why you should NOT be going this way here is a thread from last year with a question very similar to yours also using Informatica
    Looking for suggestion on splitting the output  based on rowid/any other
    >
    I have below query.I need to split total rows pulled by the query into two halves by maintaining data accuracy.
    Eg if query returns 500 rows,i need a query which return 250 another with 250.
    Reason :
    I have a informatica ETL job which pull data by above query and updates into target.Run time is 2hrs
    In order to reduce the run time,using 2 pass through partitions which will run the query in two partitions and run time will be reduced to exact 1 hour.
    >
    Sound familiar? Read what the responders (including me) had to say.
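
    To make the NTILE suggestion above concrete, here is a hypothetical sketch (table and column names are invented) of the kind of query each of the 8 pass-through pipelines could run, shown here wrapped in JDBC:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class NtileBuckets {
        public static void main(String[] args) throws Exception {
            // NTILE(8) assigns each PENDING_CLEAR row to one of 8 roughly equal buckets;
            // each pipeline then filters on its own bucket number.
            String sql =
                "SELECT id, payload, bucket FROM ("
              + "  SELECT t.*, NTILE(8) OVER (ORDER BY t.id) AS bucket"
              + "  FROM my_table t WHERE t.status = 'PENDING_CLEAR'"
              + ") WHERE bucket = ?";
            try (Connection con = DriverManager.getConnection(
                         "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setInt(1, 1);                              // this reader handles bucket 1 of 8
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getInt("id") + " -> bucket " + rs.getInt("bucket"));
                    }
                }
            }
        }
    }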

  • How to download the Dynamic data into PPT format

    Hi Friends,
    I have one doubt on WDJ: how to download dynamic data into PPT format? For example, some dynamic data is available in a view, and in that view one download link or button is available. Clicking on the download link or button should download that data in PPT format.
    Is this possible in WDJ? If possible, please tell me.
    Or:
    How to create business graphics in Web Dynpro applications depending upon Excel data, and finally download the business graphics into a PowerPoint presentation?
    Thank you,
    Regards
    Vijay Kalluri
    Edited by: KalluriVijay on Mar 11, 2011 6:34 AM

    Hi Govindu,
    1. I have one doubt on WDJ: on clicking either a Submit button or a LinkToURL UI element, can we download a file in PPT format (Text.PPT)?
    I am using NWDS version 7.0.09 and Java version JDK 1.6.
    2. Is it possible to download the business graphics into the PPT by using Web Dynpro Java?
    Regards
    VijayK

  • How to save the form data into adobe db?

    Hi All,
    How to save the form data into adobe db?
    I have designed one xdp file.
    Through processFormSubmission(), I got the submitted form data as Document obj.
    Then I have called the workflow kickoff program.
    code:
    InvocationRequest request = myFactory.createInvocationRequest ("myprocessname", //Specify the long-lived process name
    "invoke", //Specify the operation name
    params, //Specify input values (HashMap obj)
    false); //Create an asynchronous request
    It successfully started the workflow, but the submitted form data is not saved anywhere.
    And also, how do I get the form data from the tables?
    Please provide the solution for the above.
    Thanks in advance.
    Regards,
    Saravanan G

    You need to create a process variable of type IN if you want to be able to pass data to your process. Then the params parameter (HashMap) contains a list of all the IN variables, with their content, that you want to pass to your process. The key is the name of the variable and the value is the content. That way you should get it in your process.
    Now LiveCycle will create a column in the database for every process variable, so the content will be saved in the database just by creating that process variable.
    Jasmin
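
    For illustration, filling the params map described above could look like the fragment below, reusing the createInvocationRequest call from the original post; the variable names formData (the submitted form-data Document) and inFormData (the IN process variable of type Document) are assumptions:

    import java.util.HashMap;

    // Sketch only: "myFactory" comes from the original post; "formData" stands for
    // the submitted Document object; "inFormData" is an assumed IN process variable.
    HashMap<String, Object> params = new HashMap<String, Object>();
    params.put("inFormData", formData);   // key = process variable name, value = its content

    InvocationRequest request = myFactory.createInvocationRequest(
            "myprocessname",   // long-lived process name
            "invoke",          // operation name
            params,            // IN variables passed to the process
            false);            // asynchronous request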

  • What is the smallest data structure record in a .TXT file record to be recognized as an Apple Address Book "Data Card"?

    Hello! This is my first time in this discussion group. The question posed is the subject line itself:
    What is the smallest data structure record in a .TXT file record to be recognized as an Apple Address Book "Data Card"?
    I'm lazy! As a math instructor with 40+ students per class per semester (pCpS), I would rather not have to create 40 data cards pCpS by hand, only to expunge that info at semester's end. My college's IS department can easily supply me with First name, Last name, and eMail address info, along with a myriad of other fields. I can manipulate those data on my end to create the necessary .TXT file, but I don't know the essential structure of that file.
    Can you help me?
    Thank you in advance.
    Bill

    Hello Bill, & welcome aboard!
    No idea what  pCpS is, sorry.
    To import a text file into Address Book, it needs to be a comma delimited .csv file, like...
    Customer Name,Company,Address1,Address2,City,State,Zip
    Customer 1,Company 1,2233 W Seventh Street,Unit 543,Seattle,WA,99099
    Customer 2,Company 2,1 Park Avenue,,New York,NY,10001
    Customer 3,Company 3,65 Loma Linda Parkway,,San Jose,CA,94321
    Customer 4,Company 4,89988 E 23rd Street,B720,Oakland,CA,99899
    Customer 5,Company 5,432 1st Avenue,,Seattle,WA,99876
    Customer 6,Company 6,76765 NE 92nd Street,,Seattle,WA,98009
    Customer 7,Company 7,8976 Poplar Street,,Coupeville,WA,98976
    Customer 8,Company 8,7677 4th Ave North,,Seattle,WA ,89876
    Customer 9,Company 9,4556 Fauntleroy Avenue,,West Seattle,WA,98987
    Customer 10,Company 10,4 Bell Street,,Cincinnati,OH,89987
    Customer 11,Company 11,4001 Beacon Ave North,,Seattle,WA,90887
    Customer 12,Company 12,63 Dehli Street,,Noida,India,898877-8879
    Customer 13,Company 13,63 Dehli Street,,Noida,India,898877-8879
    Customer 14,Company 14,63 Dehli Street,,Noida,India,898877-8879
    Customer 15,Company 15,4847 Spirit Lake Drive,,Bellevue,WA,98006
    Customer 16,Company 16,444 Clark Avenue,,West Seattle,WA,88989
    Customer 17,Company 17,6601 E Stallion,,Scottsdale,AZ,85254
    Customer 18,Company 18,801 N 34th Street,,Seattle,WA,98103
    Customer 19,Company 19,15925 SE 92nd,,Newcastle,WA,99898
    Customer 20,Company 20,3335 NW 220th,2nd Floor,Edmonds,WA,99890
    Customer 21,Company 21,444 E Greenway,,Scottsdale,AZ,85654
    Customer 22,Company 22,4 Railroad Drive,,Moclips,WA,98988
    Customer 23,Company 23,89887 E 64th,,Scottsdale,AZ,87877
    Customer 24,Company 24,15620 SE 43rd Street,,Bellevue,WA,98006
    Customer 25,Company 25,123 Smalltown,,Redmond,WA,98998
    Try Address Book Importer...
    http://www.sillybit.com/abee/
