How to process NAST records

Hi all,
    Can anyone tell me how to process NAST records with dispatch time 3 (send periodically with own transaction) and with dispatch time 2?
Thanks in advance.
Vinod

I think you can process them using program RSNAST00.
Regards
MD

Similar Messages

  • How to process each record in a derived table created using a CTE in SQL Server

    I want to process each row from the CTE table I created. How can I traverse from the first row to the second row, and so on?

    Ideally you would do set-based processing rather than traversing row by row, as that is more efficient. To answer specifically for your scenario we may need more info. Can you explain your exact requirement with some sample data?
    Visakh
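    To illustrate the set-based advice above, here is a minimal sketch using Python's built-in sqlite3 (the table, columns, and data are made up for illustration): a single UPDATE touches every qualifying row at once, instead of fetching and updating row by row in application code.

    ```python
    import sqlite3

    # Hypothetical table standing in for the CTE's result set.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE money (id INTEGER PRIMARY KEY, subst_model TEXT, grade TEXT)")
    conn.executemany(
        "INSERT INTO money (id, subst_model, grade) VALUES (?, ?, ?)",
        [(1, "NA", None), (2, "X100", None), (3, "N/A", None)],
    )

    # Set-based: one statement flags every qualifying row, no cursor loop needed.
    conn.execute("UPDATE money SET grade = 'ERROR' WHERE UPPER(subst_model) IN ('NA', 'N/A')")

    flagged = [r[0] for r in conn.execute("SELECT id FROM money WHERE grade = 'ERROR' ORDER BY id")]
    print(flagged)  # [1, 3]
    ```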

  • How to process the next record in Oracle PL/SQL

    Hi,
    I am processing the record set below with the help of BULK COLLECT in an Oracle PL/SQL procedure. While processing, I check whether the model is one that need not be substituted. If it is 'NA' or 'N/A', I need to process the next record (marked in bold in the code snippet).
    Please guide me on how to do this.
    TYPE t_get_money IS TABLE OF c_get_money%ROWTYPE INDEX BY BINARY_INTEGER;
    TYPE t_flag IS TABLE OF VARCHAR2(1) INDEX BY BINARY_INTEGER;
    TYPE t_text IS TABLE OF VARCHAR2(100) INDEX BY BINARY_INTEGER;
    L_money t_get_money;
    L_subst_model VARCHAR2(40);
    L_Notify_Manager t_flag;
    L_grade t_text;
    L_Error_Message t_text;
    BEGIN
    OPEN c_get_money;
    FETCH c_get_money BULK COLLECT INTO L_money;
    CLOSE c_get_money;
    FOR i IN 1 .. L_money.COUNT LOOP
    -- check if the model is one that need not be substituted
    IF UPPER(L_money(i).subst_model) IN ('N/A', 'NA')
    THEN
    L_Notify_Manager(i) := 'Y';
    L_grade(i) := 'ERROR';
    L_Error_Message(i) := 'substitute Model is not N/A or NA';
    -------Here I want to process the NEXT RECORD--------
    END IF;
    END LOOP;
    END;

    One solution for versions below 11g (which lack the CONTINUE statement):
    DECLARE
         TYPE t_get_money IS TABLE OF c_get_money%ROWTYPE
                                       INDEX BY BINARY_INTEGER;
         TYPE t_flag IS TABLE OF VARCHAR2 (1) INDEX BY BINARY_INTEGER;
         TYPE t_text IS TABLE OF VARCHAR2 (100) INDEX BY BINARY_INTEGER;
         L_money              t_get_money;
         L_subst_model        VARCHAR2 (40);
         L_Notify_Manager     t_flag;
         L_grade              t_text;
         L_Error_Message      t_text;
    BEGIN
         OPEN c_get_money;
         FETCH c_get_money
         BULK COLLECT INTO L_money;
         CLOSE c_get_money;
         FOR i IN 1 .. L_money.COUNT LOOP
              IF UPPER (L_money (i).subst_model) IN ('N/A', 'NA') THEN
                   GOTO nextrecord;
              END IF;
              L_Notify_Manager (i) := 'Y';
              L_grade (i)          := 'ERROR';
              L_Error_Message (i)  := 'substitute Model is not N/A or NA';
            <<nextrecord>>
              NULL;
         END LOOP;
    END;

    One solution for 11gR1 and above:
    DECLARE
         TYPE t_get_money IS TABLE OF c_get_money%ROWTYPE
                                       INDEX BY BINARY_INTEGER;
         TYPE t_flag IS TABLE OF VARCHAR2 (1) INDEX BY BINARY_INTEGER;
         TYPE t_text IS TABLE OF VARCHAR2 (100) INDEX BY BINARY_INTEGER;
         L_money              t_get_money;
         L_subst_model        VARCHAR2 (40);
         L_Notify_Manager     t_flag;
         L_grade              t_text;
         L_Error_Message      t_text;
    BEGIN
         OPEN c_get_money;
         FETCH c_get_money
         BULK COLLECT INTO L_money;
         CLOSE c_get_money;
         FOR i IN 1 .. L_money.COUNT LOOP
              IF UPPER (L_money (i).subst_model) IN ('N/A', 'NA') THEN
                   CONTINUE;
              END IF;
              L_Notify_Manager (i) := 'Y';
              L_grade (i)          := 'ERROR';
              L_Error_Message (i)  := 'substitute Model is not N/A or NA';
         END LOOP;
    END;

  • How to process millions of records

    Hi ,
    How would you process 50 million records without running out of the allotted background time in BDC? Is there any other process for doing this? Please help.
    Moderator message: too vague, help not possible, please describe problems in all technical detail when posting again.
    [Asking Good Questions in the Forums to get Good Answers|/people/rob.burbank/blog/2010/05/12/asking-good-questions-in-the-forums-to-get-good-answers]
    Edited by: Thomas Zloch on Dec 9, 2010 9:40 AM

    Hi,
    I am not sure, but please check the link below; it might be useful for you.
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c0d1e9c4-bfb4-2c10-76b4-d5e2912b83be
    Thanks,
    Jaten Sangal

  • Sender JMS Content Conversion - How to process multiple records

    Hi All,
    I use a Sender JMS Channel with Content Conversion.
    My message structure is like this
    <root>
        <rec>    </rec>
        <rec>    </rec>
    </root>
    I have a fixed-length flat file with multiple records.
    I have given the parameters FixedFieldLength, FieldNames and StructureTitle.
    Which parameter do I need to use to specify the RecordDelimiter?
    My input file will have more than one record.
    my input file -
    xxxx
    yyyy
    If I don't specify any delimiter value in the module parameters, then for each new line of the file a new message is created:
    <root>
      <rec>xxxx</rec>
    </root>
    <root>
      <rec>yyyy</rec>
    </root>
    But i want the output to be like this
    <root>
    <rec>xxxx</rec>
    <rec>yyyy</rec>
    </root>

    Hi,
    You can do the FCC for your sender JMS channel by going through page 5 of this document:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/50061bd9-e56e-2910-3495-c5faa652b710

  • How to trace erroneous records in a mapping or a process flow?

    Hi All,
    I read the following document by Rittman.
    http://www.rittman.net/work_stuff/tracing_owb_mappings_pt1.htm
    I am using Oracle Warehouse Builder 10G R1.
    But I am not sure it solves my problem. My problem scenario is as follows.
    I would like to know how to trace records that are valid per the business rules but are not counted in the output due to functional errors, as follows.
    For example, a variable contains the value Region = "R01".
    As per the rule, we need to retrieve the number 01.
    I implemented this as to_number(substr(Region, 2)).
    Unfortunately, in one record the field data was "RRR".
    Applying the logic to it returns an error/warning,
    so this record is not counted in the output.
    I would like to trace these types of records into a table or a file while executing the mapping.
    Is this possible using Oracle Warehouse Builder or Oracle?
    When dealing with an external table we can create a log or bad file, which holds all bad records by default. Is there any way to do this in a mapping?
    Has anyone implemented this kind of tracing file containing all the bad records?
    Any suggestions are welcome.
    Thank you,
    Regards,
    Gowtham Sen.

    Hi,
    I have never used this before, but I know that inside the mapping configuration, in the table operators, there is a property where you can specify the exceptions table name under constraints management. Alternatively, you might add an additional field to your target table, add a CASE expression, and mark each record as valid or invalid. You can then select which records are invalid.
    Take a look at this thread: Some Thoughts On An OWB Performance/Testing Framework
    Cheers,
    Ricardo
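    The trap-and-log pattern discussed above (keep valid rows, capture the ones that fail the conversion rule) can be sketched generically in Python; the region rule is taken from the question, everything else here is a made-up illustration:

    ```python
    # Rows that fail the to_number(substr(region, 2)) rule are captured
    # in a bad-record list instead of silently dropping out of the output.
    rows = [{"id": 1, "region": "R01"}, {"id": 2, "region": "RRR"}, {"id": 3, "region": "R07"}]

    good, bad = [], []
    for row in rows:
        try:
            row["region_no"] = int(row["region"][1:])  # to_number(substr(region, 2))
            good.append(row)
        except ValueError as exc:
            bad.append({"id": row["id"], "value": row["region"], "error": str(exc)})

    print(len(good), len(bad))  # 2 1
    ```

    In OWB terms, the bad list plays the role of the exceptions table or bad file: nothing is lost, and the failures can be inspected after the run.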

  • How to edit the records in the error stack

    Hi Experts,
    I have error records in the error stack, and the remaining records were loaded successfully. My doubt is how to edit the records in the error stack, because it is not giving me the edit option.
    To get the edit option, do I need to delete the request in the target, or what?
    There are also two more targets below this process.
    Advance thanks.
    Regards
    SAP

    Hi,
    If you have a small number of records in this request (the one you extracted now), delete that request from all the targets and reload again with the option "Valid records update, reporting not possible".
    This is the recommended option.
    Follow the below 2 docs as well:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/80dd67b9-caa0-2e10-bc95-c644cd119f46?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/007f1167-3e64-2e10-a798-e1ea456ef21b?QuickLink=index&overridelayout=true
    Regards,
    Venkatesh
    Edited by: Venkateswarlu Nandimandalam on Jan 24, 2012 4:35 PM

  • How to process records in a table with multiple threads using PL/SQL & Java

    I have a table containing millions of records, and the number of records keeps increasing because a high-speed process populates the table.
    I want to process this table using multiple Java threads, but the condition is that each record should be processed only once, by any one thread. After processing, I need to delete that record from the table.
    Here is what I am thinking: I will put the code to process the records in a PL/SQL procedure and call it from multiple Java threads to make the processing concurrent.
    Java Thread.1 }
    Java Thread.2 }
    .....................} -------------> PL/SQL Procedure to process and delete Records ------> <<<Table >>>
    Java Thread.n }
    But the problem is: how can I prevent a record from being picked up by another thread while it is being processed (so it is not processed multiple times)?
    I am very familiar with PL/SQL; the only issue I am facing is how to fetch/process/delete each record exactly once.
    I can change the structure of table to add any new column if needed.
    Thanks in advance.
    Edited by: abhisheak123 on Aug 2, 2009 11:29 PM

    Check if you can use bucket logic in your PL/SQL code.
    By "bucket" I mean making multiple buckets of the data to be processed, so that each bucket contains different rows, and then calling the PL/SQL procedure in parallel.
    Let's say there is a column create_date and a column processed_flag in your table.
    Your PL/SQL code should take 2 parameters, start_date and end_date.
    Now if you want to process data, say, between 01-Jan and 06-Jan, a wrapper program should first create 6 buckets, each of one day, and then call the PL/SQL proc in parallel for these 6 different buckets.
    Regards
    Arun
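    The wrapper's bucket logic can be sketched in Python; process_bucket here is a hypothetical stand-in for the PL/SQL procedure call with its (start_date, end_date) parameters:

    ```python
    from datetime import date, timedelta
    from concurrent.futures import ThreadPoolExecutor

    def make_buckets(start, end):
        """Split [start, end] into one-day (start_date, end_date) buckets,
        as the wrapper program described above would do."""
        days = (end - start).days + 1
        return [(start + timedelta(days=i), start + timedelta(days=i)) for i in range(days)]

    def process_bucket(bucket):
        # Stand-in for calling the PL/SQL proc with this bucket's date range.
        lo, hi = bucket
        return f"processed {lo} .. {hi}"

    buckets = make_buckets(date(2024, 1, 1), date(2024, 1, 6))
    # Each worker gets a disjoint bucket, so no row is processed twice.
    with ThreadPoolExecutor(max_workers=3) as pool:
        results = list(pool.map(process_bucket, buckets))

    print(len(buckets))  # 6
    ```

    Because the buckets are disjoint by construction, no two workers can touch the same row, which is what prevents double processing in this scheme.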

  • How to process a PDF file in CloverETL

    Hi,
    I want to process a PDF document in the CloverETL data integrator. I created a sample project, created an ETL graph with a UniversalDataReader, and imported a PDF file. When opening the metadata information it shows an encoding data format and invalid delimiter, and while running there is an error in the console.
    Please assist me with how to process a PDF file with an unstructured data format.
    I am getting below the error,
    ERROR [WatchDog] - Graph execution finished with error
    ERROR [WatchDog] - Node DATA_READER0 finished with status: ERROR caused by: Parsing error: Unexpected record delimiter, probably record has too few fields. in field # 1 of record # 2, value: '<Raw record data is not available, please turn on verbose mode.>'
    ERROR [WatchDog] - Node DATA_READER0 error details:
    org.jetel.exception.BadDataFormatException: Parsing error: Unexpected record delimiter, probably record has too few fields. in field # 1 of record # 2, value: '<Raw record data is not available, please turn on verbose mode.>'
         at org.jetel.data.parser.DataParser.parsingErrorFound(DataParser.java:527)
         at org.jetel.data.parser.DataParser.parseNext(DataParser.java:437)
         at org.jetel.data.parser.DataParser.getNext(DataParser.java:168)
         at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:415)
         at org.jetel.component.DataReader.execute(DataReader.java:261)
         at org.jetel.graph.Node.run(Node.java:425)
         at java.lang.Thread.run(Thread.java:619)
    Can anyone please help me?
    Thanks
    Rajini C
    Edited by: 954486 on Sep 19, 2012 11:19 PM

    There is a separate forum for the BI/Information Discovery application of Endeca software: Endeca Information Discovery. You should post your message there.
    Thanks.
    Sean

  • How To Process an XML File

    Hi All
    Clasic ASP
    MySQL
    Win 2K Server
    I have an XML file that is uploaded from clients via a browser which contains job data. Once uploaded I need to process it into a MySQL DB using classic ASP.
    The file contains 3 lots of elements under the main job element; these are:
    <job_detail>
    <engineers>
    <materials>
    The upload and save aspect I can do no problem. What I need to do is read the file and process each job detail record, each engineer detail record and each materials record into a database, something along the lines of:
    Upload file from client
    Save file to disk
    Open XML file
    Select all job records
    Loop through and process job records
    Select all engineer records
    Loop through and process engineer records
    Select all material records
    Loop through and process material records
    How do I select the job detail, engineer and materials records into, say, an array or a recordset? (If someone feels there is a better way then please feel free to suggest it.)
    An example of my XML file is below.
    TIA
    Bren
    <?xml version="1.0" ?>
    <jobs>
    <job_detail>
    <j_recid>4041</j_recid>
    <j_numofvisits>1</j_numofvisits>
    <j_client>800</j_client>
    <j_site>864</j_site>
    <j_workdone />
    <j_suppliervisit>N</j_suppliervisit>
    <j_furtherwork>N</j_furtherwork>
    <j_returnvisit>N</j_returnvisit>
    <j_furtherworksdesc />
    <j_tobequoted>N</j_tobequoted>
    <j_status>6</j_status>
    <j_enteredby>Yolanda Baker</j_enteredby>
    <j_e_signature>53,29,-1,-1,53,29,53,30,53,30,58,33,58,33,75,45,75,45,110,58,110,58,149,61 ,149,61,176,54,176,54,186,42,186,42,191,33,191,33,192,30,192,30,192,29,192,29,189,29,189,2 9,184,28,184,28,174,29,174,29,159,32,159,32,124,39,124,39,82,46,82,46,54,50,54,50,43,50,43 ,50,38,50,38,50,37,49,37,49,38,49,38,49,39,49,39,49,57,49,57,49,109,54,109,54,180,56,180,5 6,239,54,239,54,277,46,277,46,281,45,281,45,281,44</j_e_signature>
    <j_e_name>Test Engineer</j_e_name>
    <j_e_position>Engineer</j_e_position>
    <j_e_comments />
    <j_e_sigdate>20/11/2008</j_e_sigdate>
    <j_c_signature>54,7,-1,-1,54,7,54,8,54,8,54,10,54,10,57,19,57,19,65,30,65,30,81,46,81,46, 100,58,100,58,121,62,121,62,144,63,144,63,163,57,163,57,181,47,181,47,190,41,190,41,200,36 ,200,36,207,33,207,33,207,31,207,31,201,29,201,29,197,27,197,27,193,27,193,27,184,28,184,2 8,172,31,172,31,160,34,160,34,146,40,146,40,140,43,140,43,138,45,138,45,137,45,137,45,136, 45,136,45,135,46,135,46,131,50,131,50,127,56,127,56,124,58,124,58,124,57,124,57,124,56,124 ,56,132,54,132,54,151,50,151,50,178,44,178,44,204,39,204,39,220,39,220,39,228,40,228,40,23 0,41,230,41,233,44,233,44,237,49,237,49,246,57,246,57,254,60,254,60,263,63,263,63,264,63</ j_c_signature>
    <j_c_name>Mr Blobby</j_c_name>
    <j_c_position>Fat Chap</j_c_position>
    <j_c_comments />
    <j_c_sigdate>20/11/2008</j_c_sigdate>
    <j_qos>3</j_qos>
    <j_notified>N</j_notified>
    <j_processed>N</j_processed>
    <j_active>Y</j_active>
    </job_detail>
    <job_detail>
    <j_recid>4042</j_recid>
    <j_numofvisits>1</j_numofvisits>
    <j_client>798</j_client>
    <j_site>865</j_site>
    <j_workdone>As per job sheet</j_workdone>
    <j_suppliervisit>N</j_suppliervisit>
    <j_furtherwork>N</j_furtherwork>
    <j_returnvisit>N</j_returnvisit>
    <j_furtherworksdesc />
    <j_tobequoted>N</j_tobequoted>
    <j_status>6</j_status>
    <j_enteredby>Yolanda Baker</j_enteredby>
    <j_e_signature>44,20,-1,-1,44,20,43,20,43,20,42,20,42,20,39,21,39,21,34,25,34,25,28,29,28 ,29,25,32,25,32,22,38,22,38,22,41,22,41,22,43,22,43,22,44,22,44,26,44,26,44,38,45,38,45,64 ,42,64,42,104,36,104,36,135,33,135,33,165,31,165,31,178,31,178,31,185,33,185,33,189,35,189 ,35,192,38,192,38,197,41,197,41,203,43,203,43,211,49,211,49,220,51,220,51,228,52,228,52,23 1,53,231,53,232,49,232,49,233,43,233,43,230,34,230,34,229,31,229,31,224,27,224,27,213,24,2 13,24,195,23,195,23,171,27,171,27,152,32,152,32,141,36,141,36,138,36,138,36,137,37,137,37, 140,38,140,38,157,43,157,43,191,46,191,46,230,48,230,48,242,48</j_e_signature>
    <j_e_name>Test Engineer</j_e_name>
    <j_e_position>Engineer</j_e_position>
    <j_e_comments>Great job</j_e_comments>
    <j_e_sigdate>20/11/2008</j_e_sigdate>
    <j_c_signature>31,13,-1,-1,31,13,31,14,31,14,31,16,31,16,34,23,34,23,36,32,36,32,38,42,38 ,42,42,53,42,53,43,59,43,59,44,60,44,60,44,59,44,59,44,57,44,57,45,52,45,52,50,44,50,44,56 ,38,56,38,64,33,64,33,74,28,74,28,86,23,86,23,98,21,98,21,111,20,111,20,121,20,121,20,123, 20,123,20,123,21,123,21,126,24,126,24,126,27,126,27,130,33,130,33,133,37,133,37,135,42,135 ,42,139,47,139,47,142,51,142,51,144,54,144,54,151,57,151,57,157,57,157,57,164,52,164,52,17 1,46,171,46,180,42,180,42,193,38,193,38,203,38,203,38,215,38,215,38,228,38,228,38,238,39,2 38,39,245,39,245,39,247,39,247,39,248,38</j_c_signature>
    <j_c_name>Mr Blobby 2</j_c_name>
    <j_c_position>Fat Chap 2</j_c_position>
    <j_c_comments>Very well done</j_c_comments>
    <j_c_sigdate>20/11/2008</j_c_sigdate>
    <j_qos>1</j_qos>
    <j_notified>N</j_notified>
    <j_processed>N</j_processed>
    <j_active>Y</j_active>
    </job_detail>
    <engineers>
    <m_recid>1</m_recid>
    <m_jid>4041</m_jid>
    <m_operative>18</m_operative>
    <m_starttime>07:00</m_starttime>
    <m_finishtime>17:00</m_finishtime>
    <m_traveltime>1</m_traveltime>
    <m_jobdate>20/11/2008</m_jobdate>
    </engineers>
    <engineers>
    <m_recid>2</m_recid>
    <m_jid>4041</m_jid>
    <m_operative>3</m_operative>
    <m_starttime>07:00</m_starttime>
    <m_finishtime>17:00</m_finishtime>
    <m_traveltime>1</m_traveltime>
    <m_jobdate>20/11/2008</m_jobdate>
    </engineers>
    <engineers>
    <m_recid>3</m_recid>
    <m_jid>4042</m_jid>
    <m_operative>3</m_operative>
    <m_starttime>07:00</m_starttime>
    <m_finishtime>17:00</m_finishtime>
    <m_traveltime>1</m_traveltime>
    <m_jobdate>20/11/2008</m_jobdate>
    </engineers>
    <engineers>
    <m_recid>4</m_recid>
    <m_jid>4042</m_jid>
    <m_operative>25</m_operative>
    <m_starttime>09:00</m_starttime>
    <m_finishtime>17:00</m_finishtime>
    <m_traveltime>1</m_traveltime>
    <m_jobdate>20/11/2008</m_jobdate>
    </engineers>
    <engineers>
    <m_recid>5</m_recid>
    <m_jid>4042</m_jid>
    <m_operative>8</m_operative>
    <m_starttime>07:00</m_starttime>
    <m_finishtime>17:00</m_finishtime>
    <m_traveltime>1</m_traveltime>
    <m_jobdate>20/11/2008</m_jobdate>
    </engineers>
    <materials>
    <mu_recid>1</mu_recid>
    <mu_jid>4041</mu_jid>
    <mu_qty>1</mu_qty>
    <mu_description>Widget</mu_description>
    <mu_location>Van</mu_location>
    </materials>
    <materials>
    <mu_recid>2</mu_recid>
    <mu_jid>4042</mu_jid>
    <mu_qty>20</mu_qty>
    <mu_description>2.5 T & E</mu_description>
    <mu_location>Van</mu_location>
    </materials>
    </jobs>

    And there was much rejoicing. Yayyy! :-)
    I have sorted it; if anyone is interested then see my code below. The code below writes to the screen for testing purposes. To insert into a DB, just build your SQL statement and Insert or Update the DB as you normally would, instead of writing to the screen.
    Code
    <%
    Dim objXML
    Set objXML = Server.CreateObject("Microsoft.XMLDOM")
    objXML.async = False
    objXML.load(Server.MapPath("/mwsclient/files/upload/Test_Engineer.xml"))
    If objXML.parseError.errorCode <> 0 Then
    Response.Write(objXML.parseError.reason)
    Else
    Set objNodeList = objXML.documentElement.selectNodes("job_detail")
    For i = 0 To (objNodeList.length - 1)
    Response.Write("JOB RECORD<br>")
    For x = 0 To (objNodeList.Item(i).childNodes.length - 1)
    Response.Write(objNodeList.Item(i).childNodes(x).nodeName & " - ")
    Response.Write(objNodeList.Item(i).childNodes(x).text & "<br>")
    Next
    Response.Write("<br>")
    Next
    End If
    Set objXML = Nothing
    %>
    Output:
    JOB RECORD
    j_recid - 4041
    j_numofvisits - 1
    j_client - 800
    j_site - 864
    j_workdone -
    j_suppliervisit - N
    j_furtherwork - N
    j_returnvisit - N
    j_furtherworksdesc -
    j_tobequoted - N
    j_status - 6
    j_enteredby - Yolanda Baker
    j_e_signature - 44
    j_e_name - Test Engineer
    j_e_position - Engineer
    j_e_comments -
    j_e_sigdate - 20/11/2008
    j_c_signature - 63
    j_c_name - Mr Blobby
    j_c_position - Fat Chap
    j_c_comments -
    j_c_sigdate - 20/11/2008
    j_qos - 3
    j_notified - N
    j_processed - N
    j_active - Y
    JOB RECORD
    j_recid - 4042
    j_numofvisits - 1
    j_client - 798
    j_site - 865
    j_workdone - As per job sheet
    j_suppliervisit - N
    j_furtherwork - N
    j_returnvisit - N
    j_furtherworksdesc -
    j_tobequoted - N
    j_status - 6
    j_enteredby - Yolanda Baker
    j_e_signature - 48
    j_e_name - Test Engineer
    j_e_position - Engineer
    j_e_comments - Great job
    j_e_sigdate - 20/11/2008
    j_c_signature - 38
    j_c_name - Mr Blobby 2
    j_c_position - Fat Chap 2
    j_c_comments - Very well done
    j_c_sigdate - 20/11/2008
    j_qos - 1
    j_notified - N
    j_processed - N
    j_active - Y
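    For reference, the same traversal can be sketched outside ASP with Python's standard xml.etree module; the inline document below is a trimmed stand-in for the jobs file above:

    ```python
    import xml.etree.ElementTree as ET

    # Minimal stand-in for the uploaded jobs file; the real file has more
    # fields, but the traversal is the same as the ASP loop above.
    doc = """<jobs>
      <job_detail><j_recid>4041</j_recid><j_client>800</j_client></job_detail>
      <job_detail><j_recid>4042</j_recid><j_client>798</j_client></job_detail>
      <engineers><m_recid>1</m_recid><m_jid>4041</m_jid></engineers>
      <materials><mu_recid>1</mu_recid><mu_jid>4041</mu_jid></materials>
    </jobs>"""

    root = ET.fromstring(doc)
    # One dict per record, keyed by tag name, for each of the three node lists.
    jobs = [{child.tag: child.text for child in rec} for rec in root.findall("job_detail")]
    engineers = [{child.tag: child.text for child in rec} for rec in root.findall("engineers")]
    materials = [{child.tag: child.text for child in rec} for rec in root.findall("materials")]

    print(jobs[0]["j_recid"])  # 4041
    ```

    From here, each dict can be turned into an INSERT statement, exactly as the ASP version does with its node lists.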

  • How to process a received IDoc in SAP R/3? What is to be done?

    Hi All,
    I am working on a file-to-IDoc scenario.
    The IDoc is received into SAP R/3, but how do I process the IDoc data so that it is stored in the SAP R/3 database?
    Clearly: how do I process the received IDoc data in SAP R/3? (This is for an inbound IDoc.)
    I hope someone can help me with the processing steps.
    Waiting for valuable inputs from the experts.
    Regards
    Rakesh

    Rakesh,
    Check this Sample IDoc document: http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cdded790-0201-0010-6db8-beb9bb2b2660
    Normally it is processed based on the IDoc type: an IDoc carrying master data will create the appropriate master records, and one based on a transaction will create a transaction document.
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/d19fe210-0d01-0010-4094-a6fba344e098

  • How to update old records of LIPS using user exit MV50AFZ1

    To All Experts,
    I have used user exit MV50AFZ1, and in this user exit I updated the field:
    USEREXIT_MOVE_FIELD_TO_LIPS.
    LIPS-ETENR = VBEP-ETENR.
    It is working fine for new deliveries in transaction VL01N, but what about old records in the LIPS table?
    How do I update the old records? Please guide me.
    Yusuf

    Hi Yusuf,
    See SAP Note 415716 (User exits in delivery processing); it explains how these user exits work and the cautions you must take.
    Regards
    Eduardo

  • How to process data in the past based on data in the present

    hello guys,
    I have a problem in my LabVIEW program: how do I process data in the past based on data in the present?
    I have a self-organizing maps formula.
    The formula computes a value for each neuron index and searches for the smallest one. The results of the calculation are D1=2, D2=5, D3=17, so the smallest value is 2, which comes from the weight [2 2] in the attached file.
    That weight then goes into another formula:
    [2 2] + 0.5 ( [1 1] - [2 2] ) = [1.5 1.5]
    and the weight matrix becomes
    [1.5 2 2]
    [1.5 3 5]
    I would appreciate any input/help on solving this.
    thanks
    Attachments:
    dika.vi ‏16 KB
    weight.txt ‏1 KB
    data .txt ‏1 KB
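    For what it's worth, the weight update described in the question can be checked with a short Python snippet (learning rate 0.5, winning weight [2, 2], input sample [1, 1], all taken from the question):

    ```python
    # Self-organizing map weight update: the winning weight moves toward
    # the input sample by a fraction given by the learning rate.
    def update_weight(weight, sample, rate=0.5):
        return [w + rate * (x - w) for w, x in zip(weight, sample)]

    new_weight = update_weight([2, 2], [1, 1])
    print(new_weight)  # [1.5, 1.5]
    ```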

    Hi Ronny Hanks,
    Moving your records from an internal table into a database table depends on the scenario:
    1. If you use INSERT statement.
    INSERT <database_table> FROM TABLE <internal_table>.
    But in this case, you need to make sure that you don't have any duplicate entries in your internal table that violate the primary key of the database table, or else you will get a dump.
    INSERT <database_table> FROM TABLE <internal_table> ACCEPTING DUPLICATE KEYS.
    In this case, lines that would produce duplicate keys are discarded instead of raising a dump, and sy-subrc is set to 4; the duplicates are simply not inserted.
    2. If you use UPDATE statement.
    UPDATE <database_table> FROM TABLE <internal_table>.
    This will update the existing records in your database table from the internal table.
    3. If you use MODIFY statement.
    MODIFY <database_table> FROM TABLE <internal_table>.
    This statement works as a combination of the INSERT and UPDATE statements.
    Existing records (in the database table) will be updated, and new records (not currently in the database table) will be inserted into the database table.
    Hope this solves your problem.
    Thanks & Regards.
    Tarun Gambhir.
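    The MODIFY behaviour described above (update existing keys, insert new ones) corresponds to a SQL upsert; here is a minimal sketch using Python's built-in sqlite3 with a hypothetical table:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE ztab (matnr TEXT PRIMARY KEY, qty INTEGER)")
    conn.execute("INSERT INTO ztab VALUES ('M1', 10)")

    # MODIFY-like semantics: existing key M1 is updated, new key M2 inserted.
    internal_table = [("M1", 99), ("M2", 5)]
    conn.executemany("INSERT OR REPLACE INTO ztab VALUES (?, ?)", internal_table)

    rows = sorted(conn.execute("SELECT matnr, qty FROM ztab"))
    print(rows)  # [('M1', 99), ('M2', 5)]
    ```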

  • Trap error within loop and process next record

    Hi,
    I am processing each record inside a loop. Now, if any exception occurs while processing a single record within the loop, I want to continue with the next record, with a proper error message in the log.
    How do I achieve this? Shall I create a savepoint and, whenever any error occurs inside the loop, roll back to that savepoint? Once that is done, will it process the next record automatically?
    Thanks in advance for your reply.
    Thanks,
    Mrinmoy

    Relational databases are about sets.
    They are not about files and records.
    Processing records in a loop will make your code slow, and you should avoid this strategy.
    That said:
    Simply enclose the code in its own begin/end block.
    begin
    <your code>
    exception
    when <your exception> then
    <process the exception without re-raising it>
    end;
    No savepoints required.
    Sybrand Bakker
    Senior Oracle DBA
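    The same per-record trap can be sketched in Python; the try/except inside the loop plays the role of the nested begin/exception/end block (data and failure condition are made up):

    ```python
    import logging

    logging.basicConfig(level=logging.ERROR)
    records = [10, 0, 5]  # the second record will fail
    processed, errors = [], 0

    for rec in records:
        # Trap the error for this record, log it, and continue with the next.
        try:
            processed.append(100 // rec)
        except ZeroDivisionError:
            errors += 1
            logging.error("record %r failed, continuing", rec)

    print(processed, errors)  # [10, 20] 1
    ```

    Because the handler does not re-raise, the loop simply proceeds to the next record, exactly as the nested PL/SQL block does.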

  • Processing Several Records in a CSV File

    Hello Experts!
    I'm currently using XI to process an incoming CSV file containing Accounts Payable information. The data from the CSV file is used to call a BAPI in the ECC (BAPI_ACC_DOCUMENT_POST). Return messages are written to a text file. I'm also using BPM. So far, I've been able to get everything to work: financial documents are successfully created in the ECC, and I also receive the success message in my return text file.
    I am, however, having one small problem. No matter how many records are in the CSV file, XI only processes the very first record. So my question is: why isn't XI processing all the records? Do I need a loop in my BPM? Are there occurrence settings that I'm missing? I figured XI would simply process each record in the file.
    Also, are there some good examples out there that show me how this is done?
    Thanks a lot, I more than appreciate any help!

    Matthew,
    First, let me explain the BPM steps:
    Recv ---> Transformation1 ---> For-Each Block ---> Transformation2 ---> Synch Call ---> Container (to append the response from the BAPI) ---> Transformation3 ---> Send
    Transformation3 and Send must be outside the Block.
    Transformation1
    Here, the source and target must be the same. I assume you know how to split messages; if not, see the example below.
    Source
    <MT_Input>
    <Records>
    <Field1>Value1</Field1>
    <Field2>Value1</Field2>
    </Records>
    <Records>
    <Field1>Value2</Field1>
    <Field2>Value2</Field2>
    </Records>
    <Records>
    <Field1>Value3</Field1>
    <Field3>Value3</Field3>
    </Records>
    </MT_Input>
    Now I need to split the message so that each Records node becomes its own message. What can I do?
    In Message Mapping, choose the same message type as source and target, and in the Messages tab choose the target occurrence as 0..unbounded.
    Now, if you go to the Mapping tab, you will see a Messages tag added to your structure, and the <MT_Input> occurrence changed to 0..unbounded.
    Here is the logic now
    Map Records to MT_Input.
    Map Constant (empty) to Records.
    Map the rest of the fields directly. Now your output looks like:
    <Messages>
    <Message1>
    <MT_Input>
    <Records>
    <Field1>Value1</Field1>
    <Field2>Value1</Field2>
    </Records>
    </MT_Input>
    <MT_Input>
    <Records>
    <Field1>Value2</Field1>
    <Field2>Value2</Field2>
    </Records>
    </MT_Input>
    <MT_Input>
    <Records>
    <Field1>Value3</Field1>
    <Field3>Value3</Field3>
    </Records>
    </MT_Input>
    </Message1>
    </Messages>
    raj.
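    The split described above can be sketched with Python's standard xml.etree module (the structure is taken from the example; the real XI mapping does this graphically, not in code):

    ```python
    import xml.etree.ElementTree as ET

    src = ET.fromstring(
        "<MT_Input>"
        "<Records><Field1>Value1</Field1></Records>"
        "<Records><Field1>Value2</Field1></Records>"
        "<Records><Field1>Value3</Field1></Records>"
        "</MT_Input>"
    )

    # One MT_Input per Records node, mirroring the 0..unbounded target occurrence.
    messages = []
    for rec in src.findall("Records"):
        msg = ET.Element("MT_Input")
        msg.append(rec)
        messages.append(ET.tostring(msg, encoding="unicode"))

    print(len(messages))  # 3
    ```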
