Screen Field Data Change Error within Chain/End-Chain Request

Hello,
I am currently experiencing an issue that I cannot resolve. I have two fields in a CHAIN/ENDCHAIN block and need to perform some processing when an entry is made in either of them. For example, the two fields are amount and percent_change: when either is entered on the screen, the chain request correctly reaches the module that adjusts the other field. However, when the program returns to the chain, the adjusted field is reset to the value it held before the entry was made. Here is an example of the coding:
CHAIN.
  FIELD wa_projection-flag.
  FIELD wa_projection-nextclass.
  FIELD wa_projection-raisetype.
  FIELD wa_projection-rank.
  FIELD wa_projection-edate.
  FIELD wa_projection-percent_change
    MODULE calculate_amount ON REQUEST.
  FIELD wa_projection-amount
    MODULE calculate_percent ON REQUEST.
  MODULE group_proj_entr_modify ON CHAIN-REQUEST.
ENDCHAIN.
MODULE calculate_amount INPUT.
*** Calculate Percentage or amount depending on what was entered.
  IF    wa_projection-currsal > 0.
    wa_projection-amount = wa_projection-currsal * ( wa_projection-percent_change / 100 ).
  ENDIF.
ENDMODULE.               
Has anyone run into this before? If so, please advise how I can resolve this issue.
Thanks,
Jereme

Combine both CHAIN modules into one.
Like:
      FIELD: wa_projection-percent_change,
             wa_projection-amount
        MODULE calculate_amount_percent ON REQUEST.
Now, in your module calculate_amount_percent, you will have access to both fields, so you will be able to perform the calculation and update them.
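With a single FIELD: ... MODULE ... ON REQUEST statement, both screen fields are transported to the program before the module is called, so there is no later FIELD statement left in the chain to overwrite the value your module just calculated. A minimal sketch of such a combined module (only an illustration; the helper variables save_percent_change and save_amount are hypothetical copies saved during PBO so the module can tell which field the user actually changed):
MODULE calculate_amount_percent INPUT.
* Both fields have already been transported at this point.
* save_percent_change / save_amount are assumed to be filled in PBO.
  IF wa_projection-currsal > 0.
    IF wa_projection-percent_change <> save_percent_change.
*     Percentage was entered: derive the amount
      wa_projection-amount = wa_projection-currsal *
                             ( wa_projection-percent_change / 100 ).
    ELSEIF wa_projection-amount <> save_amount.
*     Amount was entered: derive the percentage
      wa_projection-percent_change = ( wa_projection-amount /
                                       wa_projection-currsal ) * 100.
    ENDIF.
  ENDIF.
ENDMODULE.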
Regards,
Naimesh Patel

Similar Messages

  • Screen Fields Not getting Updated with DYNP_VALUES_UPDATE

    Hi All,
    I am using FM DYNP_VALUES_READ to read the screen field values, then modifying those values and passing the modified table to DYNP_VALUES_UPDATE. The problem is that the screen fields are not populated with these values. The values I wish to populate are footer values. I am using OO ALV for this, handling an event for filtering records, and accordingly I need to modify my footer. Any suggestions?
    *&---------------------------------------------------------------------*
    *&      Form  footer
    *&---------------------------------------------------------------------*
    *       Calculate total items, number of hits/misses and percent MSP
    *----------------------------------------------------------------------*
    *      -->  p1        text
    *      <--  p2        text
    *----------------------------------------------------------------------*
    FORM footer .
      CLEAR : io_totitems,
              io_tothit,
              io_totmiss,
              io_permiss.
      IF NOT l_flag EQ 1.
    *   Get total items
        DESCRIBE TABLE i_outdata LINES io_totitems.
        LOOP AT i_outdata INTO wa_outdata.
    *     Calculate number of hits
          IF wa_outdata-hit_miss = 'HIT'.
            io_tothit = io_tothit + 1.
          ENDIF.
    *     Calculate number of misses
          IF wa_outdata-hit_miss = 'MISS'.
            io_totmiss = io_totmiss + 1.
          ENDIF.
          CLEAR wa_outdata.
        ENDLOOP.
    *   Calculate percent MSP
        io_permiss = ( ( io_totitems - io_totmiss ) / io_totitems ) * 100.
      ENDIF.
      IF l_flag EQ 1.
        wa_dynpread-fieldname = 'IO_TOTITEMS'.
        APPEND wa_dynpread TO t_dynpread.
        wa_dynpread-fieldname = 'IO_TOTHIT'.
        APPEND wa_dynpread TO t_dynpread.
        wa_dynpread-fieldname = 'IO_TOTMISS'.
        APPEND wa_dynpread TO t_dynpread.
        wa_dynpread-fieldname = 'IO_PERMISS'.
        APPEND wa_dynpread TO t_dynpread.
        CLEAR wa_dynpread.
        d020s-prog = sy-repid.
        d020s-dnum = sy-dynnr.
        CALL FUNCTION 'DYNP_VALUES_READ'
          EXPORTING
            dyname             = d020s-prog
            dynumb             = d020s-dnum
            translate_to_upper = 'X'
            request            = ' '
          TABLES
            dynpfields         = t_dynpread.
        MOVE io_totitems TO io_totitem.
        CONDENSE io_totitem NO-GAPS.
        MOVE io_tothit TO io_tothits.
        CONDENSE io_tothits NO-GAPS.
        MOVE io_totmiss TO io_totmis.
        CONDENSE io_totmis NO-GAPS.
        MOVE io_permiss TO io_permis.
        CONDENSE io_permis NO-GAPS.
        LOOP AT t_dynpread INTO wa_dynpread.
          CASE: wa_dynpread-fieldname.
            WHEN 'IO_TOTITEMS'.
              wa_dynpread-fieldvalue = io_totitem .
              MODIFY t_dynpread FROM wa_dynpread.
            WHEN 'IO_TOTHIT'.
              wa_dynpread-fieldvalue = io_tothits.
              MODIFY t_dynpread FROM wa_dynpread.
            WHEN 'IO_TOTMISS'.
              wa_dynpread-fieldvalue = io_totmis.
              MODIFY t_dynpread FROM wa_dynpread.
            WHEN 'IO_PERMISS'.
              wa_dynpread-fieldvalue =  io_permis .
              MODIFY t_dynpread FROM wa_dynpread.
          ENDCASE.
          CLEAR wa_dynpread.
        ENDLOOP.
        CALL FUNCTION 'DYNP_VALUES_UPDATE'
          EXPORTING
            dyname               = d020s-prog
            dynumb               = d020s-dnum
          TABLES
            dynpfields           = t_dynpread
          EXCEPTIONS
            invalid_abapworkarea = 1
            invalid_dynprofield  = 2
            invalid_dynproname   = 3
            invalid_dynpronummer = 4
            invalid_request      = 5
            no_fielddescription  = 6
            undefind_error       = 7
            OTHERS               = 8.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                  WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ENDIF.
      ENDIF.
    * Get all footer details in an internal table;
    * this is used to download the footer to Excel
      wa_footer-fldname = 'Total Items '.
      wa_footer-value = io_totitems.
      APPEND wa_footer TO i_footer.
      wa_footer-fldname = 'Number of HIT'.
      wa_footer-value = io_tothit.
      APPEND wa_footer TO i_footer.
      wa_footer-fldname = 'Number of MISS'.
      wa_footer-value = io_totmiss.
      APPEND wa_footer TO i_footer.
      wa_footer-fldname = 'Percentage MISS '.
      wa_footer-value = io_permiss.
      APPEND wa_footer TO i_footer.
      CLEAR wa_footer.
    ENDFORM.                    " footer

    data: t_dynfields1 type table of dynpread with header line.
    CALL FUNCTION 'DYNP_VALUES_READ'
      EXPORTING
        dyname             = d020s-prog
        dynumb             = d020s-dnum
        translate_to_upper = 'X'
        request            = ' '
      TABLES
        dynpfields         = t_dynfields1.
    READ TABLE t_dynfields1 INDEX 1. " I hope this is missing
    In my case I will be selecting from a list: once the value is read, the screen field needs to be updated with that value, and the selected value will be in the first row.
    In your case the operation is to populate the fields from some variables.
    In the same way, when I do this there is only one value, so I'm expecting only one entry here …
    LOOP AT t_dynpread INTO wa_dynpread.
    CASE: wa_dynpread-fieldname.
    WHEN 'IO_TOTITEMS'.
    wa_dynpread-fieldvalue = io_totitem .
    MODIFY t_dynpread FROM wa_dynpread.
    WHEN 'IO_TOTHIT'.
    wa_dynpread-fieldvalue = io_tothits.
    MODIFY t_dynpread FROM wa_dynpread.
    WHEN 'IO_TOTMISS'.
    wa_dynpread-fieldvalue = io_totmis.
    MODIFY t_dynpread FROM wa_dynpread.
    WHEN 'IO_PERMISS'.
    wa_dynpread-fieldvalue = io_permis .
    MODIFY t_dynpread FROM wa_dynpread.
    ENDCASE.
    CLEAR wa_dynpread.
    ENDLOOP.
    Comment this out for a while and change the code to:
    t_dynfields1-fieldname = ' '. "<--- header name
    t_dynfields1-fieldvalue = ' '."<----val  
    append t_dynfields1.
    t_dynfields1-fieldname = ' '.
    t_dynfields1-fieldvalue = ' '.
    append t_dynfields1.
    t_dynfields1-fieldname = ' '.
    t_dynfields1-fieldvalue = ' '.
    append t_dynfields1.
    And check.
    CALL FUNCTION 'DYNP_VALUES_UPDATE'
      EXPORTING
        dyname     = d020s-prog
        dynumb     = d020s-dnum
      TABLES
        dynpfields = t_dynfields1.
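    For example, filled with the footer fields from the form above (just a sketch; it reuses the condensed character variables io_totitem, io_tothits, io_totmis and io_permis and the t_dynfields1 table declared earlier), it could look like this:
    REFRESH t_dynfields1.
    " one entry per screen field to be updated, value already in display format
    t_dynfields1-fieldname  = 'IO_TOTITEMS'.
    t_dynfields1-fieldvalue = io_totitem.
    APPEND t_dynfields1.
    t_dynfields1-fieldname  = 'IO_TOTHIT'.
    t_dynfields1-fieldvalue = io_tothits.
    APPEND t_dynfields1.
    t_dynfields1-fieldname  = 'IO_TOTMISS'.
    t_dynfields1-fieldvalue = io_totmis.
    APPEND t_dynfields1.
    t_dynfields1-fieldname  = 'IO_PERMISS'.
    t_dynfields1-fieldvalue = io_permis.
    APPEND t_dynfields1.
    d020s-prog = sy-repid.
    d020s-dnum = sy-dynnr.
    CALL FUNCTION 'DYNP_VALUES_UPDATE'
      EXPORTING
        dyname     = d020s-prog
        dynumb     = d020s-dnum
      TABLES
        dynpfields = t_dynfields1.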
    regards,
    vijay

  • HR master data change export with Interface-Toolbox PU12

    Hello out there !
    I implemented an HR master data change export with PU12 at one of my customers.
    It works fine, with one little problem:
    it is only possible to export one period (the current or a selected one).
    Is there any possibility to export more than one period (including periods in the future)?
    The request is to also export master data with a validity start date in a future period.
    Thanks for any help.
    Greetings, Holger Mächtig

    Dear Holger,
    I am very curious about the problem you described last year. Is it possible to do a future export with PU12?
    I am now working with PU12 for the first time and I have set up an export file in PU12.
    But when I run an update without filling in the name of the file layout, I get the following error:
    "An error occurred when opening the "export file"
    When I fill in the file layout, I get the following errors:
    "An error occurred when writing to an export file" (Error Number E107).
    "File processing: end of file".
    When I do not run an update, the export file runs and seems right.
    Do you know what I am doing wrong?
    Thanks in advance!
    Kind regards,
    Yvette

  • Including an Event Data Change in a Process Chain

    There is a step called "Execution with Data Change in InfoProvider" in the process of including an event data change in a process chain.
    How do we carry this out? Should this be done in BEx or in the process chain itself?

    Dear Eshwari,
    1) To get the 'Execution on data change' option, you need a process chain with the 'Event data change' process type whose process variant contains the InfoProvider on which the query you are trying to broadcast is created.
    2) Furthermore, additional authorization is needed for the user to see such scheduling options in the broadcaster.
    Please assign this authorization object to the affected users:
    BEx Broadcasting Authorization to Schedule: S_RS_BCS
         Activity                        *
         Event ID in Broadcasting Frame  *
         Event Type in Broadcasting Fra  *   <== here you should select the value
         ID of a BI Reporting Object in  *
         Type of BI Reporting Object in  *
    3) Please also ensure the user has not only the Business Explorer role but also the Business Intelligence role in the Portal.
    Regards,
    Arvind

  • Re: Changes to screen field date was not updated

    Hi,
    I had a dialog screen showing a valid-from date of 10.10.2007. I changed it to 15.10.2007 and clicked on the other details tab. When I returned, it was still showing the old value.
    When I debugged it, after changing the date and hitting the Enter key, the new date was not transferred to the screen field; it still showed the old date.
    Why is that?
    I think this is why the old value is still displayed. How can I get the field updated into my internal table? It needs to be filled with the new value before I can update my internal table and re-display it.
    How do I get the new value into the screen field?

    Hi,
    As I understand it, this field is in a table control.
    In that case you must update the internal table in PAI, inside the LOOP ... ENDLOOP (see the fuller sketch below):
    PROCESS AFTER INPUT.
      LOOP AT itab.
        MODULE update_itab.  "Calling the module
      ENDLOOP.
    In the module pool:
    MODULE update_itab INPUT.
      MODIFY itab FROM wa INDEX tabcon-current_line.
    ENDMODULE.
    Note: here WA is the work area of the screen fields and TABCON is the table control variable.
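    For completeness, a minimal sketch of the round trip (only an illustration, assuming a table control TABCON, an internal table ITAB and a work area WA whose components are the screen fields of one row):
    Screen flow logic:
    PROCESS BEFORE OUTPUT.
      LOOP AT itab INTO wa WITH CONTROL tabcon.
      ENDLOOP.
    PROCESS AFTER INPUT.
      LOOP AT itab.
        MODULE update_itab.
      ENDLOOP.
    Module pool:
    MODULE update_itab INPUT.
      " WA now holds the values the user typed into the current row;
      " write them back into the internal table at that row index.
      MODIFY itab FROM wa INDEX tabcon-current_line.
    ENDMODULE.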
    Reward if helpful.

  • Changing the Label of the LDB - PNP Screen Field(Date field)

    Hi,
    I got a requirement to change the label of the field "Data Selection Period". This field comes from the PNP LDB.
    Please let me know whether there is any FM or other way to do this.

    You can customize the PNP selection screen by creating a report class; you will find the button in the report attributes.
    Or you can define your own selection view via SE80 and maintain table T599C.
    But I don't think you will be able to rename the field label.

  • Report data binding error with date values

    I have CF 7.02 with a Microsoft Visual FoxPro 9.0 SP1 database that I connect to using ODBC (FoxPro Driver 6.01.8630.01). I send my SQL results to a CF Report Builder 7.02 PDF report and it works fine. If I dump the date values before I change them, they look like the following: {ts '2004-12-20 00:00:00'}. However, I have tried a number of ways of manipulating the date before sending it to the report, but I continue to get errors. I don't care what format they go to the report in, since the report reformats them anyway. I checked to make sure that none of the dates were null.
    For example, <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDateTime(#ldCFPrcInputDate#) /> yields values in the {ts '2004-12-20 00:00:00'} format when I dump the query results to screen, where:
    ldCFPrcInputDate = 12/20/2004
    CreateODBCDateTime(ldCFPrcInputDate) = {ts '2004-12-20 00:00:00'}
    ReportQuery.PrcInputDate[lcCurRow] = {ts '2004-12-20 00:00:00'}
    I get the error:
    Report data binding error Unable to get value for field 'prcinputdate' of class 'java.util.Date'.
    coldfusion.runtime.OleDateTime -> Date
    Not using the CreateODBCDate function, for example <cfset ReportQuery.PrcInputDate[lcCurRow] = #ldCFPrcInputDate# />, I get:
    Report data binding error Unable to get value for field 'prcinputdate' of class 'java.util.Date'.
    java.lang.String -> Date
    Here are some of my failed attempts:
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDateTime(#ldCFPrcInputDate#) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = #ldCFPrcInputDate# />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = DateFormat(CreateODBCDate(#ldCFPrcInputDate#),'mm/dd/yyyy') />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate(DateFormat(#ldCFPrcInputDate#,'mm/dd/yyyy') />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate(DateFormat(#ldCFPrcInputDate#,'mm-dd-yyyy') />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = Trim(createODBCDateTime(ldtmpdate)) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate({05-07-2006}) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate('{05-07-2006}') />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate(05/07/2006) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate('05/07/2006')/>
    <cfset ReportQuery.PrcInputDate[lcCurRow] = CreateODBCDate(parseDateTime('05/07/2006')) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = parseDateTime(CreateODBCDateTime(05/07/2006)) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = parseDateTime(CreateODBCDateTime('#ldCFPrcInputDate#')) />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = '{05-07-2006}' />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = '{05/07/2006}' />
    <cfset ReportQuery.PrcInputDate[lcCurRow] = '05/07/2006' />
    I also did some googling without success. Any help is appreciated.

    You may want to make sure that the column is not included in the query variable list in your report cfr file.
    The newest version of the Report Builder, which may be installed on the server, no longer tolerates vars appearing in that list if they are not in the query itself.

  • RF Screen Field-Data Type

    Hi,
    Please let me know if the data type STRG (character string, variable length) is allowed for an RF screen.
    I want to use a field with a defined length of 1000 CHAR and a visible length of 1.
    Please let me know if you have some other solution as well.
    Regards,
    Rushikesh

    I'm not familiar with module pool programs and what is or isn't possible there,
    but I guess the transfer or retrieval of data is the same whether you have a selection screen or not.
    On a selection screen all data is gathered in a range; you can also create and fill a range yourself.
    Look at the RANGES data type for more info.
    Example:
    RANGES: h_arbid FOR crhd-objid OCCURS 0.
    You can now fill it:
    h_arbid-sign   = 'I'.
    h_arbid-option = 'BT'.   "BT means between, or EQ for a single value
    h_arbid-low    = 'start 1'.
    h_arbid-high   = 'start 10'.
    APPEND h_arbid.
    You can also add more intervals to the same selection:
    h_arbid-sign   = 'I'.
    h_arbid-option = 'BT'.   "BT means between, or EQ for a single value
    h_arbid-low    = 'start 20'.
    h_arbid-high   = 'start 30'.
    APPEND h_arbid.
    If you pass it on to your SELECT statement as
    SELECT * FROM table WHERE arbid IN h_arbid.
    you get all the records from 1 to 10 and 20 to 30.
    Kind regards,
    Arthur

  • Flat file data loading error using process chain

    Hi SAP Experts,
    I am having a problem loading flat file data to the cube using a process chain. The issue is that when I run the process chain, it fails with the message "Date format error, please enter the date in the format _.yyyy". I am using 0CALMONTH in the DataSource. The strange thing is that when I manually execute the InfoPackage, I don't get any errors and am able to load the same file successfully to the DSO and the cube. Is there any special setting for the process chain that I am missing?
    The date format in the flat file is mm/yyyy. I have tried all the options I could, including recreating the DataSource, but in vain; I don't see any success so far. Please help me solve this problem, as we are in the middle of the testing cycle.
    Thanking you all for your quick response and support all the time.
    Kind Regards,
    Sanjeev

    Hi Sanjeev,
    I believe you are opening the .csv file again after saving it. In this case the initial 0 in a single-digit month (say 02/2010) gets changed to 2/2010 and the resulting file is not readable by the system. Just do not open the file after you have entered the data in the .csv file and saved and closed it. If required, open the file in Notepad, but not in Excel, in case you want to re-check the data.
    Hope this helps.
    regards,
    biplab

  • Data load Error using Process chain

    Hi,
    In NW2004s, when I schedule delta loads for 2LIS_11_VAITM using process chains and there is a zero delta, the manage screen of the InfoCube correctly shows a green light with zero records, whereas the process chain log display shows red. The details of the process monitor show zero records. Is this right? I assume that even with a zero delta, the log should show green and all the subsequent processes such as activation of the ODS, deletion of the PSA and construction of indexes should be carried out. Please confirm. Is this a bug?
    Thanks.
    Param

    Hi Victor,
    Can you please advise how you solved this issue? We are having a similar issue where it works fine when loaded manually but throws a communication error when the process chain runs. Following is the error message:
    Communication error: call FM RSSDK_DATA_REMOTE_GET
    Thanks in advance for your help!!
    Sandeep

  • Report Data Binding Error with CFREPORT

    I'm getting this error on my production server but not my testing server when I try to print a report:
    Report data binding error Unknown column name : PK_RegistrationID.
    The thing is, the column PK_RegistrationID is not in the query I'm passing to the cfreport file, and it's not in the cfreport file itself. It is used in other cfreport files in my application. I'm wondering if this has anything to do with the version of cfreport on the production server.

    You may want to make sure that the column is not included in the query variable list in your report cfr file.
    The newest version of the Report Builder, which may be installed on the server, no longer tolerates vars appearing in that list if they are not in the query itself.

  • Bit stumped; data overflow error with DATETIME vs DATE or DATETIME2

    I find myself in a slightly perplexing situation. In trying to replicate data to a SQLServer 2008 database I have no problems doing so for a date column on the Oracle side to either a DATE or DATETIME2 datatype on the SQLServer side. However, upon trying a DATETIME column I'm given the errors below. Essentially a -2147217887 but Goldengate marks it as a data overflow error. The thing is, a datetime2 is more like a TIMESTAMP column in Oracle and the DATETIME is essentially a DATE. Why it would work with a DATE (less precise) or DATETIME2 (more precise) yet not a DATETIME (same precision) is a bit of a head scratcher. The same defs file is used for each of the options.
    Before anyone suggests using either destination datatype that works, I've no choice; it has to be a DATETIME column. The customer is always right, even when they are infuriatingly wrong.
    Anyone seen this before or have any suggestions?
    Thanks very much in advance!!
    Cheers,
    Chris
    trace
    10:55:36.538 (366244) * --- entering READ_EXTRACT_RECORD --- *
    10:55:36.538 (366244) exited READ_EXTRACT_RECORD (stat=0, seqno=-1, rba=-1156485006)
    10:55:36.538 (366244) processing record for QA1_DW_MS_MAY04.LIEN
    10:55:36.538 (366244) mapping record
    10:55:36.538 (366244) entering perform_sql_statements (normal)
    10:55:36.538 (366244) entering execute_statement (op_type=5,AWO_CUBE.LIEN)
    10:55:36.599 (366305) executed stmt (sql_err=-2147217887)
    10:55:36.599 (366305) exited perform_sql_statements (sql_err=-2147217887,recs output=6018)
    10:55:36.599 (366305) aborting grouped transaction
    10:55:36.619 (366325) aborted grouped transaction
    10:55:36.619 (366325) committing work
    10:55:36.619 (366325) Successfully committed transaction, status = 0
    10:55:36.619 (366325) work committed
    10:55:36.619 (366325) writing checkpoint
    10:55:36.619 (366325) * --- entering READ_EXTRACT_RECORD --- *
    10:55:36.619 (366325) exited READ_EXTRACT_RECORD (stat=400, seqno=-1, rba=-1156490736)
    ggserr.log:
    2012-06-02 10:55:36 WARNING OGG-00869 Oracle GoldenGate Delivery for ODBC, lien.prm: Parameter #: 1 Data Type: 129 DB Part: 5 Length: 9 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 2 Data Type: 129 DB Part: 5 Length: 9 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 3 Data Type: 129 DB Part: 7 Length: 5 Max Length: 128 Status: 8 Precision: 128 Scale: 0 Unavailable Parameter #: 4 Data Type: 129 DB Part: 5 Length: 6 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 5 Data Type: 129 DB Part: 7 Length: 8 Max Length: 56 Status: 8 Precision: 56 Scale: 0 Unavailable Parameter #: 6 Data Type: 129 DB Part: 5 Length: 6 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 7 Data Type: 129 DB Part: 7 Length: 9 Max Length: 128 Status: 8 Precision: 128 Scale: 0 Unavailable Parameter #: 8 Data Type: 129 DB Part: 7 Length: 8 Max Length: 15 Status: 8 Precision: 15 Scale: 0 Unavailable Parameter #: 9 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 8 Precision: 23 Scale: 3 Unavailable Parameter #: 10 Data Type: 129 DB Part: 5 Length: 5 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 11 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 6 Precision: 23 Scale: 3 Data Overflow Parameter #: 12 Data Type: 129 DB Part: 7 Length: 13 Max Length: 512 Status: 8 Precision: 0 Scale: 0 Unavailable Parameter #: 13 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 8 Precision: 23 Scale: 3 Unavailable Parameter #: 14 Data Type: 129 DB Part: 7 Length: 1 Max Length: 1 Status: 8 Precision: 1 Scale: 0 Unavailable Native Error: 0, 0 State: 0, 22007 Class: 0 Source: Line Number: 0 Description: Invalid date format.
    2012-06-02 10:55:36 WARNING OGG-01004 Oracle GoldenGate Delivery for ODBC, lien.prm: Aborted grouped transaction on 'AWO_CUBE.LIEN', Database error -2147217887 ([SQL error -2147217887 (0x80040e21)] Parameter #: 1 Data Type: 129 DB Part: 5 Length: 9 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 2 Data Type: 129 DB Part: 5 Length: 9 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 3 Data Type: 129 DB Part: 7 Length: 5 Max Length: 128 Status: 8 Precision: 128 Scale: 0 Unavailable Parameter #: 4 Data Type: 129 DB Part: 5 Length: 6 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 5 Data Type: 129 DB Part: 7 Length: 8 Max Length: 56 Status: 8 Precision: 56 Scale: 0 Unavailable Parameter #: 6 Data Type: 129 DB Part: 5 Length: 6 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 7 Data Type: 129 DB Part: 7 Length: 9 Max Length: 128 Status: 8 Precision: 128 Scale: 0 Unavailable Parameter #: 8 Data Type: 129 DB Part: 7 Length: 8 Max Length: 15 Status: 8 Precision: 15 Scale: 0 Unavailable Parameter #: 9 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 8 Precision: 23 Scale: 3 Unavailable Parameter #: 10 Data Type: 129 DB Part: 5 Length: 5 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable Parameter #: 11 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 6 Precision: 23 Scale: 3 Data Overflow Parameter #: 12 Data Type: 129 DB Part: 7 Length: 13 Max Length: 512 Status: 8 Precision: 0 Scale: 0 Unavailable Parameter #: 13 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 8 Precision: 23 Scale: 3 Unavailable Parameter #: 14 Data Type: 129 DB Part: 7 Length: 1 Max Length: 1 Status: 8 Precision: 1 Scale: 0 Unavailable Native Error: 0, 0 State: 0, 22007 Class: 0 Source: Line Number: 0 Description: Invalid date format ).
    report:
    2012-06-02 10:55:36 WARNING OGG-01004 Aborted grouped transaction on 'AWO_CUBE.LIEN', Database error -2147217887 ([SQL error -2147217887 (0x80040e21)]
    Parameter #: 1 Data Type: 129 DB Part: 5 Length: 9 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable
    Parameter #: 2 Data Type: 129 DB Part: 5 Length: 9 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable
    Parameter #: 3 Data Type: 129 DB Part: 7 Length: 5 Max Length: 128 Status: 8 Precision: 128 Scale: 0 Unavailable
    Parameter #: 4 Data Type: 129 DB Part: 5 Length: 6 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable
    Parameter #: 5 Data Type: 129 DB Part: 7 Length: 8 Max Length: 56 Status: 8 Precision: 56 Scale: 0 Unavailable
    Parameter #: 6 Data Type: 129 DB Part: 5 Length: 6 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable
    Parameter #: 7 Data Type: 129 DB Part: 7 Length: 9 Max Length: 128 Status: 8 Precision: 128 Scale: 0 Unavailable
    Parameter #: 8 Data Type: 129 DB Part: 7 Length: 8 Max Length: 15 Status: 8 Precision: 15 Scale: 0 Unavailable
    Parameter #: 9 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 8 Precision: 23 Scale: 3 Unavailable
    Parameter #: 10 Data Type: 129 DB Part: 5 Length: 5 Max Length: 21 Status: 8 Precision: 20 Scale: 0 Unavailable
    Parameter #: 11 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 6 Precision: 23 Scale: 3 Data Overflow
    Parameter #: 12 Data Type: 129 DB Part: 7 Length: 13 Max Length: 512 Status: 8 Precision: 0 Scale: 0 Unavailable
    Parameter #: 13 Data Type: 129 DB Part: 5 Length: 23 Max Length: 29 Status: 8 Precision: 23 Scale: 3 Unavailable
    Parameter #: 14 Data Type: 129 DB Part: 7 Length: 1 Max Length: 1 Status: 8 Precision: 1 Scale: 0 Unavailable
    Native Error: 0, 0
    State: 0, 22007
    Class: 0
    Source: Line Number: 0
    Description: Invalid date format
    Edited by: chris.baron on Jun 3, 2012 10:36 AM

    Not sure if this helps at all...
    Datetime Pairs in Oracle BI (OBIEE) - Days, Hours, Minutes, Seconds
    http://www.kpipartners.com/blog/bid/83328/Datetime-Pairs-in-Oracle-BI-OBIEE-Days-Hours-Minutes-Seconds
    UPDATE: Sorry... didn't see this was for GoldenGate.
    Edited by: 829166 on Jun 22, 2012 7:36 AM

  • Z field data to be transferred from SO to Purchase Requisition

    Hi,
    I am facing one problem.
    I have created one Z field in the Sales Order and have created the same field in the Purchase Requisition.
    Now how can I transfer the Z field data from the SO to the PR? What is the procedure and configuration I have to do?
    Please confirm.
    Regards,
    Vicky

    Hi, please let me know which user exit I should apply for transferring the Z field data from the SO to the PR.
    This is a third-party process and we have added a Z field, but we need to transfer its data from the SO to the PR.
    Which user exit should be used to perform this?

  • Help needed please -Master data loading- error with text loading.

    Hi,
    I am S. Sushma. I am getting an error while loading master data from a flat file (attributes and texts).
    I have loaded the attribute data successfully. But when I monitor the text data load, I get the following errors:
    "SET THE LANGUAGE FIELD IN THE SOURCE SYSTEM - LONG TEXT."
    "RECORD 1: SRPID_1: DATA RECORD 1 (EN S): DUPLICATE DATA RECORD."
    "ERROR 4 IN UPDATE"
    I took two flat files, one for attributes and one for texts.
    SRPID is the key field (sales representative ID).
    Attributes: SRPID, NAME1, AGE1, ADD1, PHONE1.
    Text: SRPID, LANG, SHTXT, MDTXT, LGTXT.
    I am just a beginner, so I need your guidance. Thank you in advance.
    Message was edited by: sushma sandepogu

    Hi Eugene,
    Thank you for your guidance. I changed the order in the flat file and got the results successfully. But when I view the master data text, S10 comes in the 2nd position in the maintain-master-data list screen, I mean:
    SRPID LANG NAME AGE ADD PHONE SHTXT MDTXT LGTXT
    S1         EN  ABC   34  GDS 1234  AS    ASD   ASDDF
    S10
    S2
    S3
    S9
    I do not understand why. I hope you can help me. I have assigned full points to you for your previous solution.
    Thank you in advance.
    S.Sushma.

  • Timestamp/Date format error with Java 1.6

    I'm getting this error trying to get objects from a ResultSet query for an Oracle Lite 10G table that has columns of the TIMESTAMP or DATE type. This works fine under Java 1.5. Java 1.6 seems to have broken TIMESTAMP/DATE compatibility with Oracle Lite. Are there any plans to make 10G compatible with Java 1.6? We would like to port our application from Java 1.5 to 1.6, but this is an obstacle. I suppose one workaround would be to use TO_CHAR on all the DATE fields and convert them back to Java Dates programmatically, but that would be a hassle.
    Update: I changed the column types of the table from TIMESTAMP to DATE. The same exception occurs when calling POLJDBCResultSet.getObject() on the DATE columns. Again, this works fine under Java 1.5, but not 1.6.
    java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
         at java.sql.Timestamp.valueOf(Timestamp.java:194)
         at oracle.lite.poljdbc.LiteEmbResultSet.jniGetDataTimestamp(Native Method)
         at oracle.lite.poljdbc.LiteEmbResultSet.getVal(Unknown Source)
         at oracle.lite.poljdbc.POLJDBCResultSet.getTimestamp(Unknown Source)
         at oracle.lite.poljdbc.POLJDBCResultSet.getObject(Unknown Source)
         at oracle.lite.poljdbc.POLJDBCResultSet.getObject(Unknown Source)

    I just found a pretty easy java work-around for this that doesn't involve changing the table column types or the SQL:
    Check the column type from the ResultSetMetaData before calling ResultSet.getObject(). If the type is DATE, TIMESTAMP or TIME, call ResultSet.getString() which doesn't throw an exception. Then convert the string to a java.util.Date using java.text.SimpleDateFormat. That can then be used to instantiate a Timestamp.
    This seems to work.
    Message was edited by:
    user490596
