Issue with field data type NUMBER(22,5) while updating field data.

Hi,
I have a field with data type NUMBER(22,5).
While inserting or updating the field, it should accept 17 digits before the decimal point and 5 digits after it, 22 digits in total.
I set the maximumLength property of the messageTextInput item to 22.
While updating the field I get the following problems:
1) I provide the value 1234567890123456789012 (22 digits) and click the Apply button.
I then get this error message:
Error
मैकेनाइजड दो पहिया/तीन पहिया वाहन (Mechanized Two-Wheeler/Three-Wheeler Vehicle) - Value 1234567890123460000000 exceeds the maximum allowed value (NUMERIC (22, 5))
2) Then I enter 12345678901234567.1234 (17 digits before the decimal point and 4 digits after it; it did not accept 5 digits after the decimal point) and click the Apply button.
It inserts the value as 12345678901234600 (the value changed after Apply).
Then I tried a simple SQL INSERT statement, and it successfully inserted the value 12345678901234567.12345.
Is there any item property I have missed setting here?
Please suggest.
Thanks & Regards,
Sagarika

Hi,
According to this statement:
"Then I enter 12345678901234567.1234 ... and click the Apply button. It inserts the value as 12345678901234600 (the value changed after Apply)."
it seems the value is being rounded, or losing precision, somewhere in the code before it is inserted into the table. Since you can insert the same value directly with an INSERT statement, the database column itself is fine, so check whether something like this is happening in the code, in particular in the related EOImpl class, and try running the page in debug mode to see where the value changes.
The 22-character problem is related to the maximum length of the text box: set maximumLength to 23 so that the decimal point is also counted; then you will be able to enter 17 digits before the decimal point plus 5 digits after it.
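To see the kind of precision loss that can happen once the value leaves the database, here is a minimal Java sketch. It only illustrates the general effect, assuming the value is routed through a double somewhere in the controller or EO layer (an assumption, not something visible in the post):

    import java.math.BigDecimal;

    public class PrecisionDemo {
        public static void main(String[] args) {
            // 17 integer digits + 5 decimal digits, as allowed by NUMBER(22,5)
            String input = "12345678901234567.12345";

            // A double keeps only about 15-17 significant decimal digits,
            // so the tail of the value is rounded away.
            double asDouble = Double.parseDouble(input);
            System.out.println(asDouble);                  // 1.2345678901234568E16

            // A BigDecimal preserves every digit exactly.
            BigDecimal asDecimal = new BigDecimal(input);
            System.out.println(asDecimal.toPlainString()); // 12345678901234567.12345
        }
    }

If the value survives a direct SQL INSERT but not the Apply button, the rounding is happening in this kind of intermediate conversion rather than in the NUMBER(22,5) column itself.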
Thanks, Pratap

Similar Messages

  • Issue With Sales Document Type in BW.

    Dear Experts,
    I am facing an issue with Sales Document Type (object name 0DOC_TYPE).
    When I load data from R/3 to BW, it converts the sales document type as follows:
    Sales Document Type in R/3          Sales Document Type in BW
    OR                                  TA
    CR                                  G2
    DR                                  L2
    I checked with an ABAPer and he told me there is a conversion exit for this object, which is why the values are converted when they are sent to BW. But the user wants to see the same values in the report as in R/3, i.e. OR, CR, DR, etc.
    I have done some investigation and found that the conversion exits (CONVERSION_EXIT_AUART_INPUT and CONVERSION_EXIT_AUART_OUTPUT) convert the values based on the table TAUUM.
    The structure of TAUUM is
    MANDT (Client)  SPRAS (Language)  AUART (Sales document type, not converted)  AUART_SPR (Language key for sales document type)
    and values are
    100 E TA OR
    100 E G2 CR
    etc....
    The same conversion exits (CONVERSION_EXIT_AUART_INPUT and CONVERSION_EXIT_AUART_OUTPUT) and the table TAUUM are available in BW as well. So I thought I could use InfoObject 0DOC_TYPE by changing its conversion routine to AUART (which runs based on the conversion exits mentioned above).
    But the problem is that the table TAUUM (a pool table) doesn't have any data in BW.
    Can anyone tell me whether there is an option to load data into table TAUUM from R/3, or any other way to solve this issue?
    Your ideas will really help me.
    Thanks in advance,
    Dara.
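    Just to make the direction of the mapping explicit, here is a small plain-Java sketch (purely an illustration based on the TAUUM rows quoted above, not SAP code): the extractor delivers the internal value, and only the output conversion, which needs TAUUM to be filled in BW, turns it back into the value the user sees in R/3.
    import java.util.HashMap;
    import java.util.Map;
    public class AuartConversionSketch {
        // TAUUM for language 'E': internal AUART -> language-dependent AUART_SPR
        private static final Map<String, String> OUTPUT_EXIT = new HashMap<>();
        private static final Map<String, String> INPUT_EXIT = new HashMap<>();
        static {
            OUTPUT_EXIT.put("TA", "OR");   // 100 E TA OR
            OUTPUT_EXIT.put("G2", "CR");   // 100 E G2 CR
            OUTPUT_EXIT.put("L2", "DR");
            OUTPUT_EXIT.forEach((internalVal, external) -> INPUT_EXIT.put(external, internalVal));
        }
        public static void main(String[] args) {
            String extracted = "TA";  // what arrives in the PSA
            System.out.println(extracted + " -> " + OUTPUT_EXIT.get(extracted)); // TA -> OR
            System.out.println("OR -> " + INPUT_EXIT.get("OR"));                 // OR -> TA
        }
    }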

    Hi Venkat,
    Thanks a lot for your immediate response.
    The InfoObject 0DOC_TYPE has no conversion exit by default, but the data coming from R/3 is converted before it is sent to BW, which is why I am planning to use the conversion exit "AUART" in the InfoObject.
    I checked the data in R/3 using RSA3 and it shows the sales document type as "OR", while for the same transaction data the PSA shows "TA".
    Could you please let me know if there any other options.
    Thanks in advance,
    Dara.

  • Issue with 2LIS_02_SCL Data source

    Hi Everybody,
    I am facing the two issues below with the 2LIS_02_SCL data source:
    1) It is fetching only the records with ETENR (Delivery Schedule Line Counter) value '1' and ignoring the others (e.g. 2, 3 and 4). Hence the data does not reconcile with the ECC system.
    2) The standard field GLMNG is not getting any data, although the data exists at table level (EKET). So I have written code and the data is coming through now. But the problem is that this code does not seem to consider the ROCANCEL indicator. All the other key figures come in with a negative sign when the ROCANCEL value is 'X' or 'R', but this field gets only positive values irrespective of the ROCANCEL indicator, and therefore shows incorrect values compared to ECC.
    Can anybody help me on this,
    Regards,
    Gopinath
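    For what it's worth, the sign handling described above boils down to negating the custom-filled quantity whenever ROCANCEL signals a reversal. The snippet below is only a plain-Java illustration of that rule (the real fix would of course live in the ABAP enhancement that fills GLMNG):
    public class RocancelSignSketch {
        // Negate the custom-filled quantity when ROCANCEL marks a reversal ('X' or 'R'),
        // mirroring what the standard key figures already do.
        static double signedQuantity(double glmng, String rocancel) {
            return ("X".equals(rocancel) || "R".equals(rocancel)) ? -glmng : glmng;
        }
        public static void main(String[] args) {
            System.out.println(signedQuantity(10.0, " "));  //  10.0  normal record
            System.out.println(signedQuantity(10.0, "X"));  // -10.0  reversal
            System.out.println(signedQuantity(10.0, "R"));  // -10.0  reversal
        }
    }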

    Hi Gopinath:
       Have you already applied any SAP Note to solve this problem?
    Please check if the SAP Note below is applicable to your system.
    668177 - "LIS BW: wrong quantity for documents with invoice plan"
    Regards,
    Francisco Milán.

  • I am having an issue with getting data usage alerts for my iPhone 4S

    I am having an issue with getting data usage alerts for my iPhone 4S from AT&T. I do not download anything huge at all.
    I looked into it and figured out that the phone dials out at 12:29 am every night. I saw this by going into Settings > General > About > Diagnostics & Usage > Diagnostics & Usage Data. I then selected Don't Send, but I am still getting usage alerts. Can anyone help me please?
    Thanks

    Honestly, from reading the thread linked, they all come off as a bunch of whiny people who cannot be bothered to help themselves.
    Little to nothing in that thread indicates an issue beyond inept consumers. Yes, I read several pages of the incessant gripes. Very few made any actual attempt to troubleshoot issues before whining about the "Apple issue", and those that did actual troubleshooting got their issues resolved.
    So no, Apple has nothing to fix beyond a few specific devices that are experiencing hardware issues.
    If you have actually put forth effort and done the basic troubleshooting, take the device to Apple for evaluation and possible replacement.  Whining will get nothing accomplished.

  • Has anyone had issues with Administration\Data Import/Export\Data Import???

    I have a client who has recently upgraded from V2007 to V8.81. They were successfully using this standard function to import supplier prices into their master price list, but now it fails.
    I have looked at the file they are importing and it appears to be fine.
    On closer inspection, it did contain approx. 46,000 entries, so I took the first 1,000 and created a test file, which imported fine.
    The only issue I found was speed, with the test file of 1,000 records taking about 30 minutes to import. The import appeared to get slower and slower the further through the file it got!
    Based on this, I have estimated that the whole file would take about 13 hours to import. The client says that when they used to run it on version 2007 it was far quicker.
    In practice, it does appear to run, but the speed is the issue. Having said this, I set the whole file to run last night (overnight) and this morning it appeared to have hung after about 2,307 rows, with nothing else being updated.
    Has anyone any ideas or is aware of performance issues like this?
    Thanks,
    Ian

    Always an option, but would you give your clients access to this tool?
    Not sure really.
    I have uploaded a copy of their database onto my test system and run the same routine. It is equally slow.
    I can't gauge whether it's an issue with 8.81 that 2007 didn't have, as I only have the client's word on it; however, I have no reason to disbelieve them.
    Kind regards,
    Ian

  • CVC creation - Strange issue with Master data table of 9AMATNR

    Hi Experts,
    We have encountered a strange issue with the master data table (/BI0/9APMATNR) of InfoObject 9AMATNR.
    We have a BADI implemented to check for a valid characteristic before a CVC is created with transaction /SAPAPO/MC62. This BADI runs a SELECT on the master data table of the material, /BI0/9APMATNR, and returns no value, although the material actually exists in the table (checked through SE16).
    We then go into InfoObject 9AMATNR, open the master data tab, and go into the master table
    /BI0/9APMATNR and activate it. After the table is activated it is read by the SELECT statement inside the BADI (strange) and the CVC can be created.
    Ideally it should not even allow us to activate the SAP standard table /BI0/9APMATNR. I observed in the technical settings of this table that single-record buffering is switched on (but as far as I know the buffer gets refreshed every 2 to 4 minutes, not after 2 days or so).
    Your expert comment is valuable to us. Thanks.
    Best Regards,
    Chandan Dubey

    Hi Chandan,
                 Try using a WAIT statement with 5 seconds (WAIT UP TO 5 SECONDS) before your SELECT statement.
    I'm not sure whether this will work. Anyway, check it and let me know the result.
    Regards,
    Siva.

  • Issue with the Data Type 'Number' in Business Objects

    Hi,
    I have an object in the universe whose data type is number. The column behind this object has certain values in the database, among which is a 17-digit value, 00000000031101165.
    Now, when retrieving this value through Business Objects it gets rounded off automatically to 00000000031101200 when viewed in WebI, and when retrieving it in Designer/Deski it displays as 0.000000003110116E+16.
    I would like to know whether there is any alternative for retrieving the original value without rounding. Also, is there any limitation on the Number data type in Business Objects? The version we are on is XI 3.1.
    Note: no functions are used on this object at the universe level, and we would not like to use any functions here.

    What is the underlying database?
    It looks like the data is considered to have two decimals, but is rounded to zero decimals.
    Only you don't see the number formatting.
    Is this a BW query?
    Is this a calculated keyfigure?
    In the query you can specify the rounding you want and it is also possible to specify it on an infoobject level.
    Check those settings...
    Hope this helps,
    Marianne
    PS. Oh, and about the formatting, you can specify a default object format in the universe and override it on the final client (WebI, Crystal)

  • Issue with xsd Data type mapping for collection of user defined data type

    Hi,
    I am facing an issue with the WSDL/XSD mapping for a collection of a user-defined data type.
    Here is the code snippet.
    sample.java
    @WebMethod
    public QueryPageOutput AccountQue(QueryPageInput qpInput) { ... }
    public class QueryPageInput implements Serializable, Cloneable {
        protected Account_IO fMessage = null;
    }
    public class QueryPageOutput implements Serializable, Cloneable {
        protected Account_IO fMessage = null;
    }
    public class Account_IO implements Serializable, Cloneable {
        protected ArrayList<AccountIC> fintObjInst = null;
        public ArrayList<AccountIC> getfintObjInst() {
            return (ArrayList<AccountIC>) fintObjInst.clone();
        }
        public void setfintObjInst(AccountIC val) {
            fintObjInst = new ArrayList<AccountIC>();
            fintObjInst.add(val);
        }
    }
    public class AccountIC {
        protected String Name;
        protected String Desc;
        public String getName() {
            return Name;
        }
        public void setName(String name) {
            Name = name;
        }
    }
    For the sample.java code, the wsdl generated is as below:
    <?xml version="1.0" encoding="UTF-8" ?>
    <wsdl:definitions
    name="SimpleService"
    targetNamespace="http://example.org"
    xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/"
    xmlns:tns="http://example.org"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:mime="http://schemas.xmlsoap.org/wsdl/mime/"
    xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
    xmlns:soap12="http://schemas.xmlsoap.org/wsdl/soap12/"
    >
    <wsdl:types>
    <xs:schema version="1.0" targetNamespace="http://examples.org" xmlns:ns1="http://example.org/types"
    xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:import namespace="http://example.org/types"/>
    <xs:element name="AccountWSService" type="ns1:accountEMRIO"/>
    </xs:schema>
    <xs:schema version="1.0" targetNamespace="http://example.org/types" xmlns:ns1="http://examples.org"
    xmlns:tns="http://example.org/types" xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:import namespace="http://examples.org"/>
    <xs:complexType name="queryPageOutput">
    <xs:sequence>
    <xs:element name="fSiebelMessage" type="tns:accountEMRIO" minOccurs="0"/>
    </xs:sequence>
    </xs:complexType>
    <xs:complexType name="accountEMRIO">
    <xs:sequence>
    <xs:element name="fIntObjectFormat" type="xs:string" minOccurs="0"/>
    <xs:element name="fMessageType" type="xs:string" minOccurs="0"/>
    <xs:element name="fMessageId" type="xs:string" minOccurs="0"/>
    <xs:element name="fIntObjectName" type="xs:string" minOccurs="0"/>
    <xs:element name="fOutputIntObjectName" type="xs:string" minOccurs="0"/>
    <xs:element name="fintObjInst" type="xs:anyType" minOccurs="0" maxOccurs="unbounded"/>
    </xs:sequence>
    </xs:complexType>
    <xs:complexType name="queryPageInput">
    <xs:sequence>
    <xs:element name="fPageSize" type="xs:string" minOccurs="0"/>
    <xs:element name="fSiebelMessage" type="tns:accountEMRIO" minOccurs="0"/>
    <xs:element name="fStartRowNum" type="xs:string" minOccurs="0"/>
    <xs:element name="fViewMode" type="xs:string" minOccurs="0"/>
    </xs:sequence>
    </xs:complexType>
    </xs:schema>
    <schema xmlns="http://www.w3.org/2001/XMLSchema" targetNamespace="http://example.org"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:tns="http://example.org" xmlns:ns1="http://example.org/types">
    <import namespace="http://example.org/types"/>
    <xsd:complexType name="AccountQue">
    <xsd:sequence>
    <xsd:element name="arg0" type="ns1:queryPageInput"/>
    </xsd:sequence>
    </xsd:complexType>
    <xsd:element name="AccountQue" type="tns:AccountQue"/>
    <xsd:complexType name="AccountQueResponse">
    <xsd:sequence>
    <xsd:element name="return" type="ns1:queryPageOutput"/>
    </xsd:sequence>
    </xsd:complexType>
    <xsd:element name="AccountQueResponse" type="tns:AccountQueResponse"/>
    </schema>
    </wsdl:types>
    <wsdl:message name="AccountQueInput">
    <wsdl:part name="parameters" element="tns:AccountQue"/>
    </wsdl:message>
    <wsdl:message name="AccountQueOutput">
    <wsdl:part name="parameters" element="tns:AccountQueResponse"/>
    </wsdl:message>
    <wsdl:portType name="SimpleService">
    <wsdl:operation name="AccountQue">
    <wsdl:input message="tns:AccountQueInput" xmlns:ns1="http://www.w3.org/2006/05/addressing/wsdl"
    ns1:Action=""/>
    <wsdl:output message="tns:AccountQueOutput" xmlns:ns1="http://www.w3.org/2006/05/addressing/wsdl"
    ns1:Action=""/>
    </wsdl:operation>
    </wsdl:portType>
    <wsdl:binding name="SimpleServiceSoapHttp" type="tns:SimpleService">
    <soap:binding style="document" transport="http://schemas.xmlsoap.org/soap/http"/>
    <wsdl:operation name="AccountQue">
    <soap:operation soapAction=""/>
    <wsdl:input>
    <soap:body use="literal"/>
    </wsdl:input>
    <wsdl:output>
    <soap:body use="literal"/>
    </wsdl:output>
    </wsdl:operation>
    </wsdl:binding>
    <wsdl:service name="SimpleService">
    <wsdl:port name="SimpleServicePort" binding="tns:SimpleServiceSoapHttp">
    <soap:address location="http://localhost:7101/WS-Project1-context-root/SimpleServicePort"/>
    </wsdl:port>
    </wsdl:service>
    </wsdl:definitions>
    In the above WSDL the collection fintObjInst is of type xs:anyType. In the WSDL I do not see an XSD mapping for AccountIC, which includes Name and Desc. Because of this, when invoking the web service from a different client such as C# (by creating a proxy business service), I am unable to set the parameters for AccountIC. I am using the JAX-WS stack and WLS 10.3. I have already looked at the blog http://weblogs.java.net/blog/kohlert/archive/2006/10/jaxws_and_type.html but have been unable to solve the issue. However, at run time, when this WSDL is imported into a tool like SoapUI, I am able to see all the parameters related to the AccountIC class.
    Can someone help me with this?
    Thanks,
    Sudha.

    Did you try adding the @XmlSeeAlso annotation to the web service?
    @XmlSeeAlso({<package.name>.AccountIC.class})
    This will add the schema for the data type (AccountIC) to the WSDL.
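    For reference, here is a minimal sketch of where the annotation would go, based on the sample.java above (class and package names are just the ones from the post; adjust them to your own):
    import javax.jws.WebMethod;
    import javax.jws.WebService;
    import javax.xml.bind.annotation.XmlSeeAlso;
    @WebService
    @XmlSeeAlso({AccountIC.class})  // tells JAXB to include a schema type for AccountIC
    public class Sample {
        @WebMethod
        public QueryPageOutput AccountQue(QueryPageInput qpInput) {
            // ... existing service logic ...
            return new QueryPageOutput();
        }
    }
    With the annotation in place, the accountIC complex type should at least be present in the generated schema, which lets clients such as a C# proxy see Name and Desc.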
    Hope this helps.
    -Ajay

  • Issue with field separator in GUI_UPLOAD

    Hello Gurus
    I am facing an issue with GUI_UPLOAD. I have a text file in which the fields are separated by a single pipe, i.e. |. When I try to read the data from the file into an internal table, even when using the field separator, the data does not end up in the proper fields.
    DATA: BEGIN OF IT_TAB OCCURS 0,
          OBJECT_ID type string,
          VERSION_SERIES_ID TYPE string,
          VERSION_NUMBER TYPE string,
          REVISION TYPE string,
          DOC_NUMBER TYPE string,
          DOCTITLE TYPE string,
          FILESIZE TYPE string,
          MIME_TYPE TYPE string,
          PLANTUNIT TYPE string,
          END OF IT_TAB.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        FILENAME                      = w_mpath
        FILETYPE                      = 'ASC'
        HAS_FIELD_SEPARATOR           = '|'
    *   HEADER_LENGTH                 = 0
        READ_BY_LINE                  = 'X'
    *   DAT_MODE                      = ' '
    *   CODEPAGE                      = ' '
    *   IGNORE_CERR                   = ABAP_TRUE
    *   REPLACEMENT                   = '#'
    *   CHECK_BOM                     = ' '
    *   VIRUS_SCAN_PROFILE            =
    *   NO_AUTH_CHECK                 = ' '
    * IMPORTING
    *   FILELENGTH                    =
    *   HEADER                        =
      TABLES
        DATA_TAB                      = it_tab[]
      EXCEPTIONS
        FILE_OPEN_ERROR               = 1
        FILE_READ_ERROR               = 2
        NO_BATCH                      = 3
        GUI_REFUSE_FILETRANSFER       = 4
        INVALID_TYPE                  = 5
        NO_AUTHORITY                  = 6
        UNKNOWN_ERROR                 = 7
        BAD_DATA_FORMAT               = 8
        HEADER_NOT_ALLOWED            = 9
        SEPARATOR_NOT_ALLOWED         = 10
        HEADER_TOO_LONG               = 11
        UNKNOWN_DP_ERROR              = 12
        ACCESS_DENIED                 = 13
        DP_OUT_OF_MEMORY              = 14
        DISK_FULL                     = 15
        DP_TIMEOUT                    = 16
        OTHERS                        = 17.
    IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
             WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    I want the data appended to the internal table split on the | separator.
    Please help.
    Regards,
    Rajesh.

    Hi,
    The HAS_FIELD_SEPARATOR parameter is only a flag ('X') that tells GUI_UPLOAD to expect a tab-separated file; it does not take the separator character itself. For a pipe-separated file, upload each line into a table of strings and then use the SPLIT statement.
    Check this example:
    DATA: BEGIN OF it_tab OCCURS 0,
           object_id TYPE string,
           version_series_id TYPE string,
           version_number TYPE string,
           revision TYPE string,
           doc_number TYPE string,
           doctitle TYPE string,
           filesize TYPE string,
           mime_type TYPE string,
           plantunit TYPE string,
         END OF it_tab.
    DATA: t_tab   TYPE TABLE OF string,
          v_string TYPE string.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename                = 'C:\TEST.TXT'
    *    has_field_separator     = '|'          "Actually not required.
      TABLES
        data_tab                = t_tab
      EXCEPTIONS
        file_open_error         = 1
        file_read_error         = 2
        no_batch                = 3
        gui_refuse_filetransfer = 4
        invalid_type            = 5
        no_authority            = 6
        unknown_error           = 7
        bad_data_format         = 8
        header_not_allowed      = 9
        separator_not_allowed   = 10
        header_too_long         = 11
        unknown_dp_error        = 12
        access_denied           = 13
        dp_out_of_memory        = 14
        disk_full               = 15
        dp_timeout              = 16
        OTHERS                  = 17.
    IF sy-subrc <> 0.
    ENDIF.
    LOOP AT t_tab INTO v_string.
      SPLIT v_string AT '|'
            INTO
            it_tab-object_id
            it_tab-version_series_id
            it_tab-version_number
            it_tab-revision
            it_tab-doc_number
            it_tab-doctitle
            it_tab-filesize
            it_tab-mime_type
            it_tab-plantunit.
      APPEND it_tab.
      CLEAR: it_tab.
    ENDLOOP.
    Thanks
    Naren
    Edited by: Narendran Muthukumaran on Oct 15, 2008 4:58 PM

  • Issue with the data of attachment of the mail with FM 'SO_NEW_DOCUMENT_ATT'

    Hi Gurus,
    I am developing my program in SAP R/3 4.7 system.
    I have a requirement to attach a txt file to the email. I am able to attach the file and fill its contents.
    But the issue is with the data in the file: there are no line breaks in the file's data, and all the words are separated by spaces.
    For example, if I fill 2 records (line1, line2) in the internal table provided to the contents_bin parameter, the data in the attachment looks like this:
    l i n e 1 l i n e 2
    but I want the data to be like this:
    line1
    line2
    Below is my code:
    * Fill the document data and get size of attachment
         DATA:w_cnt TYPE i.
        DATA: WA_ATTACH LIKE LINE OF P_ATTACHMENT.
      DESCRIBE TABLE p_attachment LINES  w_cnt.
      READ TABLE p_attachment  INTO WA_ATTACH INDEX w_cnt.
    * Fill the document data and get size of attachment
      gd_doc_data-doc_size =
         ( w_cnt - 1 ) * 255 + STRLEN( WA_ATTACH ).
    * Populate the subject/generic message attributes
      gd_doc_data-obj_langu = sy-langu.
      gd_doc_data-obj_name  = 'EMAIL'.
      gd_doc_data-obj_descr = sub.
      gd_doc_data-sensitivty = 'F'.
    * Describe the body of the message
      clear it_packing_list.
      refresh it_packing_list.
      it_packing_list-transf_bin = space.
      it_packing_list-head_start = 1.
      it_packing_list-head_num = 0.
      it_packing_list-body_start = 1.
      describe table it_message lines it_packing_list-body_num.
      it_packing_list-doc_type = 'RAW'.
      append it_packing_list.
    * Create attachment notification
      it_packing_list-transf_bin = 'X'.
      it_packing_list-head_start = 1.
      it_packing_list-head_num   = 1.
      it_packing_list-body_start = 1.
      DESCRIBE TABLE it_attachment LINES it_packing_list-body_num.
      it_packing_list-doc_type   =  'RAW'.
      it_packing_list-obj_descr  =  'Job Log'.
      it_packing_list-obj_name   =  'Job log'.
      it_packing_list-doc_size   =  it_packing_list-body_num * 255.
      APPEND it_packing_list.
    * Add the recipients email address
      clear it_receivers.
      refresh it_receivers.
      loop at i_t_list_of_emailids INTO i_t_list_of_emailids_wa.
      it_receivers-receiver = i_t_list_of_emailids_wa-emailid.
      it_receivers-rec_type = 'U'.
      it_receivers-com_type = 'INT'.
      it_receivers-notif_del = 'X'.
      it_receivers-notif_ndel = 'X'.
      it_receivers-express    = 'X'.
      append it_receivers.
    ENDLOOP.
    data:   new_object_id like sofolenti1-object_id.
    * Call the FM to post the message to SAPMAIL
      call function 'SO_NEW_DOCUMENT_ATT_SEND_API1'
           exporting
                document_data              = gd_doc_data
                put_in_outbox              = 'X'
                COMMIT_WORK                = 'X'
           importing
                sent_to_all                = gd_sent_all
                new_object_id              = new_object_id
           tables
                packing_list               = it_packing_list
                contents_bin               = p_attachment
                contents_txt               = it_message
                receivers                  = it_receivers
           exceptions
                too_many_receivers         = 1
                document_not_sent          = 2
                document_type_not_exist    = 3
                operation_no_authorization = 4
                parameter_error            = 5
                x_error                    = 6
                enqueue_error              = 7
                others                     = 8.
    Kindly let me know the procedure to solve the issue.
    Thanks in advance,
    Kiran Kumar K

    Hi Kiran,
    You can try using CR_LF to separate the records.
    constants: con_cret(2) type c value cl_abap_char_utilities=>cr_lf,
               con_tab     type c value cl_abap_char_utilities=>horizontal_tab.
    And when you append your record to the internal table, add con_cret at the end:
      LOOP AT t_build into wa_charekpo.
        CONCATENATE wa_charekpo-order wa_charekpo-kunnr
                    wa_charekpo-sel
               INTO it_attach SEPARATED BY con_tab.
        CONCATENATE it_attach con_cret INTO it_attach.
        APPEND  it_attach.
      ENDLOOP.

  • Data Pump issue with nls date format

    Hi Friends,
    I have a database with NLS date format 'DD/MM/YYYY HH24:MI:SS' from which I wish to take an export. I have a target database with NLS date format 'YYYY/MM/DD HH24:MI:SS'. A few tables have CREATE statements with date fields declared as DEFAULT '01-Jan-1950', and these CREATE TABLE statements fail in my target database when processed by Data Pump. The tables are not created, due to this error:
    Failing sql is:
    CREATE TABLE "MCS_OWNER"."SECTOR_BLOCK_PEAK" ("AIRPORT_DEPART" VARCHAR2(4) NOT NULL ENABLE, "AIRPORT_ARRIVE" VARCHAR2(4) NOT NULL ENABLE, "CARRIER_CODE" VARCHAR2(3) NOT NULL ENABLE, "AC_TYPE_IATA" VARCHAR2(3) NOT NULL ENABLE, "PEAK_START" VARCHAR2(25) NOT NULL ENABLE, "PEAK_END" VARCHAR2(25), "BLOCK_TIME" VARCHAR2(25), "FLIGHT_TIME" VARCHAR2(25), "SEASON" VARC
    ORA-39083: Object type TABLE failed to create with error:
    ORA-01858: a non-numeric character was found where a numeric was expected
    The CREATE TABLE SQL adds a column as VALID_FROM DATE DEFAULT '01-jan-1970', which I think is the issue. I'd appreciate it if someone can suggest a way to get around this. I have tried altering the NLS settings of the source DB to be the same as the target database; the impdp still fails.
    Database is 10.2.0.1.0 on Linux X86 64 bit.
    Thanks,
    SSN
    Edited by: SSNair on Oct 27, 2010 8:25 AM

    "Appreciate if someone can suggest a way to get around with this."
    Change the DDL so that the CREATE TABLE uses the TO_DATE() function for the default, e.g. VALID_FROM DATE DEFAULT TO_DATE('01-jan-1970', 'DD-MON-YYYY').
    With Oracle, characters between single quote marks are STRINGS!
    'This is a string, 2009-12-31, not a date'
    When a DATE datatype is desired, use the TO_DATE() function.

  • Issues with Prompts date

    Hi,
    I have created two date calendar prompts, 'from date' and 'to date', from the same date column of the time dimension, and saved them into presentation variables.
    I have to show 'amount' in one column (say opening amount) if the transaction date is before 'from date', and in another column (say debit) when the transaction date is after 'from date'. When, in the criteria tab, I try to enter a formula like (case when transaction_date > @{from_date} then amount end), it gives me an error:
    [nQSError: 46030] Field descriptor id 0 is beyond the maximum field count 0
    Has anyone come across the same issue?
    Regards,
    Vikas

    Please don't change anything in OBIEE.
    You must distinguish between the date data type and the date format.
    When you create a dashboard prompt with a date column and set up a presentation variable, the presentation variable takes the same data type: it becomes a date.
    The format is specifically the job of the BI Presentation service, which takes this date and transforms it in the way the user wants (depending on the language).
    What I would say is that you must also set up a date (and therefore not a character) as the default value of the filter.
    Unfortunately, when you enter this:
    DAY is greater than or equal to @{from_date}{1990/6/6}
    OBIEE sees it as a character.
    And what I have discovered is that if you want a date, the cast function doesn't follow the format depending on the configuration; you have to use this format:
    DAY is greater than or equal to @{from_date}{cast('1990-06-06' as date)}
    In this way, you will have two dates whose format changes when you change the locale definition (to French, for instance) in your account profile.
    You can also use this form:
    DAY is greater than or equal to @{from_date}{date '1990-06-06'}
    Edited by: gerardnico on Jun 24, 2009 7:38 PM - suppress a bad bracket

  • Performance issue with Oracle data source

    Hi all,
    I have a rather strange problem that I'm stuck on and need some assistance with.
    I have a rules file that pulls data in via a SQL data source that is an Oracle server. If I cut and paste the three sections of "select", "from" and "where" into SQL Developer and run the query, it takes less than 1 second to complete. When I run the "load data" with this rules file, or even use "Retrieve" in the rules file editor, it takes up to an hour to complete/retrieve the data.
    The table in question has millions of rows and I'm using one of the indexed fields to retrieve the data. It's as if the Essbase rules file is ignoring the index, or I have a configuration issue with the ODBC settings on the server that is causing the problem.
    ODBC.INI file entry for the Oracle server as follows (changed any sensitive info to xxx or 999).
    [XXX]
    Driver=/opt/data01/hyperion/common/ODBC-64/Merant/5.2/lib/ARora22.so
    Description=DataDirect 5.2 Oracle Wire Protocol
    AlternateServers=
    ApplicationUsingThreads=1
    ArraySize=60000
    CachedCursorLimit=32
    CachedDescLimit=0
    CatalogIncludesSynonyms=1
    CatalogOptions=0
    ConnectionRetryCount=0
    ConnectionRetryDelay=3
    DefaultLongDataBuffLen=1024
    DescribeAtPrepare=0
    EnableDescribeParam=0
    EnableNcharSupport=0
    EnableScrollableCursors=1
    EnableStaticCursorsForLongData=0
    EnableTimestampWithTimeZone=0
    HostName=999.999.999.999
    LoadBalancing=0
    LocalTimeZoneOffset=
    LockTimeOut=-1
    LogonID=xxx
    Password=xxx
    PortNumber=1521
    ProcedureRetResults=0
    ReportCodePageConversionErrors=0
    ServiceType=0
    ServiceName=xxx
    SID=
    TimeEscapeMapping=0
    UseCurrentSchema=1
    Can anyone please advise on this lack of performance.
    Thanks in advance
    Bagpuss

    One other thing that I've seen is that if your Oracle data source and Essbase server are in different geographic locations, you can get some delay when it retrieves data over the WAN. I guess there is some handshaking going on when passing the data from Oracle to Essbase (either by record or groups of records) that is slowed WAY down over the WAN.
    Our solution was to remove the query from the load rule, run it via SQL*Plus on a command line at the geographic location where the Oracle database is, then FTP the resulting file to where the Essbase server is.
    With upwards of 6 million records being retrieved, it took around 4 hours in the load rule, but running the query from the command line took 10 minutes, and the FTP took less than 5.

  • 10g Reports issue with XML Data Source

    Hi,
    Has anybody ever encountered an issue with an Oracle 10g report that uses XML as the data source? What happens is that some of the values in the XML are printed in the wrong column.
    One of the elements in our XML file is a complex type with 10 elements under it. The first 5 are picked up properly, but the last ones are not. Elements #6 to #9 have a minimum occurrence of 0. What happens is that when element #6 is not present but #7 is, the value of element #7 is passed on to element #6.
    The XSD and XSL files are both valid, since the reports were working when we were still using 9i. There is no hidden logic in the report that might cause this issue, i.e., the report just picks up the values from the XML and prints them in the appropriate columns.
    Any help will be greatly appreciated.

    XSD used
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified">
            <!-- trade instructions detail & trailer -->
            <xs:element name="TradeDetail">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="TradeType"/>
                                    <xs:element ref="TradeID"/>
                                    <xs:element ref="TradeDate"/>
                                    <xs:element ref="FundID"/>
                                    <xs:element ref="FundName"/>
                                    <xs:element ref="DollarValue" minOccurs="0"/>
                                    <xs:element ref="UnitValue" minOccurs="0"/>
                                    <xs:element ref="PercentageValue" minOccurs="0"/>
                                    <xs:element ref="OriginalTradeID" minOccurs="0"/>
                                    <xs:element ref="CancellationFlag"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <xs:element name="Instruction">
                    <xs:complexType>
                            <xs:sequence minOccurs="0">
                                    <xs:element ref="TradeDetail" maxOccurs="unbounded"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- overall trade instruction message -->
            <xs:element name="InterchangeHeader">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="Instruction"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- definition of simple elements -->
            <xs:element name="FundID" type="xs:string"/>
            <xs:element name="TradeType" type="xs:string"/>
            <xs:element name="TradeID" type="xs:string"/>
            <xs:element name="TradeDate" type="xs:string"/>
            <xs:element name="FundName" type="xs:string"/>
            <xs:element name="DollarValue" type="xs:decimal"/>
            <xs:element name="UnitValue" type="xs:decimal"/>
            <xs:element name="PercentageValue" type="xs:decimal"/>
            <xs:element name="OriginalTradeID" type="xs:string"/>
            <xs:element name="CancellationFlag" type="xs:string"/>
    </xs:schema>
    XML used
    <?xml version = '1.0' encoding = 'UTF-8'?>
    <InterchangeHeader xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="TradeInstruction.xsd">
       <Instruction>
          <TradeDetail>
             <TradeType>Purchase</TradeType>
             <TradeID>M000038290</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>ABN Fund</FundName>
             <DollarValue>2111.53</DollarValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>Redemption</TradeType>
             <TradeID>M000038292</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>AMRO Equity Fund</FundName>
             <UnitValue>104881.270200</UnitValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>ISPurchase</TradeType>
             <TradeID>M000038312</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>MLC0011AU</FundID>
             <FundName>Cash Fund</FundName>
             <OriginalTradeID>M000038311</OriginalTradeID>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
       </Instruction>
    </InterchangeHeader>
    XSLT used
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
            <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
            <xsl:template match="/">
                    <InterchangeHeader>
                            <xsl:for-each select="InterchangeHeader/Instruction/TradeDetail">
                            <xsl:sort select="FundName"/>
                            <xsl:sort select="TradeDate"/>
                                    <TradeDetail>
                                            <TradeType><xsl:value-of select="TradeType"/></TradeType>
                                            <TradeID><xsl:value-of select="TradeID"/></TradeID>
                                            <TradeDate><xsl:value-of select="TradeDate"/></TradeDate>
                                            <FundID><xsl:value-of select="FundID"/></FundID>
                                            <FundName><xsl:value-of select="FundName"/></FundName>
                                            <DollarValue><xsl:value-of select="DollarValue"/></DollarValue>
                                            <UnitValue><xsl:value-of select="UnitValue"/></UnitValue>
                                            <PercentageValue><xsl:value-of select="PercentageValue"/></PercentageValue>
                                            <OriginalTradeID><xsl:value-of select="OriginalTradeID"/></OriginalTradeID>
                                            <CancellationFlag><xsl:value-of select="CancellationFlag"/></CancellationFlag>
                                    </TradeDetail>
                            </xsl:for-each>
                    </InterchangeHeader>
            </xsl:template>
    </xsl:stylesheet>
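    Not the 10g Reports engine itself, but for illustration: the plain-Java sketch below (standard JAXP/XPath, run against the XML above; the file name is assumed) selects the optional elements by name rather than by position, which is what keeps an absent DollarValue from letting UnitValue slide into its column.
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathConstants;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;
    public class TradeDetailDump {
        public static void main(String[] args) throws Exception {
            // "TradeInstruction.xml" is an assumed file name for the XML shown above.
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse("TradeInstruction.xml");
            XPath xp = XPathFactory.newInstance().newXPath();
            NodeList trades = (NodeList) xp.evaluate(
                    "/InterchangeHeader/Instruction/TradeDetail", doc, XPathConstants.NODESET);
            for (int i = 0; i < trades.getLength(); i++) {
                // Optional elements are fetched by name; an absent element yields an
                // empty string instead of shifting the next value into its place.
                String dollar = xp.evaluate("DollarValue", trades.item(i));
                String unit = xp.evaluate("UnitValue", trades.item(i));
                String origId = xp.evaluate("OriginalTradeID", trades.item(i));
                System.out.println(xp.evaluate("TradeID", trades.item(i))
                        + " dollar=[" + dollar + "] unit=[" + unit + "] origId=[" + origId + "]");
            }
        }
    }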

  • Table - Issue with Logging Data Changes

    Hi Experts,
    My client's requirement is to log data changes in a table which is critical. I've selected the 'Log data changes' check box in the technical settings of the table. Now I'm able to see a log of all the changes I made after that. But I'm facing the following issues; kindly help me fix them.
    1. The table has two fields with the same data element. In the change log (SCU3), the field label from the data element is displayed under the field name. As the data element is the same for both fields, it's difficult to find out which particular field was changed. Is there any way to display the table field name instead of the field label from the data element?
    2. Sometimes, even though there is a change in the data record, I see a log record in the change log with the caption 'Data record unchanged'. What might be the problem and how can I resolve it?
    Thanks in Advance,
    Siva Sankar.

    Hi Moha Nan,
    You just use ABAP to change the P table? In that case I think the X table is not changed.
    You can upload the master data to add navigation; it is a good way to build the SIDs of the P table and X table.
    Hope this helps you!
