Error loading the data into ODS - Message no. BRAIN060?

Hi,
I am getting the following error while loading data from a flat file. The data loads successfully from the flat file into the PSA, but I get the following error when updating the data from the PSA to the data target:
Value '010384 javablue' (hex. '30003100300033003800340020006A0061007600610062006C') of characteristic 0PO_NUMBER contains invalid characters
Message no. BRAIN060
Diagnosis
The following standard characters are valid in characteristic values as default:
!"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ
Characteristic values are not allowed if they only consist of the character "#" or begin with "!". If the characteristic is compounded, this also applies to each partial key.
You are trying to load the invalid characteristic value 1. (hexadecimal representation 30003100300033003800340020006A0061007600610062006C).
I am trying to load the value '010384 javablue' into the 0PO_NUMBER InfoObject of the ODS, in one row together with some other data.
Any ideas or input on how to resolve this issue?
Thanks in advance for any input.
Steve

Thanks Soumya. I have maintained the upper case letters, but I am loading a mix of upper and lower case into the PO number, and it is not working. What is the solution to this? If I set the lowercase property on the PO number InfoObject, it won't accept upper case. If I uncheck lowercase, it won't accept lower case letters. I cannot add both upper and lower case letters in RSKC, because it accepts up to 72 characters and I already have more than 60 (special characters, numbers and the 26 upper case letters).
I have already tried a transfer routine, but it can only convert to lower or to upper case, which doesn't work for us. We need both upper and lower case for the PO number, and R/3 accepts it. Why doesn't BW accept both?
Any idea what can be done?
Thanks in advance for your help.
Steve
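
A minimal sketch of one approach that is sometimes tried for BRAIN060 problems like this: a field-level transfer routine that copies only characters from a permitted list into the result, so letters of both cases survive while anything else is dropped. This is only a sketch under assumptions: the parameter names (TRAN_STRUCTURE, RESULT, RETURNCODE) follow the usual BW 3.x transfer routine frame, and the source field name and target length are placeholders. Whether lower case then passes the characteristic check still depends on the settings of 0PO_NUMBER itself.

  " Sketch only: permitted-character filter for the PO number.
  CONSTANTS:
    c_allowed(90) TYPE c VALUE
      '!"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.
  DATA:
    l_value(35) TYPE c,
    l_clean(35) TYPE c,
    l_char(1)   TYPE c,
    l_len       TYPE i,
    l_in        TYPE i,
    l_out       TYPE i.

  l_value = TRAN_STRUCTURE-po_number.            "source field name is an assumption
  l_len   = strlen( l_value ).

  WHILE l_in < l_len.
    l_char = l_value+l_in(1).
    IF c_allowed CA l_char OR l_char = space.    "keep permitted characters and blanks
      l_clean+l_out(1) = l_char.
      l_out = l_out + 1.
    ENDIF.
    l_in = l_in + 1.
  ENDWHILE.

  RESULT     = l_clean.
  RETURNCODE = 0.                                "0 = pass the record on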

Similar Messages

  • Error loading the data into cube

    Hi All,
    When I am loading data into the ASO cube it gives a warning. I did everything correctly; yesterday it loaded fine, but today it is not loading and gives the warning "Aggregate storage applications ignore update to derived cells", even after I clear the data. What is this error?
    Can anyone please help?
    Reading Rule SQL Information For Database accdb
    Reading Rules From Rule Object For Database accdb
    Parallel dataload enabled: [1] block prepare threads, [1] block write threads.
    Aggregate storage applications ignore update to derived cells. 29 cells skipped
    Data Load Elapsed Time with dataldcu.rul : 0.094 seconds
    Database import completed
    Output columns prepared: [0]
    Thanks
    Ram

    In ASO cubes you can only load data at level 0 (the leaf level). Every upload to aggregated dimension members is ignored. The warning message "ignore update to derived cells" means that you are trying to load data at node level. As this is not possible with ASO cubes, Essbase ignores that data.

  • Regarding loading the data into ODS

    Hi all,
    I have a situation where I had filled an ODS with data. Now a few fields have to be added to the ODS for reporting purposes. I have added the fields, but I am not sure how to fill just those fields in the ODS so that the data can be shown in reports on that ODS. Are there any prerequisites or precautions that have to be taken?
    Regards
    YJ

    Hi,
    Just write a small program and execute it to fill the added field, for example:
    " int_tab is assumed to be filled beforehand with the ODS key and the new field value;
    " /bic/aODS00 stands for the active table of the ODS.
    if sy-subrc <> 0.
      message s000 with 'No records selected for the specified criteria.'.
    else.
      loop at int_tab.
        update /bic/aODS00
           set added_field = int_tab-added_field
         where key_field   = int_tab-key_field.   "replace with the real key condition
        if sy-subrc = 0.
          counter = counter + 1.
        endif.
      endloop.
    endif.

  • Error While loading the data into PSA

    Hi Experts,
           I have already loaded the data into my cube, but it doesn't have values for some fields. So I modified the data in the flat file again and tried to load it into the PSA. But when starting the InfoPackage, I got an error saying:
    "Check Load from InfoSource    
    Created       YOKYY  on   20.02.2008   12:44:33 
    Check Load from InfoSource , Packet IP_DS_C11
    Please execute the mail for additional information.
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Please help me with this.
    With Regards,
    Yokesh.

    Hi,
    After editing the file, did you save and close it?
    This error can occur if your file was open at the time of the request.
    Also, did you check the file path settings?
    If everything is correct, try saving the InfoPackage once and loading again.
    Thanks,
    JituK

  • Error while loading the data from ODS to InfoCube

    Hi,
    I'm trying to load the data from an ODS to an InfoCube for a particular year,
    but it says that there is a source system problem.
    Why is that?
    Please tell me.
    I'll assign the points.
    Rizwan

    Hi Rizwan,
    you didn't mention the error message in detail. There are a few places to check:
    - check whether the BW "myself" source system is active and intact, and reactivate it if necessary
    - check whether the update rules are active, and reactivate them if necessary
    - check whether the ODS is active, and reactivate it if necessary
    Regards,
    Lilly

  • Loading Master Data into ODS

    Hi,
         Can anyone tell me how to load the master data into the ODS?
    Thanks
    Yadav
    Edited by: yadav on Apr 12, 2008 8:04 PM

    Dear Yadav,
    You can load master data into your ODS.
    The process is the same as loading transaction data into an ODS, but:
    1. You need to use a flexible update instead of a direct update.
    2. Maintain all the attributes in your data part.
    3. The field on which the update depends should reside in the key field section.
    There is a difference between loading master data into an InfoObject and into an ODS:
    if it is loaded into InfoObjects, it is reusable; in an ODS it is not.
    Hope this helps.
    Assign points if helpful.
    Best Regards,
    VVenkat

  • How to write a procedure to load the data into a table using xml file as input to the procedure?

    Hi,
    I am new to XML.
    Can anyone please help me write a procedure that loads data into a table, taking XML as an input parameter? The XML file I receive as input is shown below.
    <?xml version="1.0"?>
    <DiseaseCodes>
    <Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    </DiseaseCodes>.
    Regards,
    vikram.

    Here is your XML parsed in 11g:
    select *
      from xmltable('//Entity' passing xmltype
    '<?xml version="1.0"?>
    <DiseaseCodes>
    <Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    </DiseaseCodes>
    ') columns
      "dcode" varchar2(4000) path '/Entity/dcode',
      "ddesc" varchar2(4000) path '/Entity/ddesc',
      "reauthflag" varchar2(4000) path '/Entity/reauthflag'
    dcode                                                                            ddesc                                                                            reauthflag
    0                                                                                (I87)Other disorders of veins - postphlebitic syndrome                           0
    0                                                                                (J04)Acute laryngitis and tracheitis                                             0
    0                                                                                (J17*)Pneumonia in other diseases - whooping cough                               0
    SQL>
    Using this parser you can create the procedure as follows:
    SQL> create or replace procedure myXMLParse(x clob) as
      2  begin
      3    insert into MyXmlTable
      4      select *
      5        from xmltable('//Entity' passing xmltype(x) columns "dcode"
      6                      varchar2(4000) path '/Entity/dcode',
      7                      "ddesc" varchar2(4000) path '/Entity/ddesc',
      8                      "reauthflag" varchar2(4000) path '/Entity/reauthflag');
      9    commit;
    10  end;
    11 
    12  /
    Procedure created
    SQL>
    SQL>
    SQL> exec myXMLParse('<?xml version="1.0"?><DiseaseCodes><Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity></DiseaseCodes>');
    PL/SQL procedure successfully completed
    SQL> select * from MYXMLTABLE;
    dcode                                                                            ddesc                                                                            reauthflag
    0                                                                                (I87)Other disorders of veins - postphlebitic syndrome                           0
    0                                                                                (J04)Acute laryngitis and tracheitis                                             0
    0                                                                                (J17*)Pneumonia in other diseases - whooping cough                               0
    SQL>
    SQL>
    Ramin Hashimzade

  • Does SSIS guarantee that it loads the data into SQL Server in the same order as it is in Excel

    Hi,
    We are trying to load several Excel files into SQL Server using SSIS, and we need to save the data in the database in the same order as it is in Excel.
    My question is: does SSIS guarantee that it loads the data into SQL Server in the same order as it appears in Excel? If so, we will add a sequence to preserve the order.
    Please advise.
    Thanks & Regards,
    Dhanumjay

    Thanks for your response.
    If it is one file then we can add an index column, but we have to load hundreds of files.
    How would adding an index/key column to the table work unless SSIS picks up and loads the data into the table in the same order as it is in Excel?
    Just to explain my question better:
    if Excel has three rows {a,b}, {c,d}, {e,f}, does SSIS ensure it loads them into the table in the same order?
    Thanks.

  • Taking the data from interactive forms and load the data into SAP system?

    Hi all,
    I want to know how to take data from Interactive Forms and load it into the SAP system.
    If you have a sample scenario, please explain using that.
    thanks in advance
    Raja

    Hello,
    Check the program...
    SAPBC480_DEMO.
    Check the below threads
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/88e7ea34-0501-0010-95b0-ed14cfbeb85a
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/bfbcd790-0201-0010-679d-e36a3c6b89fa
    Thanks
    Seshu

  • Error when adding data into ODS

    I have created a source system -> InfoArea -> InfoObject catalog -> InfoObjects -> application component -> InfoSource -> ODS -> InfoCube; this is how I have set up my structure, and I have created the appropriate transfer rules and update rules, choosing automatic update of data into the InfoCube. Say I have a set of 10 transactional records which I load, having chosen "initialize delta process" with data transfer in the update; I get the correct result, my 10 records in the InfoCube. But when I try to add 5 more records along with those 10 and load them, having chosen the delta process in the update before loading, I do not get the result; it shows an error. Can someone tell me why I get an error?

    Hi,
    did you make the settings under "ODS - Activate ODS data"? If this was not done, you have to activate the ODS data manually; the system will then update the data to the cube automatically.
    So you may have missed the activation settings.
    Hope this will help.
    Cheers
    Manju

  • Error 'Loading Textfile data into external table'

    I am using APEX 2.0, Oracle Database 10g and Internet Explorer (version 6.5).
    I tried to run the following code using SQL Commands in the SQL Workshop of APEX.
    Code Here:
    declare
    ddl1 varchar2(200);
    ddl2 varchar2(4000);
    begin
    ddl1 := 'create or replace directory data_dir as
    ''C:\LAUSD_DATA\''';
    execute immediate ddl1;
    ddl2:= 'create table tbl_temp_autoload
    (ID NUMBER,
         RECTYPE VARCHAR2(2),
         EXPORTNBR NUMBER(2),
         EXPORTDATE DATE,
         BIMAGEID VARCHAR2(11),
         APPLICNBR NUMBER(10),
         MEALIMAGE VARCHAR2(17),
         MEDIIMAGE VARCHAR2(17),
         LANGUAGE VARCHAR2(1),
         APPLICCNT NUMBER(1),
         OTHAPPLICNBR VARCHAR2(10),
         PEDETDATE DATE,
         PESTATUS VARCHAR2(1),
         ERRORREMARKS VARCHAR2(50),
         COMMENTS VARCHAR2(50),
         SID1 VARCHAR2(10),
         WID1 NUMBER(10),
         MEDIROW1 VARCHAR2(1),
         LASTNAME1 VARCHAR2(20),
         FIRSTNAME1 VARCHAR2(20),
         SCHOOL1 VARCHAR2(15),
         LOCCD1 VARCHAR2(4),
         BIRTHDATE1 DATE,
         CASENBR1 VARCHAR2(15),
         FOSTER1 VARCHAR2(1),
         CHILDINC1 VARCHAR2(4),
         RACEAIAN VARCHAR2(1),
         RACEBAA VARCHAR2(1),
         RACENHPI VARCHAR2(1),
         RACEASIAN VARCHAR2(1),
         RACEWHITE VARCHAR2(1),
         HISPANIC VARCHAR2(1),
         NOTHISPANIC VARCHAR2(1),
         ADULTSIG3 VARCHAR2(1),
         ADULTSIGDATE3 VARCHAR2(10),
         ADULTNAME3 VARCHAR2(30),
         SSN3 VARCHAR2(1),
         SSN3NOT VARCHAR2(1),
         ADDRESS VARCHAR2(30),
         APTNBR VARCHAR2(6),
         CTY VARCHAR2(20),
         ZIP NUMBER(5),
         PHONEHOME VARCHAR2(12),
         PHONEWORK VARCHAR2(12),
         PHONEEXTEN VARCHAR2(6),
         MEALROWA VARCHAR2(1),
         LNAMEA VARCHAR2(20),
         FNAMEA VARCHAR2(20),
         MINITA VARCHAR2(6),
         GENDERA VARCHAR2(1),
         AGEA NUMBER(2),
         MOTYPEA VARCHAR2(1),
         MONAMEA VARCHAR2(1),
         FATYPEA VARCHAR2(1),
         FANAMEA VARCHAR2(1),
         FAMNBR NUMBER(1),
         PARENTINC NUMBER(5),
         FAMILYINC NUMBER(5),
         FAMILYSIZE NUMBER(2),
         PGSIG1 VARCHAR2(1),
         PGNAME1 VARCHAR2(30),
         PGSIGDATE1 VARCHAR2(10),
         FAMSIZE1 VARCHAR2(2),
         FAMINC1 VARCHAR2(6),
         PGSIG2 VARCHAR2(1),
         PGNAME2 VARCHAR2(30),
         PGSIGDATE2 DATE,
         FAMSIZE2 VARCHAR2(2),
         FAMINC2 VARCHAR2(4),
         GRADE NUMBER(2),
         SCHOOL VARCHAR2(40),
         RACE VARCHAR2(4),
         MEDI_CALID VARCHAR2(15),
         MEDSTATUS VARCHAR2(1),
         ADULT1NAME VARCHAR2(40),
         ADULT1TYPE VARCHAR2(1),
         ADULT1INC1 VARCHAR2(5),
         ADULT1INC2 VARCHAR2(5),
         ADULT1INC3 VARCHAR2(5),
         ADULT1INC4 VARCHAR2(5),
         ADULT2NAME VARCHAR2(40),
         ADULT2TYPE VARCHAR2(1),
         ADULT2INC1 VARCHAR2(5),
         ADULT2INC2 VARCHAR2(5),
         ADULT2INC3 VARCHAR2(5),
         ADULT2INC4 VARCHAR2(5),
         ADULT3NAME VARCHAR2(40),
         ADULT3TYPE VARCHAR2(1),
         ADULT3INC1 VARCHAR2(5),
         ADULT3INC2 VARCHAR2(5),
         ADULT3INC3 VARCHAR2(5),
         ADULT3INC4 VARCHAR2(5),
         ADULT4NAME VARCHAR2(40),
         ADULT4TYPE VARCHAR2(1),
         ADULT4INC1 VARCHAR2(5),
         ADULT4INC2 VARCHAR2(5),
         ADULT4INC3 VARCHAR2(5),
         ADULT4INC4 VARCHAR2(5),
         ADULT5NAME VARCHAR2(40),
         ADULT5TYPE VARCHAR2(1),
         ADULT5INC1 VARCHAR2(5),
         ADULT5INC2 VARCHAR2(5),
         ADULT5INC3 VARCHAR2(5),
         ADULT5INC4 VARCHAR2(5),
         ADULT6NAME VARCHAR2(40),
         ADULT6TYPE VARCHAR2(1),
         ADULT6INC1 VARCHAR2(5),
         ADULT6INC2 VARCHAR2(5),
         ADULT6INC3 VARCHAR2(5),
         ADULT6INC4 VARCHAR2(5),
         AGE1LT19 VARCHAR2(1),
         AGE2LT19 VARCHAR2(1),
         AGE3LT19 VARCHAR2(1),
         AGE4LT19 VARCHAR2(1),
         AGE5LT19 VARCHAR2(1),
         AGE6LT19 VARCHAR2(1),
         MIDINIT1 VARCHAR2(1))
    organization external
    ( type oracle_loader
      default directory data_dir
      access parameters
      ( fields terminated by '','' )
      location (''DataJuly07.txt'')
    )';
    execute immediate ddl2;
    end;
    Error Received:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout; ORA-30653: reject limit reached
    Please help ASAP. I am new to Oracle and APEX. Any help will be greatly appreciated.

    I downloaded the External Table simple application from the APEX packaged applications and installed it. It installed successfully, but when I run it, it doesn't load the data even though I get a message stating it was successful. I am running it on Oracle XE with APEX 3.0.1.
    In addition, I tried running the stored procedure directly (ext_employees_load) in SQL Developer and received ORA-30648.
    Please help.
    thanks,
    Tom

  • Unable to load the data into HFM

    Hello,
    We created a new HFM application and configured it with FDM, generated an output file through FDM and loaded that file directly through HFM 5-6 times; there was no issue up to this point.
    Then I loaded the file through FDM 4 times successfully, even for different months. But after 4 loads I started getting an error. The error log is attached.
    Please help us as soon as possible.
    ** Begin fdmFM11XG6A Runtime Error Log Entry [2013-10-30-13:44:26] **
    Error:
    Code............-2147217873
    Description.....System.Runtime.InteropServices.COMException (0x80040E2F): Exception from HRESULT: 0x80040E2F
       at HSVCDATALOADLib.HsvcDataLoadClass.Load(String bstrClientFilename, String bstrClientLogFileName)
       at fdmFM11XG6A.clsFMAdapter.fDBLoad(String strLoadFile, String strErrFile, String& strDelimiter, Int16& intMethod, Boolean& blnAccumFile, Boolean& blnHasShare, Int16& intMode)
    Procedure.......clsHPDataManipulation.fDBLoad
    Component.......E:\Opt\Shared\Apps\Hyperion\Install\Oracle\Middleware\EPMSystem11R1\products\FinancialDataQuality\SharedComponents\FM11X-G6-A_1016\AdapterComponents\fdmFM11XG6A\fdmFM11XG6A.dll
    Version.........1116
    Identification:
    User............fdmadmin
    Computer Name...EMSHALGADHYFD02
    FINANCIAL MANAGEMENT Connection:
    App Name........
    Cluster Name....
    Domain............
    Connect Status.... Connection Open
    Thanks,
    Raam

    We are working with the DB team, but they have confirmed that there is no issue with the TB. The process we have followed:
    As a standard process, while loading data from FDM or manually into HFM, we don't write any SQL query; using the web interface, the data is loaded into the HFM application. This data can be viewed with different reporting tools (Smart View (Excel), HFR reports, etc.).
    There are no official documents on the Oracle website that talk about an INSERT SQL query used to insert data into HFM tables. Hyperion does not provide much detail on the internal tables it uses, nor much insight into the internal structure of the HFM system.
    According to Hyperion blogs and forums, HFM stores the base-level data in so-called DCE tables (for example EMHFMFinal_DCE_1_2013, where EMHFMFinal is the application name, 1 identifies the scenario and 2013 the year). Each row in the DCE table contains data for all periods of a given combination of dimensions (also called an intersection).
    We are trying to load the same data file with the replace option (it should delete the existing data before loading the data file).

  • Unable to load the data into Cube Using DTP in the quality system

    Hi,
    I am unable to load data from the PSA to the cube using a DTP in the quality system for the first time.
    I am getting errors like "Data package processing terminated" and "Source TRCS 2LIS_17_NOTIF is not allowed".
    Please suggest.
    Thanks,
    Satyaprasad

    Hi,
    Some InfoObjects were missing when the transport was collected.
    I collected those objects and transported them; now it is working fine.
    Many thanks to all.
    Regards,
    Satyaprasad

  • Error Loading Transactional Data into Cube(0PCA_C01)

    Hi Guys,
    I am trying to install the following cubes from Business Content: 0PCA_C01 / 0PCA_C02 (Profit Center Analysis). Everything got replicated, and I am now trying to load transaction data. I created an InfoPackage and loaded the data. It has been running for a long time and still says "not yet completed/warning". If I try to see the content of the cube, I get the following errors:
    "Your user master record is not sufficiently maintained for object Authorization Object3.
    System error: RSDRC / FORM AUTHORITY_CHECK USER NOT AUTHORIZED 0PCA_C01 0PCA_C01
    System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET ERROR IN RSDRC_BASIC_QUERY_DATA_GET 0PCA_C01 64
    System error: RSDRC / FORM DATA_GET ERROR IN RSDRC_BASIC_CUBE_DATA_GET 0PCA_C01 64"
    Also, if I try to change something in the InfoPackage, it says "Init. select. for field name  currently running in". I guess that is because the job is still running.
    Please let me know if I missed something.
    Raj

    Hi Raj,
    This seems to be an authorization issue.
    I guess you are in the BW development system.
    Go to SU01, enter your user ID and display it, then go to the Roles and Profiles tabs.
    Ideally, in development a developer's ID should have access to all development activities, so check with the Basis folks and get access to the relevant profiles. If you can get SAP_ALL, then perfect!
    Prakash
    Assigning points is a way of saying thanks on SDN!
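
    For reference, the failing check reported by RSDRC ("FORM AUTHORITY_CHECK USER NOT AUTHORIZED") is an ordinary ABAP authority check on the InfoCube reporting authorization. A minimal sketch of that kind of check, assuming the commonly used authorization object S_RS_ICUBE and its RSINFOCUBE/ACTVT fields (treat these names as assumptions), can help verify a user ID before chasing roles:

    REPORT z_check_cube_auth.
    " Sketch only: checks whether the current user may display InfoCube data.
    " Authorization object and field names are assumptions based on common usage.
    AUTHORITY-CHECK OBJECT 'S_RS_ICUBE'
      ID 'RSINFOCUBE' FIELD '0PCA_C01'
      ID 'ACTVT'      FIELD '03'.                     "03 = display
    IF sy-subrc <> 0.
      WRITE: / 'User', sy-uname, 'is not authorized to display 0PCA_C01.'.
    ELSE.
      WRITE: / 'Authorization check passed for user', sy-uname.
    ENDIF.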

  • Stage tab delimited CSV file and load the data into a different table

    Hi,
    I am pretty new to writing PL/SQL packages.
    We are using Application Express for our development. We get CSV files which are stored as BLOB content in a table. I need to write a trigger that executes once the user uploads the file, parses through the BLOB content, and loads or stages the data in a different table.
    I would like to know if there is a tutorial or article that explains the above process, with an example or sample code. Any help in this regard will be highly appreciated.

    Hi,
    This is slightly unusual, but at the same time easy to solve. You can read through a BLOB using the dbms_lob package, which is one of the Oracle-supplied packages. This is presumably the bit you are missing, as once you know how to read a LOB the rest is programming 101.
    Alternatively, you could write the LOB out to a file on the server using another built-in package called utl_file. That file can then be parsed using an appropriately defined external table. External tables are the easiest way of reading data from flat files, including CSV.
    I say unusual because, why are you loading a CSV file into a BLOB? A CLOB is almost understandable, but if you can load into a column in a table, why not skip this step and load the data as it comes in straight into the right table?
    All of what I have described is documented functionality, assuming you are on 9i or greater. But you didn't provide a version, so I can't provide a link to the documentation ;)
    HTH
    Chris
