Loading CLOB data to a flat file in OWB 11.2.0.3

Hi,
Also, I have one more question. I want to load a table containing CLOB data into a flat file using my OWB 11.2.0.3 client. But when I run this simple mapping, I get the following error:
ORA-06502: PL/SQL: numeric or value error: character string buffer too small
Please suggest what needs to be done.

I was up all night working on this because I had a deadline...
It turned out my mapping was corrupted.
If you do a series of synchronizations by object name, position, and ID, the mapping can end up corrupted.
I had to delete the staging table operator within the mapping, drag and drop it back in, and reconnect it, and now my loads work just fine.
Thanks

Similar Messages

  • Unload CLOB data to a flat file

    Hi,
    Does anybody know how to unload CLOB data to a flat file? Can you please give an example? Is there any utility in Oracle to unload data to a flat file? Do we have to use the concatenation operator to unload the data to a flat file?
    Any help would be appreciated.
    regards,
    gopu

    I don't know if there's an easy way, but you might want to try using the packages UTL_FILE and DBMS_LOB.
    Hope that helps.
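    A minimal PL/SQL sketch along those lines, assuming a directory object EXP_DIR already exists and using made-up table/column names (DOCS, DOC_ID, DOC_TEXT):
    DECLARE
      l_file UTL_FILE.FILE_TYPE;
      l_clob CLOB;
      l_amt  PLS_INTEGER := 8000;  -- chunk size per write, kept well under the 32767 line limit
      l_pos  INTEGER := 1;
      l_len  INTEGER;
    BEGIN
      SELECT doc_text INTO l_clob FROM docs WHERE doc_id = 1;
      l_len  := DBMS_LOB.GETLENGTH(l_clob);
      l_file := UTL_FILE.FOPEN('EXP_DIR', 'doc_1.txt', 'w', 32767);
      WHILE l_pos <= l_len LOOP
        UTL_FILE.PUT(l_file, DBMS_LOB.SUBSTR(l_clob, l_amt, l_pos));
        UTL_FILE.FFLUSH(l_file);  -- flush between chunks so the line buffer stays within its limit
        l_pos := l_pos + l_amt;
      END LOOP;
      UTL_FILE.FCLOSE(l_file);
    END;
    /
    Note that UTL_FILE writes to a directory on the database server, not on the client machine.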

  • Steps to load the data by using flat file for hierarchies in BI 7.0

    Hi Gurus,
    I need the steps to load data using a flat file for hierarchies in BI 7.0.

    Hi,
    You will find the steps in the following blog by Prakash Bagali:
    Hierarchy Upload from Flat files
    Regards,
    Rathy

  • Loading Master Data from a Flat File

    Hi Gurus, I have flat files of different SAP tables (transactional data tables and master data), and I want to upload these flat files to a DSO. My question is: in the case of master data files, for example SAP table USR03 (User Address Data) in flat file format, is it better to create a master data text and attributes DataSource, or to create it as a transactional data DataSource, in order to upload it to a DSO?

    Hi
    A master data DataSource should be used to load master data such as customer details. Customer data is master data that does not change frequently. A customer InfoObject may have attributes like customer name, customer contact number, customer address, etc., and we maintain these attributes and text data.
    Transaction data is something that changes very frequently, like the price of a material, sales, etc. For loading such data we use transaction DataSources.
    As mentioned above, changes to address data are rare, so you should not use a transactional DataSource.
    Regards, Rahul
    Edited by: Rahul Pant on Feb 7, 2011 12:24 PM

  • What is the best way to load and convert data from a flat file?

    Hi,
    I want to load data from a flat file, convert dates, numbers and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
    The rows where all to_number, to_date and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises some exception) should go into table STG_ERR.
    What is the best and easiest way to achieve this?
    Thanks,
    Carsten.

    Hi,
    thanks for your answers so far!
    I gave them a thought and came up with two different alternatives:
    Alternative 1
    I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
    The columns of the staging table have the target format (date, number).
    The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
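    For illustration, a minimal control file along these lines (file, table, and column names are made up; the DECODE covers the 0/1 into N/Y case) could look like:
    LOAD DATA
    INFILE 'stg.dat'
    BADFILE 'stg.bad'
    TRUNCATE
    INTO TABLE stg_ok
    FIELDS TERMINATED BY ';' TRAILING NULLCOLS
    ( order_date   DATE "YYYY-MM-DD"
    , amount       DECIMAL EXTERNAL
    , active_flag  CHAR "DECODE(:active_flag, '1', 'Y', 'N')"
    )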
    Alternative 2
    The columns of the staging table are all of type varchar2 regardless of the target format.
    I define data rules for all columns that require a later conversion.
    I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
    The rows that cannot be loaded go automatically into the error table.
    When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
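    And a bare-bones external table for this alternative (all columns VARCHAR2; the directory object, file, and column names are made up) could look like:
    CREATE TABLE stg_raw (
      order_date  VARCHAR2(20),
      amount      VARCHAR2(30),
      active_flag VARCHAR2(5)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY stg_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        BADFILE 'stg.bad'
        FIELDS TERMINATED BY ';' MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('stg.dat')
    )
    REJECT LIMIT UNLIMITED;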
    What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
    Further, I would prefer using expressions in the mapping for converting the data.
    What I dislike in alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case of changes of the file format).
    I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format. But that's another mapping and a lot of i/o.
    As far as I know I need the data quality option for using data rules, is that true?
    Is there another alternative without any of these drawbacks?
    Otherwise I think I will go for alternative 1.
    Thanks,
    Carsten.

  • Step by Step details on how to load data from a flat file

    Hi, can anyone explain how to load data from a flat file? Please give me step-by-step details. Thanks.

    Hi Sonam,
    It is very easy to load data from a flat file compared with other extraction methods. Here are the steps to load transaction data from a flat file:
    Step 1: Create the flat file.
    Step 2: Log on to SAP BW (transaction code RSA1 or RSA13) and note the flat file source system icon (the PC icon).
    Step 3: Create the required InfoObjects.
    3.1: Create an InfoArea (under Modeling > InfoObjects (root node) > context menu > Create InfoArea).
    3.2: Create characteristic/key figure InfoObject catalogs (select the InfoArea > context menu > Create InfoObject Catalog).
    3.3: Create the characteristic and key figure InfoObjects according to your requirement and activate them.
    Step 4: Create an InfoSource for transaction data, create the transfer structure, and maintain the communication structure.
    4.1: First create an application component (select InfoSources under Modeling (root node) > context menu > Create Application Component).
    4.2: Create the InfoSource for transaction data (select the application component > context menu > Create InfoSource), select "Flexible Update", give the InfoSource a name, and continue.
    4.4: (Important) Assign the DataSource (expand the application component > expand your InfoSource > context menu > Assign DataSource). For Source System, browse and choose your flat file source system (the PC icon) and continue.
    4.5: Define the DataSource/transfer structure for the InfoSource: select the Transfer Structure tab and fill in the InfoObject fields with the necessary InfoObjects in the sequence of the flat file structure.
    4.6: Assign transfer rules: select the Transfer Rules tab and choose Propose Transfer Rules (the spindle-like icon). If the data target is an ODS, include 0RECORDMODE in the communication structure. Then activate.
    Step 5: Create the data target (InfoCube/ODS object).
    5.1: Create the InfoCube (select your InfoArea > context menu > Create InfoCube).
    5.2: Create the InfoCube structure: fill the structure pane with the required InfoObjects (select the InfoSource icon, find your InfoSource, double-click, and select "Yes" for InfoObject assignment). Maintain at least one time characteristic (e.g. 0CALDAY).
    5.3: Define and assign dimensions for your characteristics, then activate.
    Step 6: Create update rules for the InfoCube using the InfoSource (select your InfoCube > context menu > Create Update Rules, enter your InfoSource as the DataSource, press Enter, and activate the update rules). You can now see the data flow.
    Step 7: Schedule/load the data.
    7.1: Create an InfoPackage (select InfoSources under Modeling > expand your application component > expand your InfoSource > select the DataSource assignment icon > context menu > Create InfoPackage). Give the InfoPackage a description, select your DataSource, and continue.
    7.2: On the External Data tab, select Client Workstation or Application Server, and give the file name, file type, and data separator.
    7.3: On the Processing tab, select "PSA and then into Data Targets".
    7.4: On the Data Targets tab, select the data targets and tick the "Update Data Target" checkbox.
    7.5: On the Update tab, select Full Update.
    7.6: On the Schedule tab, select "Start Data Load Immediately" and press the Start button.
    Step 8: Monitor the data: check the data in the PSA and check the data in the data targets.
    I hope this will help you.

  • Data misplaced during Flat file data load

    Hi all,
    I have to load data from a flat file (txt format), but I found that some records are not populating in the PSA.
    The volume of data is very large, and the data load shows green in the InfoPackage monitor. How do I resolve this?

    Hi,
    How did you conclude that records are missing from the PSA? Is it because the count is lower than expected?
    There is a chance that records are duplicated in the flat file.
    Perhaps you can increase the number of records per data package in the settings and check the count.
    If you feel a particular record is not getting loaded, load that record selectively through a filter.
    Split the load as convenient so that it is easy to keep track of the records.
    Thanks.

  • How to load data from a flat file which is on the application server

    Hi All,
    How do I load data from a flat file which is on the application server?

    Hi,
    Firstly, you will need to place the file(s) in the AL11 path. Then, in your InfoPackage on the "Extraction" tab, you need to select the "Application Server" option. Then specify the path and the exact file you want to load using the browse button.
    If your file name changes on a daily basis, e.g. name_ddmmyyyy.csv, then on the Extraction tab you have the option to write an ABAP routine that generates the file name. There you will need to append sy-datum to "name" and then append ".csv" to generate the complete file name.
    Please let me know if this is helpful or if you need any more inputs.
    Thanks & Regards,
    Nishant Tatkar.

  • Error Loading Data into a flat file

    I am receiving the following error when loading data into a flat file from a flat file. SQL Server 2005 is my back-end DB. If I cut the file in half (approx. 500K rows) my ODI interface works fine. Not sure what to look at. I rebuilt the interface, which before was just dying without giving any error, and now I am getting this.
    Thanks
    java.lang.Exception
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)

    Figured it out; I found a similar post that suggested changing the heap size.
    Increase the heap size in odiparams.bat in the bin folder and restart Designer.
    For example:
    set ODI_INIT_HEAP=128m
    set ODI_MAX_HEAP=1024m

  • Error while uploading data from a flat file to the hierarchy

    Hi guys,
    After I upload data from a flat file to the hierarchy, I get the error message "Please select a valid info object". I am loading data using the PSA and have activated all external characteristics, but I still get the problem. Some help on this please.
    regards
    Sri

    There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
    Please check the objects in BW, their lengths and types, and check whether your flat file has the same types.
    Then check the sequence of the objects in the transfer rules and activate them.
    There you go.

  • Export data to a flat file

    Hi all,
    I need to export a table's data to a flat file.
    The problem is that the data is huge: about 200 million rows occupying around 60 GB of space.
    If I use SQL*Loader in Toad, the export takes a huge amount of time.
    After a few months, I need to import the same data back into the table.
    So please help me find the most efficient and least time-consuming method to do this.
    I am very new to this field.
    Can someone help me with this?
    My Oracle database version is 10.2.
    Thanks in advance

    OK, so first of all I would ask the following questions:
    1. Why must you export the data and then re-import it a few months later?
    2. Have you read through the documentation for SQL*Loader thoroughly? I know it reads like stereo instructions, but it has valuable information that will help you.
    3. Does the table the data is being re-imported into have anything attached to it, e.g. triggers, indexes, or anything else the DB must do for each record? If so, re-read the SQL*Loader documentation, as you can turn all of that off during the import and re-index, etc. at your leisure (see the sketch at the end of this reply).
    I ask these questions because:
    1. I would find a way to accomplish whatever needs to happen to this data while it stays in the DB.
    2. Pumping data over the wire is going to be slow when you are talking about that kind of volume.
    3. If you insist that the data must be dumped, massaged, and re-imported, do it on the DB server. Disk I/O is an order of magnitude faster than an over-the-wire transfer.
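    To make point 3 of the first list concrete, here is a rough sketch with hypothetical table and index names (check the SQL*Loader reference for direct path loading and SKIP_INDEX_MAINTENANCE before relying on this):
    -- Hypothetical object names; run before the bulk re-import.
    ALTER TABLE big_tab DISABLE ALL TRIGGERS;
    ALTER INDEX big_tab_ix UNUSABLE;  -- unique/primary key indexes need extra care; see the docs
    -- ... run the SQL*Loader import here ...
    -- Afterwards, rebuild and re-enable at your leisure.
    ALTER INDEX big_tab_ix REBUILD;
    ALTER TABLE big_tab ENABLE ALL TRIGGERS;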

  • Extract PSA Data to a Flat File

    Hello,
    I would like to download PSA data into a flat file, so I can load it into SQL Server and run SQL statements on it for analysis purposes. I have tried creating a PSA export DataSource; however, I cannot find a way to cleanly get the data into a flat structure for analysis and/or download.
    Can anyone suggest options for doing this?
    Thanks in advance for your help. 
    Sincerely,
    Sonya

    Hi Sonya,
    In the PSA screen try pressing Shift+F8. If this does not bring up the file type options, you can try the following: Settings > Change Display Variants > View tab > Microsoft Excel > click SAP_OM.xls and then the green check mark. The data will be displayed in Excel format, which you can save to your PC.
    Hope this helps...

  • Data from a Flat file into a Cube

    Hi experts!
    Just a quick one: can we load data from a flat file directly into an InfoCube?
    Or would we need to create an ODS, load the flat file data there, and later load from the ODS to the cube?
    Thank you very much for your time!

    Hi,
    You can load the flat file data directly into the InfoCube. This should not be a problem.
    Create a flat file DataSource according to the structure of the flat file, then create a transformation and DTP to the cube.
    Load the flat file data into the PSA and then use the DTP to load the request into the cube.
    If the flat file load is a full, one-time load, and the data has to be deleted and reloaded on the next load, then the above approach is fine.
    But if your load is a daily delta load, then it would be appropriate to have a DSO in between in the data flow.
    Hope it helps.

  • Rejecting the Data to a flat file

    Hi,
    I want to reject bad data to a flat file. For example, on a primary key violation the bad data should be written to a flat file, instead of raising the error and stopping my mapping job (after the error limit of 50).
    Can I do this? Please do help me.
    Thanks,
    Harsha

    Hi,
    What is your actual requirement?
    If you do not want the mapping to fail after raising 50 errors, then configure your mapping and increase the value of "Maximum number of errors" under the runtime parameters.
    If you actually want to capture all the PK violations in a flat file, there is no direct way of doing this.
    Either you have to query the runtime (RT) repository views (write your own SQL) to produce a file, OR:
    while in the Mapping Configure window,
    expand Sources and Targets,
    expand Constraint Management,
    set Enable Constraints to FALSE,
    and set Exceptions Table Name to <your own exceptions table>.
    Have a look in the Oracle documentation on constraint management. That should give you an idea of how to create your own exceptions table, etc.
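    If you go the exceptions table route, the structure is the one the standard utlexcpt.sql script creates; a sketch (pick whatever table name you like):
    -- Same shape as Oracle's utlexcpt.sql exceptions table.
    CREATE TABLE my_exceptions (
      row_id      ROWID,
      owner       VARCHAR2(30),
      table_name  VARCHAR2(30),
      constraint  VARCHAR2(30)
    );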
    HTH
    mahesh

  • Error while uploading data from a flat file

    Hi All,
    I am trying to load data from a flat file into an ODS. The flat file contains a field, say FLAT01, of numerical type (type Decimal, length 18, decimals 2). I have created an InfoObject, say ZIO10, with type CURR and currency field 0CURRENCY. In the transfer rules, I have assigned a constant for 0CURRENCY, as the flat file doesn't have any currency field.
    The problem is that the flat file doesn't have any value in that field, FLAT01, for some records, in fact most of the records. When I try to load it into BW from the application server, it throws the error "Contents of field ZIO10 cannot be converted to type CURR ->longtext".
    I have debugged the transfer rules and found that the empty value is being represented as '#', and when it tries to copy it into the CURR type field, it throws the error.
    Can someone please let me know how to solve this? Why is it using '#' for an empty value? Can't we change that?
    Any help would be highly appreciated.
    Best Regards,
    James.
    Message was edited by:
            James Helsinki

    Hi SS,
    I am sorry that I was not clear in my explanation. I have created ZIO10 as a key figure with type AMOUNT and the currency as 0CURRENCY.
    There is no currency coming in from the flat file so I used a constant in transfer rules to fill 0CURRENCY.
    The field which is coming as '#' is FLAT01 which is assigned to ZIO10.
    Best Regards,
    James.
