Loading data from a CLOB Parameter

Hi,
we have a front end app, in .NET, that passes a file as a CLOB parameter to an Oracle procedure. What is the best way to load the content of the CLOB parameter into an Oracle table? Can SQL*Loader be used in this case? If not, how about the UTL_FILE utility?
Thanks in advance,
Marc

LAF wrote:
we have a front end app, in .NET, that passes a file as a CLOB parameter to an Oracle procedure. What is the best way to load the content of the CLOB parameter into an Oracle table? Can SQL*Loader be used in this case? If not, how about the UTL_FILE utility?
Both of these, SQL*Loader and UTL_FILE, work on external files in an o/s file system. The CLOB is a column value in a row in a database table.
If the CLOB is XML, for example, it can be parsed directly into an XML DOM in Oracle. Other formats require one of two basic approaches.
The first option is to write a parser in PL/SQL that reads the CLOB line by line, tokenises and parses it, and transforms it into whatever structured output is required. A pipelined table function can be used for this approach, allowing the parsing and transformation PL/SQL function to be treated as if it were a SQL table that a SQL SELECT can be run against.
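For illustration, a minimal sketch of this first approach: a pipelined table function that splits a CLOB parameter into lines (the type and function names are hypothetical, and LF line endings are assumed):
CREATE OR REPLACE TYPE t_line_tab AS TABLE OF VARCHAR2(4000);
/
CREATE OR REPLACE FUNCTION clob_lines( p_clob IN CLOB )
  RETURN t_line_tab PIPELINED IS
  l_pos PLS_INTEGER := 1;
  l_eol PLS_INTEGER;
  l_len PLS_INTEGER := DBMS_LOB.GETLENGTH( p_clob );
BEGIN
  WHILE l_pos <= l_len LOOP
    -- find the next line terminator
    l_eol := DBMS_LOB.INSTR( p_clob, CHR(10), l_pos );
    IF l_eol = 0 THEN
      l_eol := l_len + 1;  -- last line has no trailing LF
    END IF;
    PIPE ROW( DBMS_LOB.SUBSTR( p_clob, l_eol - l_pos, l_pos ) );
    l_pos := l_eol + 1;
  END LOOP;
  RETURN;
END;
/
Each line can then be tokenised and loaded with an ordinary INSERT, for example: INSERT INTO target_table( line_text ) SELECT column_value FROM TABLE( clob_lines( :p_clob ) ).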
The second option is to unload the CLOB into an actual external o/s file and run whatever process is needed against that file to read, parse and process it.
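A minimal sketch of this second option, writing the CLOB out to an o/s file with UTL_FILE (the DATA_DIR directory object and file name are hypothetical; the directory must point at a folder on the database server):
CREATE OR REPLACE PROCEDURE unload_clob( p_clob IN CLOB ) IS
  l_file UTL_FILE.FILE_TYPE;
  l_pos  PLS_INTEGER := 1;
  l_buf  VARCHAR2(32767);
  l_len  PLS_INTEGER := DBMS_LOB.GETLENGTH( p_clob );
BEGIN
  -- byte mode ('wb') avoids UTL_FILE's 32K line-length limit
  l_file := UTL_FILE.FOPEN( 'DATA_DIR', 'unloaded.txt', 'wb', 32767 );
  WHILE l_pos <= l_len LOOP
    -- 8191 characters is at most 32764 bytes, even in a 4-byte character set
    l_buf := DBMS_LOB.SUBSTR( p_clob, 8191, l_pos );
    UTL_FILE.PUT_RAW( l_file, UTL_RAW.CAST_TO_RAW( l_buf ) );
    l_pos := l_pos + LENGTH( l_buf );
  END LOOP;
  UTL_FILE.FCLOSE( l_file );
END;
/
The unloaded file can then be processed with SQL*Loader or an external table, exactly as with any other o/s file.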
Which basic approach suits best depends entirely on the requirement itself.

Similar Messages

  • DTP load data from DSO to CUBE can't debug

    Hi, all experts.
    I met a problem yesterday.
    When I load data from the datasource to the DSO and debug the start routine, it is OK. But when I load data from the DSO to the CUBE and debug the routine, I get an error like this:
    inconsistent input parameter(parameter:<unknow>,value <unknown>)
    message no: RS_EXCEPTION 101.
    I don't know why. Please help me, thank you very much.

    Hi,
    An alternative way to do this is to add a small piece of code to your start routine that creates an infinite loop. When you start the load, go to SM50 and debug the work process that is executing the load; from there you can break out of the infinite loop and carry on with the debugging. Please let me know if you require further information.
    Thanks,
    Arminder

  • Problem in Loading Data from SQL Server 2000 to Oracle 10g

    Hi All,
    I am a university student and using ODI for my final project on real-time data warehousing. I am trying to load data from MS SQL Server 2000 into an Oracle 10g target table. Everything goes fine until I execute the interface for the initial load. When I choose the CKM Oracle (Create unique index on the I$ table) KM, the following step fails:
    21 - Integration - Prj_Dim_RegionInterface - Create Unique Index on flow table
    Where Prj_Dim_Region is the name of my target table in Oracle.
    The error message is:
    955 : 42000 : java.sql.SQLException: ORA-00955: name is already used by an existing object
    java.sql.SQLException: ORA-00955: name is already used by an existing object
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
         at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
         at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
         at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
         at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.i(e.java)
         at com.sunopsis.dwg.cmd.g.y(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    I am using a surrogate key column in my target table along with the natural key. The natural key is populated by the primary key of my source table, but for the surrogate key I have created a sequence in the Oracle schema where the target table exists, and have used the following code for the mapping:
    <%=snpRef.getObjectName( "L" , "SQ_PRJ_DIM_REGION" , "D" )%>.nextval
    I have chosen to execute this code on target.
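    For what it's worth, judging by the schema name visible later in the post, that substitution should resolve at execution time to something like the following (the sequence must already exist in the target schema):
    CREATE SEQUENCE NOVELTYFURNITUREDW.SQ_PRJ_DIM_REGION;
    -- the getObjectName() call then expands to roughly:
    -- NOVELTYFURNITUREDW.SQ_PRJ_DIM_REGION.nextval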
    Among my attempts to solve this problem was to set the Create Index option of the CKM Oracle (Create Index for the I$ Table) to No, so that it won't create any index on the flow table. I also tried to use the simple CKM Oracle KM. Both solutions allowed the interface to execute successfully without any errors, but the data was not loaded into the target table.
    When I right-click on the Prj_Dim_Region data store and choose Data, it shows empty. Pressing the SQL button in this data store shows a "New Query" dialog box where I see this query:
    select * from NOVELTYFURNITUREDW.PRJ_DIM_REGION
    But when I press OK to run it, I get this error message:
    java.lang.IllegalArgumentException: Row index out of range
         at javax.swing.JTable.boundRow(Unknown Source)
         at javax.swing.JTable.setRowSelectionInterval(Unknown Source)
         at com.borland.dbswing.JdbTable.accessChange(JdbTable.java:2959)
         at com.borland.dx.dataset.AccessEvent.dispatch(Unknown Source)
         at com.borland.jb.util.EventMulticaster.dispatch(Unknown Source)
         at com.borland.dx.dataset.DataSet.a(Unknown Source)
         at com.borland.dx.dataset.DataSet.a(Unknown Source)
         at com.borland.dx.dataset.DataSet.a(Unknown Source)
         at com.borland.dx.dataset.DataSet.open(Unknown Source)
         at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
         at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
         at com.borland.dx.sql.dataset.QueryDataSet.executeQuery(Unknown Source)
         at com.sunopsis.graphical.frame.a.cg.actionPerformed(cg.java)
         at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
         at javax.swing.AbstractButton$ForwardActionEvents.actionPerformed(Unknown Source)
         at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
         at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
         at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
         at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
         at java.awt.Component.processMouseEvent(Unknown Source)
         at java.awt.Component.processEvent(Unknown Source)
         at java.awt.Container.processEvent(Unknown Source)
         at java.awt.Component.dispatchEventImpl(Unknown Source)
         at java.awt.Container.dispatchEventImpl(Unknown Source)
         at java.awt.Component.dispatchEvent(Unknown Source)
         at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
         at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
         at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
         at java.awt.Container.dispatchEventImpl(Unknown Source)
         at java.awt.Window.dispatchEventImpl(Unknown Source)
         at java.awt.Component.dispatchEvent(Unknown Source)
         at java.awt.EventQueue.dispatchEvent(Unknown Source)
         at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
         at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
         at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
         at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
         at java.awt.EventDispatchThread.run(Unknown Source)
    I do not understand what the problem is and have wasted days trying to figure it out. Any help will be highly appreciated, as my deadline for this project is very close.
    Thank you so much in advance.
    Neel

    Hi Cezar,
    Can you please help me with this scenario?
    I have one Oracle data model with 19 source tables and one SQL Server data model with 10 target tables. I have created 10 interfaces which use JournalizedDataOnly on one of the tables in the interface; e.g. in the interface for the DimCustomer target table I have 2 tables, namely Customer and Address, but the journalizing filter appears only on the Customer table and this option is disabled for Address automatically.
    Now I want to create a package using OdiWaitForLog event detection. Is it possible to put all these 10 interfaces in just one package to populate the target tables? It works fine when I have only one interface and I use the name of one table in the interface for the Table Name parameter of the OdiWaitForLogData event, but when I try a comma-separated list of table names [Customer, Address] this error happens:
    java.sql.SQLException: ORA-00942: table or view does not exist
    and if I use the method <%=odiRef.getObjectName("L","model_code","logical_schema","D")%>, I get this error:
    "-CDC_SET_NAME=Exception getObjectName("L", "model_code", "logical_schema", "D") : SnpLSchema.getLSchemaByName : SnpLschema does not exist"
    Please let me know how to make this work.
    Do I need to create separate data models, each including only those tables which appear in their corresponding interface and package? Or do I need to create multiple packages, each with only one journalized interface, to populate only one target table?
    Thank you for your time in advance.
    Regards,
    Neel

  • Load data from File on Client side (via Sqlplus)

    Server OS: RedHat, Oracle 10g rel 2.
    I am trying to load data from o/s .txt files into a CLOB field.
    I am able to do this successfully using:
    an Oracle DIRECTORY
    a BFILE
    the DBMS_LOB.LOADCLOBFROMFILE package
    The issue is: this only works if my files and the DIRECTORY are on the database server.
    Is it not possible to "load clob from file" from the client side, with the files residing on the client, executing the command via SQL*Plus? Is there any other option to load data from the client side?
    Thanks for any help.
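    For reference, a minimal sketch of the server-side approach described above (directory path, file, table and column names are hypothetical; the DIRECTORY must point at a folder on the database server):
    CREATE DIRECTORY data_dir AS '/u01/app/load_files';
    DECLARE
      l_bfile    BFILE := BFILENAME( 'DATA_DIR', 'input.txt' );
      l_clob     CLOB;
      l_dest_off INTEGER := 1;
      l_src_off  INTEGER := 1;
      l_ctx      INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
      l_warn     INTEGER;
    BEGIN
      -- create the row first, then fill its CLOB column from the file
      INSERT INTO doc_table( id, doc ) VALUES ( 1, EMPTY_CLOB() )
      RETURNING doc INTO l_clob;
      DBMS_LOB.OPEN( l_bfile, DBMS_LOB.LOB_READONLY );
      DBMS_LOB.LOADCLOBFROMFILE( dest_lob => l_clob, src_bfile => l_bfile,
          amount => DBMS_LOB.LOBMAXSIZE, dest_offset => l_dest_off,
          src_offset => l_src_off, bfile_csid => DBMS_LOB.DEFAULT_CSID,
          lang_context => l_ctx, warning => l_warn );
      DBMS_LOB.CLOSE( l_bfile );
      COMMIT;
    END;
    /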

    Options:
    1) Search for OraDAV
    2) Learn about Application Express.

  • SQL*Loader not loading data from CSV format into an Oracle table

    Hi,
    I am trying to load data from flat files into an Oracle table, but it is not loading decimal values. Only character data is loaded and the number columns are null.
    CONTROL FILE:
    LOAD DATA
    INFILE cost.csv
    BADFILE consolidate.bad
    DISCARDFILE Sybase_inventory.dis
    INSERT
    INTO TABLE FIT_UNIX_NT_SERVER_COSTS
    FIELDS TERMINATED BY ',' optionally enclosed by '"'
    TRAILING NULLCOLS
    (
    HOST_NM,
    SERVICE_9071_DOLLAR FLOAT,
    SERVICE_9310_DOLLAR FLOAT,
    SERVICE_9700_DOLLAR FLOAT,
    SERVICE_9701_DOLLAR FLOAT,
    SERVICE_9710_DOLLAR FLOAT,
    SERVICE_9711_DOLLAR FLOAT,
    SERVICE_9712_DOLLAR FLOAT,
    SERVICE_9713_DOLLAR FLOAT,
    SERVICE_9720_DOLLAR FLOAT,
    SERVICE_9721_DOLLAR FLOAT,
    SERVICE_9730_DOLLAR FLOAT,
    SERVICE_9731_DOLLAR FLOAT,
    SERVICE_9750_DOLLAR FLOAT,
    SERVICE_9751_DOLLAR FLOAT,
    GRAND_TOTAL FLOAT
    )
    In the table, FLOAT has been replaced by NUMBER(20,20).
    Sample input from the CSV:
    ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
    Only ABOS12 is loaded; the rest of the numbers are not loaded.
    Thanks.

    Hi,
    Thanks for your reply.
    It's throwing an error. Here is the log:
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
    Insert option in effect for this table: INSERT
    TRAILING NULLCOLS option in effect
    Column Name Position Len Term Encl Datatype
    HOST_NM FIRST 255 , O(") CHARACTER
    SERVICE_9071_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9310_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9700_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9701_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9710_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9711_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9712_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9713_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9720_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9721_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9730_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9731_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9750_DOLLAR NEXT 255 , O(") CHARACTER
    SERVICE_9751_DOLLAR NEXT 255 , O(") CHARACTER
    GRAND_TOTAL NEXT 255 , O(") CHARACTER
    value used for ROWS parameter changed from 64 to 62
    Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 2: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 3: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 4: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 5: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 6: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 7: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 8: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 9: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 10: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Record 11: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
    ORA-01722: invalid number
    Please help me with this.
    Thanks,
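    Two things stand out in the log above, offered as an educated guess rather than a confirmed fix. A NUMBER(20,20) column reserves all 20 digits for the fractional part, so any value of 1 or more (such as 122.46) cannot be stored in it; and ORA-01722 on only the last field is the classic symptom of a Windows CR/LF file being read on Unix, which leaves a carriage return attached to GRAND_TOTAL. A minimal sketch of the corresponding fixes (precision and scale are assumptions based on the sample data):
    -- give the numeric columns integer digits; repeat for each SERVICE_*_DOLLAR column
    ALTER TABLE FIT_UNIX_NT_SERVER_COSTS MODIFY ( GRAND_TOTAL NUMBER(20,2) );
    In the control file, the last field can strip a possible trailing carriage return:
    GRAND_TOTAL "TO_NUMBER( REPLACE( :GRAND_TOTAL, CHR(13) ) )"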

  • Error when trying to load data from ODS to CUBE

    hi,
    I am getting a short dump when trying to load data from ODS to CUBE. The runtime error is 'TYPELOAD_NEW_VERSION' and the short text is 'A newer version of data type "/BIC/AZODS_CA00" was found than the one required'. Please help me out.

    Hi,
    Check this thread; Ajeet Singh has given a good solution here:
    Re: Error With Data Load-Getting Canceled Right Away
    Also check SAP Note 382480 for your reference:
    Symptom
    A DART extraction job terminates with runtime error TYPELOAD_NEW_VERSION and error message:
    Data type "TXW_INDEX" was found in a newer version than required.
    The termination occurs in the ABAP/4 program "SAPLTXW2 " in "TXW_SEGMENT_RECORD_EXPORT".
    Additional key words
    RTXWCF01, LTXW2U01, TXW_INDEX
    Cause and prerequisites
    This problem seems to happen when several DART extraction jobs are running in parallel, and both jobs access table TXW_INDEX.
    Solution
    If possible, avoid running DART extractions in parallel.
    If you do plan to run such jobs in parallel, please consider the following points:
    In the DART Extract configuration, increase the value of the parameter "Maximum memory allocation for index (MB)" if possible. You can estimate reasonable values with the "File size worksheet" utility.
    Run parallel DART jobs on different application servers.
    As an alternative, please apply note 400195.
    It may help you.
    Regards,
    Debjani.......

  • Dump while loading data from POS DM to data source

    Hello experts
    I'm loading data from POS DM to datasource 0RT_PA_TRAN_CONTROL. When the amount of data is more than 1,800,000 records I get a dump with the message "Unable to fulfil request for 8004 bytes of memory space."
    How can I divide the data into smaller packets? Or can you suggest another solution?
    I already checked the ztta/max_memreq_MB parameter and it is set to the maximum.

    Check the parameter ztta/roll_area in RZ11; note that changes to this parameter affect all the datasources.
    The maximum recommended size of a data package is 100,000 records. If a delta load is big, divide it into smaller packages using the parameter MAXDPAKS.
    Check note 1292059 for more information.
    Also check the notes below:
    1102641 Consult: Dump during insert in PSA table due to incorrect values
    1163359 Load methods using SMQS or SAPI-controlled to transfer to BW
    1160555 Determination of maximum data packages in a delta request
    Points are appreciated.

  • Loading data from a remote machine

    Hi,
    I would like to know the process to load data from a file on a remote machine into an Oracle table. The load file is dumped to a folder on the remote machine and is pipe-delimited. I am using Oracle 11g.
    Can I create a directory in Oracle that maps to the remote location, and create an external table over it?
    Thanks in advance
    Arun

    The best method would depend on the requirements. Are files regularly created on that server? What are the file sizes? Are files to be kept after loading for history or auditing purposes? Etc.
    You can run SQL*Loader on the remote platform and load the data, across the network, into the database.
    However, if the data set is large, it may be faster to zip that data, scp it across to the Oracle server, and then unzip and load it there.
    If the remote platform is a user platform and the user needs to load data - then that can be done via a web interface. User uploads the file to Oracle (via http). The file is received as a CLOB by the database. PL/SQL code can then process that file - either by parsing it directly, or unloading the CLOB to a local file and then loading that file via an external table.
    You'll need to look at what the requirements are, before looking for a solution.
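    As an illustration of the SQL*Loader suggestion above, the tool can run on the remote machine and load across the network through a TNS alias. A minimal sketch for a pipe-delimited file; the connect string, file, table and column names are all hypothetical:
    sqlldr userid=scott/tiger@orcl control=load_pipe.ctl log=load_pipe.log
    -- load_pipe.ctl
    LOAD DATA
    INFILE 'data.txt'
    APPEND
    INTO TABLE target_table
    FIELDS TERMINATED BY '|'
    TRAILING NULLCOLS
    ( col1, col2, col3 )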

  • Loading data into a CLOB column

    I need to find out how to load about ten sentences of data into a CLOB column of a table in the database. I have a PL/SQL procedure that loads data from an XML file into various tables in the database. Recently we added a column (test_dummy) to one of the tables and defined it as a CLOB. There is a corresponding node (detail_info) in the XML file that maps to this column. I need to figure out how to incorporate this into the PL/SQL procedure so that the data in the XML file for the node (detail_info) is loaded into test_dummy. Any ideas?

    Take it one step at a time. Use the 'extract' function to extract an XML snippet from a given XML document. The question couldn't be more vague; maybe an example would help?
    Rahul
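    A minimal sketch of that suggestion, assuming the XML document arrives as a CLOB; the procedure, table and key column names are hypothetical, while detail_info and test_dummy come from the question:
    CREATE OR REPLACE PROCEDURE set_test_dummy( p_id IN NUMBER, p_xml IN CLOB ) IS
      l_text CLOB;
    BEGIN
      -- extract() pulls the detail_info node text out of the XML document
      l_text := XMLTYPE( p_xml ).extract( '//detail_info/text()' ).getClobVal();
      UPDATE some_table      -- hypothetical table holding the new CLOB column
         SET test_dummy = l_text
       WHERE id = p_id;
    END;
    /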

  • Special character issue while loading data from SAP HR through VDS

    Hello,
    We have a special character issue, while loading data from SAP HR to IdM, using a VDS and following the standard documentation: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e09fa547-f7c9-2b10-3d9e-da93fd15dca1?quicklink=index&overridelayout=true
    French accented characters (é, à, è, ù) are loaded correctly, but Turkish special ones (like Ş, İ, ł) are transformed into "#" in IdM.
    The question is: does anyone know of any special setting in the VDS or in IdM for special characters that would solve this issue?
    Our SAP HR version is ECC 6.0 (ABA/BASIS 7.0 SP21, SAP_HR 6.0 SP54) and we are using VDS 7.1 SP5 and SAP NW IdM 7.1 SP5 Patch 1 on Oracle 10.2.
    Thanks

    We are importing directly into the HR staging area, using the transactions/programs "HRLDAP_MAP", "LDAP" and "/RPLDAP_EXTRACT"; then we have a job which extracts data from the staging area to a CSV file.
    So before the import, the character appears correctly in SAP HR, but by the time it comes through the VDS to the IDM's temporary table, it becomes "#".
    Yes, our data is coming from a Unicode system.
    So, could it be a Java parameter to change or add in the VDS?
    Regards.

  • Unable to load data from an ODS to a Cube

    Hi all,
    I am trying to load data from an ODS to a cube, and I'm getting the following message at the bottom of the monitor screen: "No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube" Message no. RSBM043.
    I am unable to load the data. The QM status is yellow. When the process starts, on the left-hand side of the monitor it shows: "0 of 84408 records", which is OK. I have loaded data with this package before and it worked well.
    Can you tell me what I have to do? I tried waiting but it was the same: no progress and no data in the cube.
    Thank you very much and kind regards,
    MM.
    Maybe this helps...
    When I look at the status, it says that I have a short dump in BW. It was CALL_FUNCTION_REMOTE_ERROR and the short text is "The function module "SUBST_TRANSFER_UPGEVAL" cannot be used for 'remote'". This short dump occurs well before I start the upload.
    Thanks again.

    Hello MM,
    Can you do the following..
    Make the total status red, then delete the request from the cube.
    Go to the ODS -> Remove the Data Mart Status -> Try loading it again.
    The error message that you get is common when we try to run an InfoPackage and the source object has no new request (meaning all the requests available in the source have already been moved to the targets). So please try the steps given above.
    Hope it will help!

  • To load data from a cube in SCM(APO) system to a cube in BI system.

    Experts,
    Please let me know whether it is possible to load data from a cube in an SCM (APO) system to a cube in a BI system. If so, please explain the steps to perform.
    Thanks,
    Meera

    Hi,
    Think of it this way: to load data from any source we need a datasource. You can generate an export datasource for the cube in APO, then use that datasource for BW extraction; try it like this. I think it will work; in my case I'm directly loading data from APO to BW using the DataSources that are generated on the Planning Area.
    Why do you need to take data from the APO cube? Is there any condition for that? If it is not mandatory, you can use the same datasource and load the data to BW. If any conditions are applied while loading the data from APO into the APO cube, check whether the same is possible in BW or not. If it is possible, use the DataSource and do the same calculation directly in BW.
    Thanks
    Reddy

  • ERROR: "Exceptions in Substep: Rules" while loading data from DSO to CUBE

    Hi All,
    I have a scenario like this.
    I was loading data from an ODS to a cube daily. Everything was working fine until 2 days ago. I added some characteristic fields to my cube while making no change to my key figures. When I then restarted loading data from the ODS to the cube, I got the error "Exceptions in Substep: Rules" after some packages of data had loaded successfully.
    I deleted the added fields from the InfoCube and then tried to reload the data again, but I am facing the same error: "Exceptions in Substep: Rules".
    Can anyone tell me how to get out of this situation?
    Regards,
    Komik Shah

    Dear,
    Check some related SDN posts:
    zDSo Loading Problem
    Re: Regarding Data load in cube
    Re: Regarding Data load in cube
    Re: Error : Exceptions in Substep: Rules
    Regards,
    Syed Hussain.

  • Unable to load data from DSO to Cube

    Good morning all,
    I was trying to load data from a DSO to a cube for validation. Before loading the new data, I deleted everything from the DSO and the cube; they contain no requests at all. The cube uses "Delta Update". First, the DSO was loaded with 138,300 records successfully. Then I activated the DSO. Finally, when I clicked Execute (DSO --> Cube), it loaded 0 records. I was able to load the data yesterday. What might be the reasons for this situation?
    Thank you so much!

    Hi BI User,
    For a delta upload into the data target, there must first be an initialization request in the data target with the same selection criteria.
    So first do the initialization, then perform the delta upload into the cube.
    Regards,
    Subhashini.

  • Error when loading data from DSO to Cube

    Hi,
    After upgrading to 2004s we get the following error message when trying to load data from a DSO to a Cube:
    Unexpected error: RSDRC_CONVERT_RANGES_TO_SELDR
    Has anyone experienced a similar problem or have some guidelines on how to resolve it?
    Kind regards,
    Fredrik

    Hi Martin,
    Thanks!
    Problem solved
    //Fredrik
