Data load help

Hi all,
I have a CSV file and I need to handle its data in PL/SQL. How do I select the part of a string before the ',' and place it into different columns of a table, so that I can handle each value individually? I know this can be done with SQL*Loader, but I need to do it without using it.
Can anyone please suggest ways/code to do this?
Many thanks

Perhaps EXTERNAL TABLES:
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14220/schema.htm#i36180
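
A minimal sketch of the external-table approach, assuming a hypothetical directory path, file name, and three-column layout (adjust all of these to the actual file):

CREATE DIRECTORY csv_dir AS '/data/incoming';

CREATE TABLE csv_ext (
  col1 VARCHAR2(100),
  col2 VARCHAR2(100),
  col3 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('data.csv')
);

-- The file can then be queried like any table (target_table is hypothetical):
-- INSERT INTO target_table SELECT col1, col2, col3 FROM csv_ext;

If you really must split the line inside PL/SQL instead, INSTR and SUBSTR give you the part before the first comma:

DECLARE
  l_line  VARCHAR2(4000) := 'abc,def,ghi';  -- sample input line
  l_first VARCHAR2(4000);
BEGIN
  l_first := SUBSTR(l_line, 1, INSTR(l_line, ',') - 1);  -- => 'abc'
END;
/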

Similar Messages

  • Data loading help

    Hello Gurus,
    I have application 11, 12 and 13 datasources feeding an InfoCube; the application 11 and 12 datasources also feed separate ODSs.
    The data for applications 11 and 12 is not correct in header and item for the months Jan-March 06, but it is loaded correctly from April onwards, so I want to correct this data.
    What are my options?
    1) Should I do a complete fresh load (filling the setup tables) for applications 11 and 12 (as both of these have problems), or will this action affect data consistency? Should I also do a fresh load for application 13?
    2) I am loading data through the PSA, so is there a way I can delete the selective request and load it from the source system (as the data is defective in the PSA as well)? Or is there a way to fill the setup tables only for orders created in the period from Jan to March?
    Your help is highly appreciated.
    I will assign full points to correct answers and points to all the answers.
    Krish Kumar
    [email protected]

    Hi,
    If you perform a selective load (InfoPackage with full update but with selection criteria), your delta will still work.
    If you do a full pull without any selection criteria, you'll need to do that as a repair request.
    I suggest you try the first scenario in dev or QA before executing it in production, so that you'll have proof of concept.
    Cheers,
    Kedar

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading - the "Delete data from table" option is not working, please help, anyone from this forum

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination such as a DB table using the DB Connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the "Delete data from table" option:
    16 records were loaded from BW to HANA, the same count on both sides.
    The second time I executed it from BW to HANA, 32 records arrived (it appends).
    Then I executed the Open Hub service with the "Delete data from table" option checked,
    and now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    If I check SAP BW system to SAP BW system, it works fine.
    Is this option supported through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level - DESTINATION tab and FIELD DEFINITION):
    there is a checkbox there which I have already selected; that is exactly my issue - even though it is selected,
    the deletion is not performed at the target level.
    SAP BW to SAP HANA via DB connection:
    1. First DTP execution from BW: suppose 16 records are loaded up to HANA - 16 on both sides.
    2. Second execution from BW: the HANA side is appended, i.e. 16+16 = 32.
    3. So I selected the checkbox at Open Hub level, "Delete data from table".
    4. Now executing the DTP throws a short dump - DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the "Delete data from table" option applicable for HANA at all?
    Thanks
    Santhosh Kumar

  • EIS Member and Data Load - Getting OS Error - Please help!

    Please help! I have created an OLAP model and then created a Metaoutline.
    Then I went ahead to do the member and data load. I logged into my server and started the member and data load.
    It gives me the following error:
    SELECT /*+ */ .. FROM <my_view_name>
    OS Error: No such file or directory. IS Error: Member load terminated with error.
    The load terminated with errors.
    Thanks in advance for any replies.

    Thanks all! The error has been resolved.
    I just had to create a directory in the Integration Services folder: $ISHOME/loadinfo
    The loadinfo folder was missing.
    Prathap,
    Is that view available at that time? The query is generated automatically.
    Which data source and which version of Hyperion? The data source is Oracle 10g and the Hyperion version is 9.3.

  • Need help on: Automation of Daily Data Load

    Hi all,
    We currently have to start our daily data load from DAC manually, so my client has asked us to automate the daily data load.
    Starting the daily data load manually (DAC): first we have to check whether the ASCP plans have been updated or not.
    Right now we check whether the plans were updated using the following query:
    SELECT LTRIM(RTRIM(compile_designator)),
           data_completion_date,
           TO_CHAR(data_completion_date, 'DD-MON-YYYY HH24:MI:SS')
    FROM apps.msc_plans
    WHERE LTRIM(RTRIM(compile_designator)) IN ('Plan01','Plan02','Plan03','Plan04')
    ORDER BY 2 DESC
    From this query we can see whether all the plans were updated. Of the four plans, two are updated as of sysdate (mm/dd/yyyy) with a timestamp (hh:mm:ss) (for example Plan01 08/25/2011 11:20:08 PM, Plan02 08/25/2011 11:45:06 PM) and the other two are updated on sysdate+1 (for example Plan03 08/26/2011 12:20:05 AM, Plan04 08/26/2011 12:45:08 AM).
    After checking the plans, we start the daily load in DAC manually.
    How should I convert the above SQL query, which I use to check whether the plans were updated, into Informatica, so as to automate the daily load at the Informatica level?
    Need help.

    You cannot replicate what is done with DAC at the Informatica level. DAC is a separate Oracle product that orchestrates and manages the ETL load (including index management, etc.). The reason Oracle developed DAC is that it allows you to manage a large-scale DW load for a large ERP system. As suggested, you can invoke the DAC execution plan via a command, but you cannot replicate everything DAC does at the Informatica level. If this helps, please mark as helpful.
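
    As a hedged sketch of the readiness check itself (not of the DAC invocation): the query above can be wrapped in a PL/SQL function that a scheduler polls before kicking off the DAC command line. The plan names and apps.msc_plans come from the original query; the function name and cutoff logic are illustrative assumptions, and the sketch assumes each plan has a row in msc_plans.
    CREATE OR REPLACE FUNCTION plans_ready (p_cutoff IN DATE)
      RETURN VARCHAR2
    IS
      l_pending NUMBER;
    BEGIN
      -- Count plans whose completion date is missing or older than the cutoff.
      SELECT COUNT(*)
        INTO l_pending
        FROM apps.msc_plans
       WHERE LTRIM(RTRIM(compile_designator)) IN ('Plan01','Plan02','Plan03','Plan04')
         AND (data_completion_date IS NULL OR data_completion_date < p_cutoff);
      -- 'READY' once all four plans completed on or after the cutoff.
      RETURN CASE WHEN l_pending = 0 THEN 'READY' ELSE 'WAIT' END;
    END plans_ready;
    /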

  • Help required in data loads

    Hi all,
    We have installed a patch in the R/3 system, so now we are testing the BW 3.5 system to check whether the full and delta loads still work.
    Full loads have no problem, but a few delta loads do. What we did is:
    1) Re-initialize the delta loads.
    2) Create/change some records in the R/3 system.
    3) Schedule delta loads in BW.
    But 0 records arrive in BW even though the records are present in R/3.
    Kindly help.

    Hi
    You mentioned that no records are moved to BW.
    1) Did you check on the R/3 side whether the data is in RSA7 or not?
    2) If not, check SM13 (V3) and LBWQ (queued delta). Activate the datasource.
    3) Replicate the datasource in BW.
    4) Run the program RS_TRANSTRU_ACTIVATE_ALL in SE38.
    5) Perform the data load.
    You should get data this way.
    I think this will help you.
    Regards,
    Siva

  • Help Required regarding: Validation on Data Loading from Flat File

    Hi Experts,
    I need your help with the following issue.
    I need to validate the transactional data being loaded into the GL cube from a flat file:
    1) A transactional record should be loaded into the cube only if a master data record exists for the "0GL_ACCOUNT" InfoObject.
    2) If the master data record does not exist, the record should be skipped from the load, and after the load the system should issue a message saying how many records were skipped (if any).
    I would really appreciate your help and suggestions on solving this issue.
    Regards
    Hari

    Hi, write a start routine in the transfer rules like this:
      DATA: l_s_datapak_line TYPE transfer_structure,
            l_s_errorlog     TYPE rssm_s_errorlog_int,
            l_s_glaccount    TYPE /bi0/pglaccount,
            new_datapak      TYPE tab_transtru.
      REFRESH new_datapak.
      LOOP AT datapak INTO l_s_datapak_line.
        " Keep the record only if an active master data row exists for it.
        SELECT SINGLE * FROM /bi0/pglaccount INTO l_s_glaccount
          WHERE chrt_accts = l_s_datapak_line-<field name in transfer structure/datasource for CHRT_ACCTS>
            AND gl_account = l_s_datapak_line-<field name in transfer structure/datasource for GL_ACCOUNT>
            AND objvers    = 'A'.
        IF sy-subrc EQ 0.
          APPEND l_s_datapak_line TO new_datapak.
        ENDIF.
      ENDLOOP.
      datapak = new_datapak.
      IF datapak[] IS INITIAL.
        " ABORT <> 0 means skip the whole data package!
        ABORT = 4.
      ELSE.
        ABORT = 0.
      ENDIF.
    I have already made some modifications, but you can adjust it slightly to suit your needs.
    regards
    Emil

  • GL Account data load - struggling, please help

    Hi all,
    I am mapping the standard InfoObject 0GL_ACCOUNT to an R/3 field of type HKONT. When I try to activate the data loaded in the ODS, I get an error saying no SID exists for '   0003223311'. When I check the contents of the ODS, I can see the GL account '0003223311'. In the R/3 table (data source), I see the GL account as '0003223311' when I select "Standard list" in the user parameters. However, when I tick the option 'Check conversion exits', I see the GL account as '3223311   '.
    Please help me resolve this issue.
    Thanks all

    When I look at the cost center SID table, I see that the key of the table is a combination of cost center and controlling area. Does that mean the corresponding SIDs are for the combination value? For example, in the cost center table I have an SID of 'xxxx' for the combination of cost center '9999' and CO area 'abc'.
    Now I have an ODS in which only the cost center (9999) is filled, so when I try to activate the data in the ODS it tells me 'SID not found for cost center 9999'. Is that because the key of the SID table is a combination of cost center and CO area? Do I need to have both populated in my ODS?
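
    On the original 0GL_ACCOUNT error: the mismatch between '3223311   ' and '   0003223311' is exactly what the ALPHA conversion exit reconciles. As a hedged SQL illustration of the padding it applies on input (the 10-character length is an assumption based on the values shown):
    -- The external value is left-padded with zeros to the full field length,
    -- which is the internal format the SID table expects.
    SELECT LPAD('3223311', 10, '0') AS internal_value FROM dual;  -- 0003223311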

  • Help with data load model

    Hi,
    I need help with a data load model. First, I'm doing delta extraction from R/3; we load data through one InfoSource into InfoCube A and InfoCube B.
    I'm doing master data validation on the load, so if a load fails for InfoCube A, it fails for InfoCube B too (this is because I can have only one InfoPackage for the two InfoCubes, because of the delta update).
    So I propose a new model in which:
    - The delta load goes first into an ODS.
    - The ODS is cleaned before the delta update.
    - Then I create two InfoPackages for full loads, from the ODS to InfoCube A and from the ODS to InfoCube B.
    With this solution I can have two InfoPackages from the ODS, because I'm not doing a delta load from there to the cubes, and with two InfoPackages I can have independent validations for each cube, so if one of them fails, the other can still be loaded successfully.
    The solution fails because if I load delta from R/3 into the ODS, I can't clean it first: the initialization and the old updates need to be previously loaded in the ODS. Then I can't do full loads to the cubes, nor have two InfoPackages.
    Please help me solve this issue.
    Thanks a lot

    Hi Jeremy,
    What about this simple solution:
    Load data by delta from R/3 into your ODS. You can also have an ODS/cube for the historical data, which is more space-saving than holding all the old data in the PSA; you then load your historical data from the PSA into the historical ODS/cube.
    From the ODS with the current data, you update your requests by full load into the cubes with two different full InfoPackages. But because you load by full update, you have to use deletion selections in the InfoPackages to avoid duplicate data!
    regards,
    Jürgen

  • Urgent help required - ASO data loading

    I am working on 9.3.0 and want to do incremental data loading so that it won't affect my past data. I am still not sure whether it is doable or not. The question is:
    do I need to design a new aggregation after loading data?
    Thanks in advance

    Hi,
    The ASO cube will clear off all the data if you make certain structural changes to the cube (i.e. if you change your outline; you can find out what exactly clears the data - for example, adding a new member clears it, whereas just adding a simple formula to an existing member might not).
    If you don't want to affect the past data and yet want to load incrementally, then make sure all the members are already in the outline (take care of the time dimension - let all the time members be in place), and then you can load using the 'add to existing values' option.
    But remember, you can only do this without structural changes; otherwise you need to load everything again.
    It is good to design an aggregation, as it helps retrieval performance, but it is not mandatory.
    Sandeep Reddy Enti
    HCC

  • Need Help with Spatial Data Load - MapViewer

    Hello everybody,
    I need immediate help with a spatial data load. I installed the Oracle MapViewer quick start and tried to work with it, but I could not get past the load step. My questions are:
    1- Where can I find and download my country's data set (spatial data)?
    2- With MapViewer, how can I load spatial data into my tables in the database (Oracle)? Those tables have SDO_GEOMETRY columns and I want to query location data, but I could not load it.
    Regards,
    Dilek

    For MapViewer questions, please post in the following forum:
    MapViewer
    Thanks
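
    For the second question, rows can also be loaded into an SDO_GEOMETRY column with plain SQL; a minimal sketch, assuming a hypothetical my_locations table and WGS84 point data:
    INSERT INTO my_locations (id, name, geom)
    VALUES (1, 'Sample point',
            SDO_GEOMETRY(
              2001,                               -- 2D point
              4326,                               -- SRID: WGS84 longitude/latitude
              SDO_POINT_TYPE(28.98, 41.01, NULL), -- x (lon), y (lat), z
              NULL, NULL));                       -- elem_info/ordinates unused for points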

  • Need help in deleting the data load request

    Hi All,
    In the system one erroneous data load request has been generated (whose status is "monitor without request"), and because of it I am unable to activate other requests.
    I want to delete this request, but deleting it does not work - it still remains in the system, and I am unable to activate it as it is erroneous. I have deleted the request from the backend table RSMONICDP as well, but still no success.
    Please give your suggestions on this.
    Thanks,
    Vinay

    Hi
    There is a possible solution for this kind of issue:
    1. Note the bad request number.
    2. Go to SE16.
    3. Open table RSODSACTREQ. Filter the contents by that request number; there will be one entry for the request. Delete that entry.
    4. Open table RSICCONT. There will be one entry for the request. Delete that entry.
    5. Open table RSMONICDP. Filter the contents by that request number; there will be 3-4 entries for the request, depending on the number of data packages in it. Delete all those entries.

  • Is the Data load option in 4.1 really helpful?

    Hi,
    While trying to create a data load page in my application, I was facing "no data found" issues. I was able to resolve the issue at the time, but it raised another question, which I thought of posting separately rather than in an existing thread.
    I was trying to load data into a DB table from a .csv file, and at the data validation page/step it would throw a "no data found" error. I then noticed that on the second data/table mapping page, where it shows all the rows, there is a "Column Name" list box at the top whose values default to "Do not load", and it contains all the column names.
    So, just to confirm: do we need to select all the columns manually every time? In the very first step I selected the option saying the column names are in the first row of the csv file. We have quite a few columns, and this would be tedious for the business to do manually. Shouldn't the application just use the csv first-row headers as the column names?
    How do you guys work around this?
    Thanks,
    Sun

    Hi VC,
    It does take the column values from the first row. The problem was that a column name had a trailing space, hence the issue. I was able to fix that.
    Thanks,
    Sun

  • Need help with rolling back data if there is an error in the data load

    Hi All,
    We are trying to load data into an Oracle 11g database. We want a trigger, procedure, or something similar to roll back the data if there are errors in the load. Is it possible to roll back after all the records have been parsed? So if we load 100 records and 30 of them have errors, we want to roll back after all 100 records are parsed.
    Please advise.

    >
    Thanks for the suggestion. I'll try that option. Currently we only load data that is validated, and erroneous records are rejected using a trigger, so we don't get any invalid data in the table. But now the users are saying that all the records should be rejected if there is even one error in the data load.
    >
    I generally use a much simpler solution for such multi-stage ETL processes.
    Each table has an IS_VALID column that defaults to 'N'. Each step of the process only pulls data with the flag set to 'Y'.
    That allows me to leave data in the table but guarantee that it won't be processed by subsequent stages. Since most queries that move data from one stage to another ultimately have to read table rows (i.e. they can't just use indexes), it is not a performance issue to add a predicate such as "AND IS_VALID = 'Y'" to the query that accesses the data.
    1. Add new unvalidated data - automatically flagged as invalid by the default of 'N' on the IS_VALID column.
    2. Run audit step #1 - capture the primary key/rowid of any row failing the audit.
    3. Run audit steps #2 through #n - capture error row ids.
    4. Final step - update the data, setting IS_VALID to 'Y' only if a row passes ALL audits: that is, only if there are NO fatal errors for that row captured in the error table.
    That process also lets me capture every single problem any row has, so that I can produce reports for the business users showing everything that is wrong with the data. There are some problems the business wants to ignore, others that can be fixed in the staging tables and reprocessed, and others that are rejected because they must be fixed in the source system, which can take several days.
    For data that can be fixed in the staging tables, the data is fixed and the audit is rerun, which sets the IS_VALID flag to 'Y' and allows those 'fixed' rows to be included in the data that feeds the next processing stage.
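
    A minimal SQL sketch of the IS_VALID pattern described above; the staging table, error table, and audit rule are illustrative assumptions:
    CREATE TABLE stg_orders (
      order_id NUMBER PRIMARY KEY,
      amount   NUMBER,
      is_valid CHAR(1) DEFAULT 'N'   -- new rows start unvalidated
    );
    CREATE TABLE stg_errors (
      order_id  NUMBER,
      error_msg VARCHAR2(200)
    );
    -- Audit step: capture the key of every row failing a rule.
    INSERT INTO stg_errors (order_id, error_msg)
    SELECT order_id, 'Amount must be positive'
      FROM stg_orders
     WHERE amount IS NULL OR amount <= 0;
    -- Final step: mark a row valid only if no audit captured an error for it.
    UPDATE stg_orders s
       SET s.is_valid = 'Y'
     WHERE NOT EXISTS (SELECT 1 FROM stg_errors e WHERE e.order_id = s.order_id);
    -- Subsequent stages then read only validated rows:
    -- SELECT ... FROM stg_orders WHERE is_valid = 'Y';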

  • Data load problem - BW and Source System on the same AS

    Hi experts,
    I'm starting with BW (7.0) in a sandbox environment where BW and the source system are installed on the same server (same AS). The source system is SRM (Supplier Relationship Management) 5.0.
    BW runs on client 001 while SRM is on client 100, and I want to load data from SRM into BW.
    I've configured the RFC connections and the BWREMOTE users with their corresponding profiles in both clients, added an SAP source system (named SRMCLNT100), installed the SRM Business Content, replicated the data sources from this source system, and everything worked fine.
    Now I want to load data from SRM (client 100) into BW (client 001) using standard data sources and extractors. To do this, I created an InfoPackage for one standard data source (with data, verified through RSA3 on client 100, the source system). I started the data load process, but the monitor says that "no IDocs arrived from the source system" and keeps the status yellow forever.
    Additional information:
    BW Monitor Status:
    Request still running
    Diagnosis
    No errors could be found. The current process has probably not finished yet.
    System Response
    The ALE inbox of the SAP BW is identical to the ALE outbox of the source system,
    and/or the maximum wait time for this request has not yet run out,
    and/or the batch job in the source system has not yet ended.
    Current status
    No IDocs arrived from the source system.
    BW Monitor Details:
    0 from 0 records - but there are 2 records in RSA3 for this data source
    Overall status: Missing messages or warnings
    - Requests (messages): Everything OK
      - Data request arranged
      - Confirmed with: OK
    - Extraction (messages): Missing messages
      - Missing message: Request received
      - Missing message: Number of sent records
      - Missing message: Selection completed
    - Transfer (IDocs and TRFC): Missing messages or warnings
      - Request IDoc: sent, not arrived; data passed to port OK
    - Processing (data packet): No data
    Transactional RFC (SM58):
    Function module: IDOC_INBOUND_ASYNCHRONOUS
    Target system: SRMCLNT100
    Date/time: 08.03.2006 14:55:56
    Status text: No service for system SAPSRM, client 001 in Integration Directory
    Transaction ID: C8C415C718DC440F1AAC064E
    Host: srm
    Program: SAPMSSY1
    Client: 001
    Rpts: 0000
    System Log (SM21):
    14:55:56 DIA 000 100 BWREMOTE D0 1 Transaction Canceled IDOC_ADAPTER 601 ( SAPSRM 001 )
    Documentation for system log message D0 1:
    The transaction has been terminated. This may be caused by a termination message from the application (MESSAGE Axxx) or by an error detected by the SAP System due to which it makes no sense to proceed with the transaction. The actual reason for the termination is indicated by the T100 message and the parameters.
    Additional documentation for message IDOC_ADAPTER 601, "No service for system &1, client &2 in Integration Directory": no documentation exists for message ID 601.
    RFC Destinations (SM59):
    Both RFC destinations look fine, with connection and authorization tests successful.
    RFC Users (SU01):
    BW: BWREMOTE with profile S_BI-WHM_RFC (plus SAP_ALL and SAP_NEW temporarily)
    Source system: BWREMOTE with profile S_BI-WX_RFCA (plus SAP_ALL and SAP_NEW temporarily)
    Can someone help?
    Thanks,
    Guilherme

    Guilherme,
    I don't see any reason why it's not bringing the data. Are you doing a full extraction or a delta? If delta extraction, please check whether the extractor is delta-enabled or not; sometimes this can cause problems.
    Also check this weblog on basic checks for data load errors; it may help:
    /people/siegfried.szameitat/blog/2005/07/28/data-load-errors--basic-checks
    Thanks
    Sat
