Importing Azure Table Data to Hive

Hi All,
I have created a table in Azure Table storage and can see it in Azure Storage Explorer under the Tables section.
I want to import this table's data into Hive.
For blob/CSV/text files we use the syntax:
CREATE EXTERNAL TABLE <tablename>(col1  string , col2 string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
LOCATION '/xxxx/xxxx/filename';
What's the syntax for importing an Azure table into Hive?
Please help.
Sriman N Vangala

Hi Sriman,
Unfortunately, Microsoft does not have an Azure Table storage handler for HDInsight. It is on the roadmap, but there are no specific dates for when it will be publicly available. One of our product team members has a blog on how to achieve this using a connector available from GitHub. Please refer to that blog here:
http://blogs.msdn.com/b/mostlytrue/archive/2014/04/04/analyzing-azure-table-storage-data-with-hdinsight.aspx
The setup process in the blog is not very straightforward and requires some extra configuration steps, as there is no out-of-the-box solution at present. Also, the ATS Hive connector is a community-driven project, not part of HDInsight, which means it is not officially supported if you run into any problems with it later on.
An easier alternative is to export the data from Azure Tables to CSV in your storage account and process that normally; a lot of folks who have built service telemetry pipelines do exactly that.
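As a sketch, once the entity data has been written out as CSV under a folder in your storage account (the folder and column names below are made up; every Azure table row also carries PartitionKey and RowKey), the same external-table pattern from your question applies:
CREATE EXTERNAL TABLE AzureTableDump (PartitionKey STRING, RowKey STRING, Col1 STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
LOCATION '/exported/tabledump/';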
Hope this helps.
DebarchanS - MSFT

Similar Messages

  • Export & import internal table data across programs

    Hi All,
    I have two different reports, let's say ZPRO1 & ZPRO2. I want to export internal table data from ZPRO1 to ZPRO2 and then do some calculations in ZPRO2 with the imported internal table data (from ZPRO1).
    If I use SUBMIT or CALL TRANSACTION, ZPRO1's screen is displayed, but I only want to use ZPRO1's internal table data.
    Pls advise.
    Pranitha.

    Hi,
      Please follow this simple example and try to solve your issue.
    Code in program1
    types: begin of ty_itab,
           matnr type matnr,
           end of ty_itab.
    data: it_itab type table of ty_itab,
          wa_itab type ty_itab,
          it_itab1 type table of ty_itab.
    select matnr from mara into table it_itab.
    export it_itab to shared buffer indx(st) id 'ABC'.
    Code in program2
    types: begin of ty_itab,
           matnr type matnr,
           end of ty_itab.
    data: it_itab type table of ty_itab,
          wa_itab type ty_itab.
    import it_itab from shared buffer indx(st) id 'ABC'.
    loop at it_itab into wa_itab.
    write:/ wa_itab-matnr.
    endloop.
      Please delete the shared buffer entry (DELETE FROM SHARED BUFFER indx(st) ID 'ABC'.) once the internal table data is no longer needed.
    Regards
    Dande

  • Importing internal table data from FM to eCATT test script

    Hi all,
    I am working on a Workflow project where I need to post invoices related to purchase orders sent as scanned images. In case all the data in the incoming (scanned) invoice is correct, I am using BAPI_INCOMING_INVOICE_CREATE to post the invoice.
    But when the data is wrong, I need to give the user the MIRO transaction pre-populated with the invoice details, so that he can manually correct them (after verification) and post the invoice.
    I understand MIRO is an ENJOY transaction, hence BDC doesn't work properly (I have tried all kinds of recording options, and the problem comes up with multiple POs), so I chose to use eCATT.
    The recording works fine, but my real problem is that the PO data (multiple PO details for the same invoice) is in table form. In all my previous eCATT objects we used to upload a file, hence no problem importing tables.
    But now, as I'll call this test script from a function module which has all the POs in the form of an internal table, I need to import this into my test script.
    Please can anybody tell me how to proceed in this case, or suggest a better way to record MIRO.
    Thank you,
    Lakshmi Narayana.S

    Hello Raj,
    First of all thank you for your inputs.
    Yes, I know about parking an invoice; I am even using it in some scenarios.
    If everything on the invoice matches the values in SAP, then I post it using BAPI_INCOMING_INVOICE_CREATE as well.
    But for some scenarios, e.g. if the PO on the invoice does not match the actual PO, I need to create a work item in the Accounts Payable user's inbox, and he will change the invoice data manually (after getting confirmation and correction) and post it using MIRO.
    Here I need to pre-populate MIRO with the invoice data so that he can correct and post it.
    Hence I was trying to write a BDC, and as MIRO is an Enjoy transaction I thought writing a BDC was not a good idea (I tried writing one but had problems), so I am using eCATT.
    Hope I am clear about my problem. Any inputs to solve this are welcome.
    Thanks,
    Lakshmi Narayana.S

  • Importing partitioned table data into a non-partitioned table

    Hi Friends,
    SOURCE SERVER
    OS: Linux
    Database Version: 10.2.0.2.0
    I have exported one partition of my partitioned table as below:
    expdp system/manager DIRECTORY=DIR4 DUMPFILE=mapping.dmp LOGFILE=mapping_exp.log TABLES=MAPPING.MAPPING:DATASET_NAP
    TARGET SERVER
    OS: Linux
    Database Version: 10.2.0.4.0
    Now when I import into the other server, I get the error below:
    Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 17 January, 2012 11:22:32
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "MAPPING"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
    Starting "MAPPING"."SYS_IMPORT_FULL_01":  MAPPING/******** DIRECTORY=DIR3 DUMPFILE=mapping.dmp LOGFILE=mapping_imp.log TABLE_EXISTS_ACTION=APPEND
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace 'MAPPING_ABC' does not exist
    Failing sql is:
    CREATE TABLE "MAPPING"."MAPPING" ("SAP_ID" NUMBER(38,0) NOT NULL ENABLE, "TG_ID" NUMBER(38,0) NOT NULL ENABLE, "TT_ID" NUMBER(38,0) NOT NULL ENABLE, "PARENT_CT_ID" NUMBER(38,0), "MAPPINGTIME" TIMESTAMP (6) WITH TIME ZONE NOT NULL ENABLE, "CLASS" NUMBER(38,0) NOT NULL ENABLE, "TYPE" NUMBER(38,0) NOT NULL ENABLE, "ID" NUMBER(38,0) NOT NULL ENABLE, "UREID"
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_TG_ID" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."PK_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_UREID" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_V2" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."IDX_PARENT_CT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39112: Dependent object type CONSTRAINT:"MAPPING"."CKC_SMAPPING_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"MAPPING"."PK_MAPPING_ITM" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_TG_ID" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."PK_MAPPING" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_UREID" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_V2" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_PARENT_CT" creation failed
    Processing object type TABLE_EXPORT/TABLE/COMMENT
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_MAPPING_MAPPING" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_MAPPING_CT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_TG" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type REF_CONSTRAINT:"MAPPING"."FK_TT" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_PART" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_TIME_T" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_DAY" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    ORA-39112: Dependent object type INDEX:"MAPPING"."X_BTMP" skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_TG_ID" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_V2_T" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."PK_MAPPING" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_PARENT_CT" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"MAPPING"."IDX_UREID" creation failed
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"MAPPING"."MAPPING" creation failed
    Job "MAPPING"."SYS_IMPORT_FULL_01" completed with 52 error(s) at 11:22:39Please help..!!
    Regards
    Umesh Gupta

    Yes, I have tried that option as well.
    But when I write one tablespace name in the REMAP_TABLESPACE clause, it gives an error for the second one, and if I include the 1st and 2nd tablespaces it gives an error for the 3rd one.
    The one option I know of is to write all the tablespace names in REMAP_TABLESPACE, but that is a lengthy process. Is there any other way possible?
    Regards
    Umesh

    AFAIK the option you have is what I recommend, though it is lengthy :-(
    Wait for some EXPERT and GURU reviews on this issue.
    Good luck ....
    --neeraj
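    For reference, REMAP_TABLESPACE can be given several times in one impdp call, so every source tablespace can be remapped in a single command rather than one at a time. A sketch based on the log above (the second source tablespace and the USERS target are only illustrative):
    impdp MAPPING/******** DIRECTORY=DIR3 DUMPFILE=mapping.dmp LOGFILE=mapping_imp.log TABLE_EXISTS_ACTION=APPEND REMAP_TABLESPACE=MAPPING_ABC:USERS REMAP_TABLESPACE=MAPPING_DEF:USERS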

  • Export and import table data

    Hi All,
    There is a table abc which has approximately 40 lakh (4 million) records, without partitions (and I have a dump of this data). Due to space constraints I want to compress the data and put it in an archive schema.
    My questions are:
    1) Can we import the data into another schema (say, an archive schema) in compressed mode? If so, could you please let me know the syntax for that.
    2) Can we import the data into some temporary table (say, temp) in the same schema?
    3) And if a BLOB column exists in the table, can we import the data without the BLOB column data (supposing the export dump has the BLOB data)?
    Is that possible, or should we export without the BLOB column and import that into the other table?
    Please help me out.
    Edited by: user11942774 on 4 Dec, 2011 10:33 PM

    Thanks Mahir,
    Could you please let me know the syntax, with examples if possible.
    Is it possible to drop a column, say the BLOB column, after the table is in compressed mode?
    Edited by: user11942774 on 4 Dec, 2011 10:53 PM
    Edited by: user11942774 on 4 Dec, 2011 10:54 PM
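    If the dump was taken with Data Pump, a minimal sketch for (1) and (2) could look like this (directory, dump file and schema names are assumptions; for (1) the archive copy of abc would be precreated with COMPRESS so the direct-path import stores it compressed, and REMAP_TABLE for (2) needs 11g or later):
    impdp system/password DIRECTORY=dump_dir DUMPFILE=abc.dmp TABLES=abc REMAP_SCHEMA=srcschema:archive TABLE_EXISTS_ACTION=APPEND
    impdp srcschema/password DIRECTORY=dump_dir DUMPFILE=abc.dmp TABLES=abc REMAP_TABLE=abc:temp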

  • Import of Table data only from Full export Dump

    I want to import data into specified tables of a database only. Is it possible to do this from a dump file which is a full export of another database? The database I want to import into is a mirror of the database whose export dump I want to use. Let me know if this is possible and, if yes, how I can do it.
    Thanking you in advance.

    Please check the following link for full instructions:
    http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/part1.htm#435787
    If on another release, the same documentation is available for it on OTN as well.
    Thanks!
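    With the original export/import utilities, a table-level import from a full dump is a one-liner of this shape (schema and table names are made up; IGNORE=Y keeps the import going when the mirror tables already exist):
    imp system/manager FILE=full.dmp FROMUSER=scott TOUSER=scott TABLES=(emp,dept) IGNORE=Y LOG=imp_tables.log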

  • How to Export/Import HANA table data.

    Hello,
    I am trying to export a HANA table.
    I tried to export the table by using HANA Studio on a client PC, but it does not work.
    These are the steps:
    HANA Studio
    <table> > Export > Select tables for export > Next >
    Error message: "Choose a location"
    My question is:
    Is the export possible, i.e. can a HANA database table be exported using HANA Studio on a client PC?
    Best regards,
    Hosoya

    Hi Hosoya,
    Which version of the HANA server/studio are you using? I checked in Revision 24 and was able to do the table export as follows:
    Quick Launch > Export > SAP HANA Studio > Tables > 'Select the server' > 'Select the table' > Export table options
    In the Export Table options screen we can specify either the server location ('Export Tables on Server') or the client location ('Export tables to current client').
    Note: The server and studio I use are on different machines.
    Regards, Rahul
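    For completeness, the same export can also be done in SQL on the server side; the statement below writes to the HANA server's file system, not the client PC, and the schema, table and path are placeholders:
    EXPORT "MYSCHEMA"."MYTABLE" AS CSV INTO '/usr/sap/HDB/export' WITH REPLACE;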

  • Migrating Azure Table storage content to an on-prem SQL database

    Hi,
    Is it possible to import Azure Table storage data into an on-prem SQL database?

    Hi
    You cannot do it directly from SQL or Azure Storage Explorer.
    But you can have a little application that extracts your Table storage data into a CSV file, like this:
    http://blogs.msdn.com/b/jmstall/archive/2012/08/03/converting-between-azure-tables-and-csv.aspx
    and then import the generated CSV file into your on-prem SQL database.
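    Once you have the CSV, a minimal T-SQL sketch of the load (table name, file path and options are illustrative):
    BULK INSERT dbo.MyTable
    FROM 'C:\data\tablestorage.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);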
    Regards
    Aram
    Aram Koukia

  • Regarding exporting and importing internal tables

    Hello Experts,
    I have two programs:
    1) Main program: it creates batch jobs via JOB_OPEN, SUBMIT and JOB_CLOSE, passing the subprogram to SUBMIT.
    In the subprogram I use EXPORT it TO MEMORY ID 'MID' to export the internal table data to ABAP memory.
    The data is processed in the subprogram and exported to memory; I need this data back in the main program (I am using IMPORT to get it, but it is not working).
    I use IMPORT it1 FROM MEMORY ID 'MID' to import the table data in the main program after the job completes (SUBMIT subprogram AND RETURN).
    The IMPORT does not get the data into the internal table.
    Can you please suggest something to solve this issue?
    Thank you.
    Regards,
    Anand.

    Hi,
    This is the code i am using.
    DO g_f_packets TIMES.
    * Start Immediately
           IF NOT p_imm IS INITIAL .
             g_flg_start = 'X'.
           ENDIF.
           g_f_jobname = 'KZDO_INHERIT'.
           g_f_jobno = g_f_jobno + '001'.
           CONCATENATE g_f_jobname g_f_strtdate g_f_jobno INTO g_f_jobname
                                                  SEPARATED BY '_'.
           CONDENSE g_f_jobname NO-GAPS.
           p_psize1 = p_psize1 + p_psize.
           p_psize2 = p_psize1 - p_psize + 1.
           IF p_psize2 IS INITIAL.
             p_psize2  = 1.
           ENDIF.
           g_f_spname = 'MID'.
           g_f_spid = g_f_spid + '001'.
           CONDENSE g_f_spid NO-GAPS.
           CONCATENATE g_f_spname  g_f_spid INTO g_f_spname.
           CONDENSE g_f_spname NO-GAPS.
     * (1) Create the job
           CALL FUNCTION 'JOB_OPEN'
             EXPORTING
               jobname          = g_f_jobname
             IMPORTING
               jobcount         = g_f_jobcount
             EXCEPTIONS
               cant_create_job  = 1
               invalid_job_data = 2
               jobname_missing  = 3
               OTHERS           = 4.
           IF sy-subrc <> 0.
             MESSAGE e469(9j) WITH g_f_jobname.
           ENDIF.
     * (2) Start the report under the job name
           SUBMIT (g_c_prog_kzdo)
                  WITH p_lgreg EQ p_lgreg
                  WITH s_grvsy IN s_grvsy
                  WITH s_prvsy IN s_prvsy
                  WITH s_prdat IN s_prdat
                  WITH s_datab IN s_datab
                  WITH p1      EQ p1
                  WITH p3      EQ p3
                  WITH p4      EQ p4
                  WITH p_mailid EQ g_f_mailid
                  WITH p_psize EQ p_psize
                  WITH p_psize1 EQ p_psize1
                  WITH p_psize2 EQ p_psize2
                  WITH spid     EQ g_f_spid
                  TO SAP-SPOOL WITHOUT SPOOL DYNPRO
                  VIA JOB g_f_jobname NUMBER g_f_jobcount AND RETURN.
     * (3) Close the job when it starts immediately
           IF NOT p_imm IS INITIAL.
             IF sy-index LE g_f_nojob.
               CALL FUNCTION 'JOB_CLOSE'
                 EXPORTING
                   jobcount             = g_f_jobcount
                   jobname              = g_f_jobname
                   strtimmed            = g_flg_start
                 EXCEPTIONS
                   cant_start_immediate = 1
                   invalid_startdate    = 2
                   jobname_missing      = 3
                   job_close_failed     = 4
                   job_nosteps          = 5
                   job_notex            = 6
                   lock_failed          = 7
                   OTHERS               = 8.
               gs_jobsts-jobcount = g_f_jobcount.
               gs_jobsts-jobname  = g_f_jobname.
               gs_jobsts-spname   = g_f_spname.
               APPEND gs_jobsts to gt_jobsts.
             ELSEIF sy-index GT g_f_nojob.
               CLEAR g_f_flg.
               DO.                         " Wait until any job completes
                 LOOP AT gt_jobsts into gs_jobsts.
                   CLEAR g_f_status.
                   CALL FUNCTION 'BP_JOB_STATUS_GET'
                     EXPORTING
                       JOBCOUNT                         = gs_jobsts-jobcount
                       JOBNAME                          = gs_jobsts-jobname
                    IMPORTING
                        STATUS                           = g_f_status.
    *            HAS_CHILD                        =
    *          EXCEPTIONS
    *            JOB_DOESNT_EXIST                 = 1
    *            UNKNOWN_ERROR                    = 2
    *            PARENT_CHILD_INCONSISTENCY       = 3
    *            OTHERS                           = 4
                   g_f_mid = gs_jobsts-spname.
                   IF g_f_status = 'F'.
                     IMPORT gt_final FROM MEMORY ID g_f_mid .
                     FREE MEMORY ID gs_jobsts-spname.
                     APPEND LINES OF gt_final to gt_final1.
                     REFRESH gt_prlist.
                     CALL FUNCTION 'JOB_CLOSE'
                       EXPORTING
                         jobcount             = g_f_jobcount
                         jobname              = g_f_jobname
                         strtimmed            = g_flg_start
                       EXCEPTIONS
                         cant_start_immediate = 1
                         invalid_startdate    = 2
                         jobname_missing      = 3
                         job_close_failed     = 4
                         job_nosteps          = 5
                         job_notex            = 6
                         lock_failed          = 7
                         OTHERS               = 8.
                     IF sy-subrc = 0.
                       g_f_flg = 'X'.
                       gs_jobsts1-jobcount = g_f_jobcount.
                       gs_jobsts1-jobname  = g_f_jobname.
                       gs_jobsts1-spname   = g_f_spname.
                       APPEND gs_jobsts1 TO gt_jobsts.
                       DELETE TABLE gt_jobsts FROM gs_jobsts.
                       EXIT.
                     ENDIF.
                   ENDIF.
                 ENDLOOP.
                 IF g_f_flg = 'X'.
                   CLEAR g_f_flg.
                   EXIT.
                 ENDIF.
               ENDDO.
             ENDIF.
           ENDIF.
           IF sy-subrc <> 0.
             MESSAGE e539(scpr) WITH g_f_jobname.
           ENDIF.
           COMMIT WORK .
         ENDDO.

  • Exporting Table data to delimited txt file

    I am trying to export table data to a delimited text file. I am running the following code from Tom Kyte's website on the server. When I run the procedure after creating the function, it does not create any .dat file for me.
    I have also checked the 'utl_file_dir' parameter in the database, and it is set to the correct path. Is there anything I am missing?
    Any suggestions/inputs would help.
    create or replace function dump_csv( p_query     in varchar2,
                                         p_separator in varchar2 default ',',
                                         p_dir       in varchar2,
                                         p_filename  in varchar2 )
    return number
    AUTHID CURRENT_USER
    is
        l_output      utl_file.file_type;
        l_theCursor   integer default dbms_sql.open_cursor;
        l_columnValue varchar2(2000);
        l_status      integer;
        l_colCnt      number default 0;
        l_separator   varchar2(10) default '';
        l_cnt         number default 0;
    begin
        l_output := utl_file.fopen( p_dir, p_filename, 'w' );
        dbms_sql.parse( l_theCursor, p_query, dbms_sql.native );
        -- define up to 255 output columns; ORA-01007 ("variable not in select list")
        -- tells us we have gone past the last column of the query
        for i in 1 .. 255 loop
            begin
                dbms_sql.define_column( l_theCursor, i, l_columnValue, 2000 );
                l_colCnt := i;
            exception
                when others then
                    if ( sqlcode = -1007 ) then
                        exit;
                    else
                        raise;
                    end if;
            end;
        end loop;
        dbms_sql.define_column( l_theCursor, 1, l_columnValue, 2000 );
        l_status := dbms_sql.execute( l_theCursor );
        -- fetch row by row, writing the columns separated by p_separator
        loop
            exit when ( dbms_sql.fetch_rows( l_theCursor ) <= 0 );
            l_separator := '';
            for i in 1 .. l_colCnt loop
                dbms_sql.column_value( l_theCursor, i, l_columnValue );
                utl_file.put( l_output, l_separator || l_columnValue );
                l_separator := p_separator;
            end loop;
            utl_file.new_line( l_output );
            l_cnt := l_cnt + 1;
        end loop;
        dbms_sql.close_cursor( l_theCursor );
        utl_file.fclose( l_output );
        return l_cnt;
    end dump_csv;
    /
    create or replace procedure test_dump_csv
    as
        l_rows number;
    begin
        l_rows := dump_csv( 'select * from doctype where rownum < 25',
                            ',', '/tmp', 'test.dat' );
    end;
    /

    Hi,
    It works for me...
    SQL> select dump_csv('select * from dba_users',',','/tmp','teste.txt') from dual;
    DUMP_CSV('SELECT*FROMDBA_USERS',',','/TMP','TESTE.TXT')
                                                       149
    SQL> show parameter utl_file_dir
    NAME                                 TYPE        VALUE
    utl_file_dir                         string      /tmp
    oracle@icaro:/tmp> ls -l teste.txt
    -rw-r--r--  1 rps users 394353 2006-07-31 17:40 teste.txt
    Just for information: I'm using SUSE LINUX 10 and Oracle 10g 10.1.0.2
    Cheers
    Message was edited by:
    Legatti

  • JDBC Lookup - Import table data from a different schema in same DB

    Hi XI Experts,
    We are facing an issue while importing a Database table into the external definition in PI 7.1.
    The details are as below:
    I have configured user 'A' in the PI communication channel to access the database, but the table that I want to access is present in schema 'B'. Due to this, I am unable to view the table that I have to import in the available list.
    In other words, I am trying to access a table present in a different schema in the same database. Please note that my user has been given all the required permissions to access the other schema. Even then, I am unable to access the table in the other schema.
    Kindly provide your valuable suggestions as to how I can import a table which is present in another schema in the same database.
    Regards,
    Subbu

    If you are using PI 7.1, then you can do a JDBC Lookup to import JDBC metadata (table structures from the DB). Configure a JDBC receiver communication channel where you specify a username and password that has permission to access both schema A and schema B of the database, and specify the database name in the connection string. Then you should be able to import from both schemas.
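    For example, if the database is Oracle, the channel's connection string just names the one database (host, port and SID below are placeholders), and the channel user needs SELECT granted on the schema B tables:
    jdbc:oracle:thin:@dbhost:1521:ORCL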
    Please refer these links
    SAP PI 7.1 Mapping Enhancements Series: Graphical Support for JDBC and RFC Lookups
    How to use JDBC Lookup in PI 7.1 ?

  • Excel issues with importing CSV or HTML table data from URL - Sharepoint? Office365?

    Greetings,
    We have a client who is having issues importing CSV or HTML table data, as one would using Excel's Web Query import, from a reporting application. As the error message provided by Excel is unhelpful, I'm reaching out to anyone who can help us begin to troubleshoot a problem affecting what is normally standard Excel functionality. I'd attach the error screenshot, but I can't because my account is not verified; needless to say, it says "Microsoft Excel cannot access the file https://www.avantalytics.com/reporting_handler?func=wquery&format=csv&logid=XXXX&key=MD5"
    where XXXX is a number and MD5 is an md5 code. The symptoms stated in the error message are:
    - The file name or path does not exist
    - The file is being used by another program
    - The workbook you are trying to save has the same name as a currently open workbook
    None of these symptoms are the case, naturally. The user encountered this with Excel 2010; she was then upgraded to Excel 2013 and is still experiencing the same issue. The output of this URL in a browser (IE, Chrome, Firefox) is CSV data for the affected user, so it is not a network connectivity issue. In our testing environment, using both Excel 2010 and 2013, this file is imported successfully, so we cannot replicate the problem. The main difference I can determine between our test environment and the end user's is that they have a SharePoint installation and appear to have Office 365 as well.
    So my question might more appropriately be for SharePoint or Office 365 folks, but I can't be sure they're the culprit. Given this, does anyone have any knowledge of issues which might cause this with SharePoint or Office 365 integrated with Excel, and/or suggestions for getting more information from Excel or Windows beyond this error message? I've added the domain name as a trusted publisher in IE, as I thought that might be the issue, but that hasn't solved anything. As you can see, it's already HTTPS and there is no authentication or login; the md5 key is the authentication. The certificate for the application endpoint is valid and registered via the GoDaddy CA.
    I'm at a loss and would love some suggestions on things to check/try.
    Thanks  -Ross

    Hi Ross,
    >> In our testing environment using both Excel 2010 and 2013 this file is imported successfully, so we cannot replicate.
    I suspect it is caused by the difference of web server security settings.
    KB: Error message when you use Web query to a secure Web page (HTTPS://) in Excel: "Unable to open"
    Hope it will help.
    By the way, this forum is mainly for discussing questions about Office development (VSTO, VBA, Apps for Office, etc.). For Office product feature questions, you could consider posting them on the Office IT Pro forum or the Microsoft Office Community.
    Regards,
    Jeffrey

  • To export and import Oracle 11g table data only

    Hi Gurus,
    I am just not sure of the procedure to follow to export just the table data, then truncate the table, make some changes (not table structure changes), and then import the same table data back into the relevant table.
    Could someone please help me with the steps involved.
    Thanks a lot in advance

    If you can use Data Pump, here are your commands:
    expdp table_owner/password directory=<your_directory> dumpfile=table_name.dmp tables=table_name content=data_only
    impdp table_owner/password directory=<your_directory> dumpfile=table_name.dmp tables=table_name table_exists_action=append
    Data Pump requires version 10.1 or later.
    Dean

  • Import table data difficulties

    I have been successfully importing tabular data into Dreamweaver MX for some time. I have recently changed the structure of the page I import to, but still want to import the same data into the page. However, now when I do it, the imported data does not go to the correct place. Usually I highlight last week's data, then import from a tab-delimited file, and the new data takes the place of the old, in the correct place. Now, however, it is being placed higher up the page than I want.
    The changes I made were to include some PHP include statements, but this shouldn't affect the way my data is imported, I don't think. Previously I would do this all in Design view, but with the PHP not being visible I can't see the items I want in this view, so I have had to change to Code view.
    Any help would be appreciated.

    On Mon, 10 Sep 2007 22:36:31 +0000 (UTC), "dznutz7"
    <[email protected]> wrote:
    >Hey I'm having the same problem. Not with php but I had a table that I was
    >just trying to update. Now when I import the tab delimited file it keeps going
    >to the wrong place on the page & I can't move it at all. Hopefully someone
    >knows how to fix this.
    Import into new blank page;
    Copy/paste into desired page.

  • Writing a stored procedure to import SQL Server table data into a Oracle table

    Hello,
    As a new DBA I have been tasked with writing a stored procedure to import SQL Server table data into an Oracle table. I have been given many suggestions on how to do it from the SQL Server side, but I need a stored procedure that runs it from the Oracle side. Suggestions/guidance on where to start would be greatly appreciated! Thank you!
    I started to write it based on what I have, but I know this is not correct :/
    # Here is the select statement for the data source in SQL Server...
    SELECT COMPANY
    ,CUSTOMER
    ,TRANS_TYPE
    ,INVOICE
    ,TRANS_DATE
    ,STATUS
    ,TRAN_AMT
    ,CREDIT_AMT
    ,APPLD_AMT
    ,ADJ_AMT
    ,TRANS_USER1
    ,PROCESS_LEVEL
    ,DESCRIPTION
    ,DUE_DATE
    ,OUR_DATE
    ,OUR_TIME
    ,PROCESS_FLAG
    ,ERROR_DESCRIPTION
      FROM data_source_table_name
    #It loads data into the table in Oracle....   
    Insert into oracle_destination_table_name (
    COMPANY,
    CUSTOMER,
    TRANS_TYPE,
    INVOICE,
    TRANS_DATE,
    STATUS,
    TRANS_AMT,
    CREDIT_AMT,
    APPLD_AMT,
    ADJ_AMT,
    TRANS_USER1,
    PROCESS_LEVEL,
    DESCRIPTION,
    DUE_DATE,
    OUR_DATE,
    OUR_TIME,
    PROCESS_FLAG,
    ERROR_DESCRIPTION)
    END;

    CREATE TABLE statements would have been better, as MS SQL Server and Oracle don't have the same data types.
    OUR_DATE and OUR_TIME will (most likely) be ONE column in Oracle.
    DATABASE LINK
    Personally, I'd just load the data over a database link:
    insert into oracle_destination_table_name ( <column list> )
    select ... <transform data here>
    from data_source_table@mssql_db_link
    As far as creating the database link from Oracle to MS-SQL, that is for somebody else to answer (most likely you'll need to use an ODBC driver).
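    Assuming such a link exists (called mssql_db_link here, as above), the load for the columns in the question reduces to one statement; note the source TRAN_AMT mapping to the target TRANS_AMT:
    insert into oracle_destination_table_name
      ( company, customer, trans_type, invoice, trans_date, status,
        trans_amt, credit_amt, appld_amt, adj_amt, trans_user1,
        process_level, description, due_date, our_date, our_time,
        process_flag, error_description )
    select company, customer, trans_type, invoice, trans_date, status,
           tran_amt, credit_amt, appld_amt, adj_amt, trans_user1,
           process_level, description, due_date, our_date, our_time,
           process_flag, error_description
    from   data_source_table_name@mssql_db_link;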
    EXTERNAL TABLE
    If the data from MS-SQL is in a CSV file, just use an external table.
    same concept:
    insert into oracle_destination_table_name ( <column list> )
    select ... <transform data here>
    from data_source_external_table
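    A minimal external table definition for such a CSV might look like this (the directory object, columns and file name are assumptions; in practice the full column list from the INSERT above would go here):
    create table data_source_external_table (
        company   varchar2(10),
        customer  varchar2(20),
        tran_amt  number
    )
    organization external (
        type oracle_loader
        default directory data_dir
        access parameters (
            records delimited by newline
            fields terminated by ','
        )
        location ('mssql_export.csv')
    );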
    MK

Maybe you are looking for

  • Security service error in OBIEE 11G LDAP configuration

    Hello I've recently set up some OBIEE 11G installations and they appear to work ok. I've more recently been using various guides on the internet to configure OBIEE 11G and Active Directory and can see the users and groups within Weblogic that belong

  • Droid 3 touch screen unresponsive

    I was wondering if there is an alternate way of removing an app other than using the android market. My droid 3's touch screen is unresponsive and I know that the app Go Locker is causing the problem.

  • ABAP program generation after tp import

    After a transport was imported successfully into production the program screen did not look the same as in non-production. To correct - the code was regenerated manually. I thought the tp import did this for you? Whan I look at the log I see: Generat

  • Updating MySQL using Spry

    I greatly enjoy using the Spry framework for the display of information, however, would like to use it full circle. Can anyone point me in the direction of updating a mysql table with information from a form when say for instance someone selects some

  • Including JavaScript in Custom Components

    What's the recommended way to include large amounts of javascript in a custom component? I'm working on a calendar component and have the javascript being output in a large writer.write(...), but that writer is outputing about 2,500 lines of javascri