Export and Import of Access DB tables in Apex

Hi,
I need help. I want to export an MS Access DB table into an Oracle 10gR2 database, but every time I get an error because my ODBC driver isn't compatible with my TNS service name. What can I do? I only want to export a single table, not the whole database.
If I export and import a table via an XE version, there are no errors and I can see the table in APEX.
...thanks for any help :)

Hi Timo,
If you open your table in Access and click "select all" (the empty grey box at the left of the header row), you can copy the entire table and paste it into APEX as though you were copying from Excel - the structure of the copied data is the same. The only limitation is that you are restricted to 32,000 characters.
Obviously, though, getting hold of, and setting up, the correct ODBC drivers is the proper long-term solution, but the above may solve your immediate problem.
Regards
Andy

Similar Messages

  • Export and import of data only, not table creation and data ????

    Hi brothers and sisters,
    Please, I have a question about the export and import of data in Oracle Forms.
    I have created two buttons. The export button's trigger looks like this:
    declare
      alrt number;
      v_directory varchar2(200) := 'c:\backup'; -- change this if the backup should go on a different drive
      path varchar2(100) := 'back_up' || to_char(sysdate, 'dd_mm_yyyy-hh24_mi_ss');
      -- note: no spaces around '=' in exp parameters
      v_exp varchar2(200) := 'exp hamada/hamada2013@orcl file=' || v_directory || '\' || path || '.dmp';
    begin
      host(v_exp);
      alrt := show_alert('MSG');
    end;
    and the import button's trigger is like this:
    declare
      alrt number;
      v_ixp varchar2(200) := 'imp userid=hamada/hamada2013@orcl file=c:\backup2\back.dmp full=yes';
    begin
      host(v_ixp);
      alrt := show_alert('MSG');
      ref_list;
    end;
    I have just one table, "phone".
    This code works, but it exports not only the data but also the creation of the table. For example, I do the export and everything is good: I find the .dmp in the backup folder. But when I delete all the data from my app and try to import this .dmp, it shows me an error telling me that the table "phone" already exists.
    So please help: I want to export just the data of "phone", not the table creation plus the data. Or how can I import just the data from this .dmp?

    Please post OS and database versions.
    You will likely need to use the IGNORE flag - http://docs.oracle.com/cd/E11882_01/server.112/e22490/original_import.htm#sthref2201
    The imp utility will, by default, try to create the table first. Since the table already exists, imp reports an error. Use IGNORE=Y in your imp command so that imp skips the creation error and proceeds to insert the rows.
    HTH
    Srini
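    For example, a minimal sketch reusing the command from the original post (credentials and the path are the poster's; adjust to your environment):
    imp userid=hamada/hamada2013@orcl file=c:\backup2\back.dmp full=yes ignore=y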

  • Export and import of a 3 GB table

    Dear all,
    I have a table with more than 3 million records. I need to export this table and import it into a different database. Can you please help me with the export and import options so that I can speed up the process?
    For your information, this table has 3 FKs and 2 indexes.
    I would appreciate your help in this regard.
    Thanks

    One way to speed up the exp/imp process is to use the DIRECT=Y option. Also increase the size of the BUFFER parameter.
    When importing a large amount of data it is better to drop referential constraints and indexes and recreate them after the import.
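    A sketch of what that could look like (user, table, and file names are placeholders; note that with DIRECT=Y the exp utility sizes its I/O via RECORDLENGTH rather than BUFFER, while BUFFER applies on the imp side):
    exp scott/tiger tables=big_table file=big_table.dmp direct=y recordlength=65535
    imp scott/tiger file=big_table.dmp tables=big_table ignore=y buffer=10485760 commit=y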

  • How to export and import only data, not table structure

    Hi guys,
    I am not very familiar with the import/export utilities; please help me.
    I have two schemas: Schema1 and Schema2.
    I have been using Schema1, and my valuable data is in it. Now I want to move this data from Schema1 to Schema2.
    In Schema2 I have only the table structure, no data.

    user1118517 wrote:
    Hi guys,
    I am not very familiar with the import/export utilities; please help me.
    I have two schemas: Schema1 and Schema2.
    I have been using Schema1, and my valuable data is in it. Now I want to move this data from Schema1 to Schema2.
    In Schema2 I have only the table structure, no data.
    Nothing wrong with exporting the structure. Just use 'ignore=y' on the import. When it tries to do the CREATE TABLE, the CREATE statement will fail because the table already exists, but ignore=y means "ignore failures of CREATE", and it will then proceed to INSERT the data.
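    A sketch of the full round trip (user names and the file name are placeholders):
    exp system/***** owner=schema1 file=schema1.dmp
    imp system/***** file=schema1.dmp fromuser=schema1 touser=schema2 ignore=y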

  • Export and import problems

    I have 2 major problems importing my application from htmldb.oracle.com to my server at home.
    1. I exported my application and themes from htmldb.oracle.com. When I opened the SQL export file I saw that all the Hebrew characters were gibberish. How can I export my application with Hebrew fonts?
    2. I tried to export and import XML data (my tables' data). The export XML file shows me the Hebrew data, but when I import it into my tables I get ????????? instead of Hebrew.
    I think it's the NLS language, but I don't know which NLS language I need and where to change it (database, environment, ...).
    Can someone help me with these problems?
    Thanks
    P.S.
    My DB is 10.1.0.3 on Red Hat Linux AS 4 on VMware 4.5.
    My NLS parameters:
    nls_language: american
    nls_territory: america
    nls_characterset: we8iso8859p1
    nls_nchar_characterset: al16utf16
    Thanks

    Zvika,
    I have two options that may work for you. First, define the fields that hold Hebrew characters using the CHAR qualifier, for example varchar2(30 char). That allows you to store multi-byte characters from other languages, and it overrides the nls_length_semantics (byte by default) parameter; see the sketch after this reply.
    Or, you could set nls_length_semantics to default to char. It definitely uses more space when the char qualifier is used, so it's probably best to override the byte default only where necessary.
    Hope that helps!
    Also, I posted the 'Blogging Tool for HTMLDB' question with the username 'aufan89', which I had to change because my email changed.
    So my new username is now 'aufan1989'. I look forward to your email when you get your app fixed and posted onto HTMLDB Studio!
    Thanks!
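    A minimal sketch of the CHAR-qualifier option mentioned above (the table and column names are made up):
    CREATE TABLE demo_heb (name VARCHAR2(30 CHAR)); -- 30 characters, not 30 bytes
    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;  -- or switch the session default before creating objects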

  • Exporting and importing a table using the R3trans program between 2 clients

    Hi,
    How do I export and import a table between two clients in the same system using the R3trans program?
    I need to copy a table from client 020 to client 040 of the same system using R3trans. I need to know the procedure.
    Can any one advice
    Regards,
    Suresh

    This is how you do an export and import of table entries.
    Export:
    Open Notepad and type the following:
    export
    client = 020
    file = 'clone.export.<sid>.<client no>.data'
    select * from <client_dependent_tablename1>
    select * from <client_dependent_tablename2>
    select * from <client_dependent_tablenamen>
    Save the file as export.ctl and run: R3trans export.ctl
    The data of these tables will be stored in a file called clone.export.<sid>.<client no>.data in the directory from which you called R3trans.
    Import:
    Open Notepad and type:
    import
    client = 040
    file = 'clone.export.<sid>.<client no>.data'
    buffersync = yes
    Save the file as import.ctl and run: R3trans import.ctl
    Cheers!
    Bidwan

  • Exporting and importing just table definitions

    Hi,
    I have a production database that holds a huge amount of data. I was asked to set up a test database based on the exact same schema as the live database. When I tried to do an export (from live) and import (to test) with the parameters rows=n and compress=y, the data files in the target (test) database still grew enormously, presumably because of the huge number of extents already allocated to the tables in the live database. My test database, of course, has limited hard-disk space.
    Is there a way to export and import the table definitions without the target database experiencing huge growth in the size of its tablespaces?
    Thanks,
    Chris.

    If an export with compress=n still creates initial extents that are too large, you can still build from the export file, but it will take a little work:
    run imp with indexfile=somefile.sql
    when imp is finished, edit somefile.sql:
    1. remove all the REM statements.
    2. remove all the storage clauses (tables and indexes).
    Run the edited somefile.sql to pre-create the objects, making sure your tablespaces have a small (say 1K) default initial extent.
    Then run imp again with rows=n.
    All your tables and indexes will be created with the tablespace's default initial extent.
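    Spelled out as commands, that sequence might look like this (credentials and file names are placeholders):
    imp system/***** file=prod.dmp full=y rows=n indexfile=somefile.sql
    -- edit somefile.sql as described, run it in SQL*Plus to create the objects, then:
    imp system/***** file=prod.dmp full=y rows=n ignore=y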

  • How to export and import dependent tables from 2 different schemas

    I have a setup where schema1 has table1 and schema2 has table2, and table1 from schema1 depends on table2 from schema2.
    I would like to export and import only these tables, not any other tables from these 2 schemas, with all information such as grants and constraints.
    Also, will the method be the same for Oracle 10g R1, R2 and Oracle 11g?
    http://download.oracle.com/docs/cd/B12037_01/server.101/b10825/dp_export.htm#i1007514
    Looking at this, for table mode it says:
    "Also, as in schema exports, cross-schema references are not exported"
    Not sure what this means.
    As I am interested in only 2 tables, I think I need to use table mode. But if I try to run the export with both table names, it says table mode supports only one schema at a time. Not sure how the constraints would get exported in that case.
    -Rohit

    It worked the first time I tried it:
    exp file=table2.dmp tables="dbadmin.temp1,scott.emp"
    Export: Release 10.2.0.1.0 - Production on Mon Mar 1 16:32:07 2010
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Username: / as sysdba
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Export done in US7ASCII character set and AL16UTF16 NCHAR character set
    server uses WE8ISO8859P1 character set (possible charset conversion)
    About to export specified tables via Conventional Path ...
    Current user changed to DBADMIN
    . . exporting table                          TEMP1         10 rows exported
    EXP-00091: Exporting questionable statistics.
    Current user changed to SCOTT
    . . exporting table                            EMP         14 rows exported
    EXP-00091: Exporting questionable statistics.
    EXP-00091: Exporting questionable statistics.
    Export terminated successfully with warnings.
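    For what it's worth, in the original exp utility constraints and grants are controlled by explicit flags (both default to Y), so the same two-table export can be made explicit like this:
    exp file=table2.dmp tables="dbadmin.temp1,scott.emp" constraints=y grants=y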

  • Internal table export and import in ECC 5.0 version

    Hi friends,
    I am trying to export and import an internal table from one program to another program.
    The EXPORT and IMPORT commands below are not working when I run the program in background (using SUBMIT zxxxx VIA JOB name NUMBER number...):
    EXPORT itab TO MEMORY ID 'ZMATERIAL_CREATE'.
    IMPORT itab FROM MEMORY ID 'ZMATERIAL_CREATE'.
    Normally it should work. Since it's not working, I am trying another alternative,
    i.e. EXPORT (ptab) INTERNAL TABLE itab.
    My SAP version is ECC 5.0.
    For your information, here I am forwarding the SAP help. Please have a look and explain how to declare the ptab internal table.
    Extract from SAP help:
    In the dynamic case the parameter list is specified in an index table ptab with two columns. These columns can have any name and have to be of the type "character". In the first column of ptab, you have to specify the names of the parameters and in the second column the data objects. If the second column is initial, then the name of the parameter in the first column has to match the name of a data object. The data object is then stored under its name in the cluster. If the first column of ptab is initial, an uncatchable exception will be raised.
    Outside of classes you can also use a single-column internal table for parameter_list for the dynamic form. In doing so, all data objects are implicitly stored under their name in the data cluster.
    My internal table has around 45 columns.
    Please help me.
    Thanks in advance
    raghunath

    The export/import should work the way you are using it. Just make sure you are using the same memory ID, and make sure it's unique - meaning you use it only for this itab and do not overwrite it with other values. Check that itab is not initial before you EXPORT it in program 1, then IMPORT it in program 2 with the same memory ID. Also check the case; I am not sure if it's case-sensitive.
    Here is how you use the second variant:
    Two fields with two different identifications "P1" and "P2" are written to the ABAP memory with the dynamic variant of the cluster definition. After execution of the IMPORT statement, the contents of the fields text1 and text2 are interchanged.
    TYPES:
      BEGIN OF tab_type,
        para TYPE string,
        dobj TYPE string,
      END OF tab_type.
    DATA:
      id    TYPE c LENGTH 10 VALUE 'TEXTS',
      text1 TYPE string VALUE `IKE`,
      text2 TYPE string VALUE `TINA`,
      line  TYPE tab_type,
      itab  TYPE STANDARD TABLE OF tab_type.
    line-para = 'P1'.
    line-dobj = 'TEXT1'.
    APPEND line TO itab.
    line-para = 'P2'.
    line-dobj = 'TEXT2'.
    APPEND line TO itab.
    EXPORT (itab)     TO MEMORY ID id.
    IMPORT p1 = text2
           p2 = text1 FROM MEMORY ID id.

  • Shell Script For Export And Import Of Table Records

    Hello,
    We have production and test instances, and for constant testing we need to copy data from production to the test or development environment.
    At the moment we do the export and import of table records manually. At times this can be very tedious, as we may need to do this exercise a couple of times in a day.
    Is it a good idea to do this exercise using a shell script? If so, how could I do it? If it is not a good idea, what are the best alternatives?
    Any input is highly appreciated.
    Thanks

    Ah I see, your company prefers stupidity over efficiency. It would be possible to do it in a controlled environment, wouldn't it? Also the test database would be allowed to select only.
    So the non-allowance is just plain stupid.
    To the second question: do you use hard-coded passwords in shell scripts?
    Don't you think that poses a security risk?
    Don't you think that is a bigger risk than a database link, properly set up?
    In my book it is!
    Sybrand Bakker
    Senior Oracle DBA
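    For completeness, a minimal shell sketch of the exp/imp round trip that at least avoids a hard-coded password by prompting for it (user names, TNS aliases, and the table name are placeholders):
    #!/bin/bash
    read -s -p "Source password: " SRC_PW; echo
    read -s -p "Target password: " TGT_PW; echo
    exp prod_user/"$SRC_PW"@PROD tables=mytable file=/tmp/mytable.dmp
    imp test_user/"$TGT_PW"@TEST file=/tmp/mytable.dmp fromuser=prod_user touser=test_user ignore=y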

  • Export and import of a table's data

    Hi All,
    I have a situation where I need to append/import the production table (X) data into the test table (X), which already has some data in it that must not be lost during the operation. The record count is around 6 million.
    Any expert suggestions or tips on the export and import commands are highly appreciated.
    Thanks in advance.

    If you are using:
    IMPDP -- use TABLE_EXISTS_ACTION=APPEND
    IMP -- use IGNORE=Y
    For example:
    impdp username/pwd directory=<directory_name> dumpfile=<dumpfile_name> tables=X table_exists_action=APPEND
    This will append the data to the existing table.
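    The classic imp equivalent would be something like (the file name is a placeholder):
    imp username/pwd file=<dumpfile_name> tables=X ignore=y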

  • Export and Import of table

    Dears,
    I need to export two tables from PRD and import them into QAS.
    The tables are 1 GB and 3 GB in size.
    Please confirm:
    1 - Do I need any downtime for the export and import of the tables?
    2 - How much time will the export process take?
    Please suggest.
    Shivam

    I cannot tell you how much time it will take. Are you still using Pentium 3 processors? Are you on 4 quad-core CPUs? Are you on IDE disks? SATA disks? On a fast, big, powerful SAN?
    This being said, I would say roughly 10 minutes, but as you understand, this is just a wild guess.
    As for downtime, you did not tell us the table names. This being said, try to export during a low-usage period.
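    If a consistent image of the two tables matters more than downtime, one option with the classic exp utility is a consistent snapshot taken while the system stays up (owner and table names are placeholders):
    exp system/***** tables=owner.tab1,owner.tab2 file=prd_tables.dmp consistent=y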

  • Export and import the internal table

    Hi,
    Could you explain how we export and import an entire internal table's data? Please give an example.
    Thanks,
    Hari

    Hi,
    Check this thread:
    exporting internal table to memory variable

  • Export and import XMLType table

    Hi,
    I want to export one table which contains an XMLType column from Oracle 11.2.0.1.0 and import it into 11.2.0.2.0.
    I got the following error when I exported the table with the exp utility:
    EXP-00107: Feature (BINARY XML) of column ZZZZ in table XXXX.YYYY is not supported. The table will not be exported.
    Then I tried export and import with Data Pump. The export works; following is the log:
    Export: Release 11.2.0.1.0 - Production on Wed Oct 17 17:53:41 2012
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ;;; Legacy Mode Active due to the following parameters:
    ;;; Legacy Mode Parameter: "log=<xxxxx>Oct17.log" Location: Command Line, Replaced with: "logfile=T<xxxxx>_Oct17.log"
    ;;; Legacy Mode has set reuse_dumpfiles=true parameter.
    Starting "<xxxxx>"."SYS_EXPORT_TABLE_01": <xxxxx>/******** DUMPFILE=<xxxxx>Oct172.dmp TABLES=<xxxxx>.<xxxxx> logfile=<xxxxx>Oct17.log reusedumpfiles=true
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 13.23 GB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "<xxxxx>"."<xxxxx>" 13.38 GB 223955 rows
    Master table "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    Dump file set for <xxxxx>.SYS_EXPORT_TABLE_01 is:
    E:\ORACLEDB\ADMIN\LOCALORA11G\DPDUMP\<xxxxx>OCT172.DMP
    Job "<xxxxx>"."SYS_EXPORT_TABLE_01" successfully completed at 20:30:14
    I got an error when I imported the dump using the following command:
    impdp sys_dba/***** dumpfile=XYZ_OCT17_2.DMP logfile=import_vmdb_XYZ_Oct17_2.log FROMUSER=XXXX TOUSER=YYYY CONTENT=DATA_ONLY TRANSFORM=oid:n TABLE_EXISTS_ACTION=append
    The error is:
    KUP-11007: conversion error loading table "CC_DBA"."XXXX"
    ORA-01403: no data found
    ORA-31693: Table data object "XXX_DBA"."XXXX" failed to load/unload and is being skipped due to error:
    Please help me to get solution for this.

    CREATE UNIQUE INDEX "XXXXX"."XXXX_XBRL_XMLINDEX_IX" ON "CCCC"."XXXX_XBRL" (EXTRACTVALUE(SYS_MAKEXML(128,"SYS_NC00014$"),'/xbrl'))
    The above index was created by us because we are storing files like:
    <?xml version="1.0" encoding="UTF-8"?>
    <xbrl xmlns="http://www.xbrl.org/2003/instance" xmlns:AAAAAA="http://www.AAAAAA.COM/AAAAAA" xmlns:ddd4217="http://www.xbrl.org/2003/ddd4217" xmlns:link="http://www.xbrl.org/2003/linkbase" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <link:schemaRef xlink:href="http://www.fsm.AAAAAA.COM/AAAAAA-xbrl/v1/AAAAAA-taxonomy-2009-v2.11.xsd" xlink:type="simple" />
    <context id="Company_Current_ForPeriod"> ...
    I tried exporting the dump with and without DATA_OPTIONS=XML_CLOBS too. Both times the export succeeded, but the import got the same KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL" error.
    I tried the import in different ways:
    1. Create the table first, then import data only (CONTENT=DATA_ONLY)
    2. Import the whole table
    In both cases the table DDL is created successfully, but it fails on importing the data. Following is the log from the import:
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "aaaaa"."SYS_IMPORT_TABLE_02" successfully loaded/unloaded
    Starting "aaaaa"."SYS_IMPORT_TABLE_02":  aaaaa/********@vm_dba TABLES=tab_owner.bbbbb_XBRL dumpfile=bbbbb_XBRL_OCT17_2.DMP logfile=import_vmdb
    bbbbb_XBRL_Oct29.log DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    KUP-11007: conversion error loading table "tab_owner"."bbbbb_XBRL"
    ORA-01403: no data found
    ORA-31693: Table data object "tab_owner"."bbbbb_XBRL" failed to load/unload and is being skipped due to error:
    ORA-29913: error in executing ODCIEXTTABLEFETCH callout
    ORA-26062: Can not continue from previous errors.
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Job "aaaaa"."SYS_IMPORT_TABLE_02" completed with 1 error(s) at 18:42:26

  • Export and import a table's data

    Hi All,
    There is a table abc which has approximately 40 lakh (4 million) records and no partitions (and I have a dump of this data). Due to space constraints I want to compress the data and put it in an archive schema.
    My questions are:
    1) Can we import the data into another schema (say, an archive schema) in compressed mode? If so, could you please let me know the syntax for that?
    2) Can we import the data into some temporary table (say, temp) in the same schema?
    3) And if a BLOB column exists in the table, can we import the data without the BLOB column's data (supposing the export dump contains the BLOB data)? Is that possible, or would we have to export without the BLOB column and import that into the other table?
    Please help me out.

    Thanks Mahir,
    Could you please let me know the syntax, with examples if possible?
    Also, is it possible to drop a column, say the BLOB column, after the table is in compressed mode?
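    No syntax was posted in the thread, but here is a sketch of one possible approach to question 1 (all names are placeholders; it assumes a Data Pump dump of table abc and an existing archive schema): pre-create the table with COMPRESS in the archive schema, then append into it.
    CREATE TABLE archive.abc COMPRESS AS SELECT * FROM orig_schema.abc WHERE 1 = 0;  -- empty compressed copy
    impdp system/***** dumpfile=abc.dmp remap_schema=orig_schema:archive tables=orig_schema.abc table_exists_action=append
    Note that basic table compression in 11g only compresses direct-path loads, so rows loaded conventionally will remain uncompressed.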
