Importing tables into a data tablespace and indexes into an index tablespace

Hi,
I want to import data into a new schema, storing the tables in a data tablespace and the indexes in an index tablespace. Can anyone tell me how this is possible?

imp userid=user/passwd show=y indexfile=import.sql indexes=n full=y
imp userid=user/passwd show=y indexfile=import2.sql full=y
Edit import.sql and import2.sql to set the desired tablespace for the tables and for the indexes.
Run the edited import.sql script in the database; this creates the tables in their target tablespace.
imp userid=user/passwd full=y ignore=y indexes=n constraints=y -- imports just the data, since the tables have already been created.
imp userid=user/passwd full=y ignore=y rows=n -- imports just the indexes, since the tables and data are already in place (run the edited index DDL from import2.sql instead if the indexes must go to the index tablespace).
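For illustration, an entry in the generated indexfile before and after editing might look like this (schema, table, index, and tablespace names are assumptions):
REM  CREATE TABLE "SCOTT"."EMP" ("EMPNO" NUMBER(4), "ENAME" VARCHAR2(10)) TABLESPACE "USERS";
after removing the REM prefix and changing the tablespace clause:
CREATE TABLE "SCOTT"."EMP" ("EMPNO" NUMBER(4), "ENAME" VARCHAR2(10)) TABLESPACE "DATA_TS";
CREATE INDEX "SCOTT"."EMP_IDX" ON "SCOTT"."EMP" ("EMPNO") TABLESPACE "INDEX_TS";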

Similar Messages

  • How to import data into one tablespace and indexes into another tablespace

    I have imported a dump from a database in Oracle 10g as
    c:> imp userid=system/password full=y file=d:\ful.dmp log=d:\full.log
    Now I want to import the table data into tablespace datatb and the indexes into tablespace indextb. How can I do this?
    Thanks

    After importing the database you may move the indexes to the other tablespace by rebuilding them.
    c:> sqlplus /nolog
    SQL> conn / as sysdba
    Connected.
    SQL> set heading off feedback off pagesize 0 linesize 200
    SQL> spool c:\indx_rbld.sql
    SQL> select 'alter index '||owner||'.'||index_name||' rebuild online parallel tablespace <tablespace_name> nologging;' from dba_indexes where owner = '<USERNAME>';
    SQL> spool off
    SQL> @c:\indx_rbld.sql
    Hope the following link will help you:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:901906930328
    Message was edited by:
    Santosh Kumar
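    To move the tables themselves into the data tablespace, a similarly generated script can be used (placeholder names again; note that moving a table marks its indexes UNUSABLE, so run the index rebuild script afterwards):
    SQL> spool c:\tbl_move.sql
    SQL> select 'alter table '||owner||'.'||table_name||' move tablespace <data_tablespace>;' from dba_tables where owner = '<USERNAME>';
    SQL> spool off
    SQL> @c:\tbl_move.sql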

  • After moving tables and indexes to another tablespace, indexes became unusable?

    Hi,
    I moved a table and its indexes to another tablespace. After the move, many indexes are in the UNUSABLE state. I could not understand this.
    How can I make them usable again?
    Regards and thanks

    hi,
    Moving a table rewrites its rows, so their rowids change and every index on the table is marked UNUSABLE until it is rebuilt. Check the example below:
    create table test (id number(10));
    create table succeeded.
    insert into test values (10);
    1 row inserted
    commit;
    create index test_indx on test(id);
    select index_name, status from user_indexes
    where table_name = 'TEST';
    INDEX_NAME  STATUS
    TEST_INDX   VALID
    alter table test move tablespace users;
    select index_name, status from user_indexes
    where table_name = 'TEST';
    INDEX_NAME  STATUS
    TEST_INDX   UNUSABLE
    Rebuilding the index:
    alter index test_indx rebuild;
    alter index test_indx succeeded.
    Then check again:
    select index_name, status from user_indexes
    where table_name = 'TEST';
    INDEX_NAME  STATUS
    TEST_INDX   VALID
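    To rebuild everything a move has left unusable in one pass, a generated script along the same lines can be used (the target tablespace name is a placeholder):
    select 'alter index '||index_name||' rebuild tablespace <index_tablespace>;'
    from user_indexes
    where status = 'UNUSABLE';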

  • DSO New data table rejects data

    Dear SDNers,
    I have a critical issue.
    I am loading data into a DSO which has an end routine that looks up another DSO of 2 crore (20 million) records in production.
    In development it worked well.
    In production it takes a very long time to load data. I initially thought it might be because of the lookup DSO, but it never loads the data into the new data table; the data load monitor always stays yellow.
    I used a filter in the DTP and loaded only limited data, but this also does not load; the monitor stays yellow again.
    Then I found that I am unable to access the new data table's data browser (from Manage).
    I checked with SE11 as well; I am able to see the table, but if I click Contents the system hangs.
    So what I observe is that the new data table is not allowing any new entries to be posted.
    Kindly give me some insight into solving this issue.
    Thanks,
    Guru

    Hi Prasanth,
    I didn't say that I am able to see data in the change log; I said I am able to access the data browser of both the active and change log tables.
    And @Saveen, I do not want to load data manually into the new data table.
    To be precise, I'll answer Prasanth's questions here:
    What is the status of the load in the monitor screen? Yellow (still running).
    Is the load completed or not? What is the record count? No, the load is not completing.
    Is the request active in the DSO and available for reporting? The request is active.
    Are you facing this issue for the first time? Yes, for the first time, and only in this DSO.
    Are you sure you have authorizations to check the data through the manage screen? Try to run the authorization trace on your id and check whether the roles are there for your id or not. Yes, I am able to see the active data table, change log, etc., and even the new table of other DSOs. Only the new table of this DSO is inaccessible from Manage.
    See, I used a filter in the DTP, so it will bring only limited records from the lookup DSO, so there is no performance deadlock issue.
    And again, I simulated the load and saw that the result package gets filled perfectly, so the code also works fine.
    But if I load the data, it stays in yellow status with no records added to the new table.
    In this scenario the new table is inaccessible from Manage, so I am pretty sure that these are interrelated issues.
    Can someone help me please?
    Thanks,
    Guru

  • What are the differences between the target tablespace and the source tablespace?

    The IMPDP command produces many errors, but the EXAMPLE tablespace is transported to the target database successfully. It seems that the transported tablespace is no different from the source tablespace.
    Why are there so many errors?
    How can these errors be avoided?
    What are the differences between the target tablespace and the source tablespace?
    Is this Data Pump operation really successful?
    The following is the log output:
    [oracle@hostp ~]$ impdp system/oracle dumpfile=user_dir:demo02.dmp tablespaces=example remap_tablespace=example:example
    Import: Release 10.2.0.1.0 - Production on Sunday, 28 September, 2008 18:08:31
    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLESPACE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLESPACE_01": system/******** dumpfile=user_dir:demo02.dmp tablespaces=example remap_tablespace=example:example
    Processing object type TABLE_EXPORT/TABLE/TABLE
    ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
    CREATE TABLE "OE"."CUSTOMERS" ("CUSTOMER_ID" NUMBER(6,0), "CUST_FIRST_NAME" VARCHAR2(20) CONSTRAINT "CUST_FNAME_NN" NOT NULL ENABLE, "CUST_LAST_NAME" VARCHAR2(20) CONSTRAINT "CUST_LNAME_NN" NOT NULL ENABLE, "CUST_ADDRESS" "OE"."CUST_ADDRESS_TYP" , "PHONE_NUMBERS" "OE"."PHONE_LIST_TYP" , "NLS_LANGUAGE" VARCHAR2(3), "NLS_TERRITORY" VARCHAR2(30), "CREDIT_LIMIT" NUMBER(9,2), "CUST_EMAIL" VARCHAR2(30), "ACCOUNT_MGR_ID" NU
    ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
    ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
    CREATE TABLE "IX"."ORDERS_QUEUETABLE" ("Q_NAME" VARCHAR2(30), "MSGID" RAW(16), "CORRID" VARCHAR2(128), "PRIORITY" NUMBER, "STATE" NUMBER, "DELAY" TIMESTAMP (6), "EXPIRATION" NUMBER, "TIME_MANAGER_INFO" TIMESTAMP (6), "LOCAL_ORDER_NO" NUMBER, "CHAIN_NO" NUMBER, "CSCN" NUMBER, "DSCN" NUMBER, "ENQ_TIME" TIMESTAMP (6), "ENQ_UID" VARCHAR2(30), "ENQ_TID" VARCHAR2(30), "DEQ_TIME" TIMESTAMP (6), "DEQ_UID" VARCHAR2(30), "DEQ_
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    . . imported "SH"."CUSTOMERS" 9.850 MB 55500 rows
    . . imported "SH"."SUPPLEMENTARY_DEMOGRAPHICS" 695.9 KB 4500 rows
    . . imported "OE"."PRODUCT_DESCRIPTIONS" 2.379 MB 8640 rows
    . . imported "SH"."SALES":"SALES_Q4_2001" 2.257 MB 69749 rows
    . . imported "SH"."SALES":"SALES_Q1_1999" 2.070 MB 64186 rows
    . . imported "SH"."SALES":"SALES_Q3_2001" 2.129 MB 65769 rows
    . . imported "SH"."SALES":"SALES_Q1_2000" 2.011 MB 62197 rows
    . . imported "SH"."SALES":"SALES_Q1_2001" 1.964 MB 60608 rows
    . . imported "SH"."SALES":"SALES_Q2_2001" 2.050 MB 63292 rows
    . . imported "SH"."SALES":"SALES_Q3_1999" 2.166 MB 67138 rows
    Processing object type TABLE_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."REGIONS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."REGIONS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."COUNTRIES" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."COUNTRIES" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."LOCATIONS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."LOCATIONS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."DEPARTMENTS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."DEPARTMENTS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOBS" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOBS" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."EMPLOYEES" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."EMPLOYEES" TO "EXAM_03"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'USER1' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOB_HISTORY" TO "USER1"
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01917: user or role 'EXAM_03' does not exist
    Failing sql is:
    GRANT SELECT ON "HR"."JOB_HISTORY" TO "EXAM_03"
    ORA-39112: Dependent object type OBJECT_GRANT:"OE" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type OBJECT_GRANT:"OE" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"OE"."CUSTOMERS_PK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"OE"."CUST_ACCOUNT_MANAGER_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"OE"."CUST_LNAME_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"OE"."CUST_EMAIL_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type INDEX:"PM"."PRINTMEDIA_PK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMER_CREDIT_LIMIT_MAX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMER_ID_MIN" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"OE"."CUSTOMERS_PK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"PM"."PRINTMEDIA__PK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    ORA-39112: Dependent object type CONSTRAINT:"IX"."SYS_C005192" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUSTOMERS_PK" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_ACCOUNT_MANAGER_IX" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_LNAME_IX" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_EMAIL_IX" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"PM"."PRINTMEDIA_PK" creation failed
    Processing object type TABLE_EXPORT/TABLE/COMMENT
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type COMMENT skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    ORA-39112: Dependent object type REF_CONSTRAINT:"OE"."CUSTOMERS_ACCOUNT_MANAGER_FK" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39083: Object type REF_CONSTRAINT failed to create with error:
    ORA-00942: table or view does not exist
    Failing sql is:
    ALTER TABLE "OE"."ORDERS" ADD CONSTRAINT "ORDERS_CUSTOMER_ID_FK" FOREIGN KEY ("CUSTOMER_ID") REFERENCES "OE"."CUSTOMERS" ("CUSTOMER_ID") ON DELETE SET NULL ENABLE
    ORA-39112: Dependent object type REF_CONSTRAINT:"PM"."PRINTMEDIA_FK" skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    Processing object type TABLE_EXPORT/TABLE/TRIGGER
    ORA-39082: Object type TRIGGER:"HR"."SECURE_EMPLOYEES" created with compilation warnings
    ORA-39082: Object type TRIGGER:"HR"."SECURE_EMPLOYEES" created with compilation warnings
    ORA-39082: Object type TRIGGER:"HR"."UPDATE_JOB_HISTORY" created with compilation warnings
    ORA-39082: Object type TRIGGER:"HR"."UPDATE_JOB_HISTORY" created with compilation warnings
    Processing object type TABLE_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    ORA-39112: Dependent object type INDEX:"OE"."CUST_UPPER_NAME_IX" skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"OE"."CUST_UPPER_NAME_IX" creation failed
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"OE"."CUSTOMERS" creation failed
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"PM"."PRINT_MEDIA" creation failed
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    Processing object type TABLE_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
    Processing object type TABLE_EXPORT/TABLE/POST_INSTANCE/PROCACT_INSTANCE
    ORA-39112: Dependent object type PROCACT_INSTANCE skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39083: Object type PROCACT_INSTANCE failed to create with error:
    ORA-01403: no data found
    ORA-01403: no data found
    Failing sql is:
    BEGIN
    SYS.DBMS_AQ_IMP_INTERNAL.IMPORT_SIGNATURE_TABLE('AQ$_ORDERS_QUEUETABLE_G');COMMIT; END;
    Processing object type TABLE_EXPORT/TABLE/POST_INSTANCE/PROCDEPOBJ
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."AQ$_ORDERS_QUEUETABLE_V" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE_N" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE_R" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."AQ$_ORDERS_QUEUETABLE_E" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    ORA-39112: Dependent object type PROCDEPOBJ:"IX"."ORDERS_QUEUE" skipped, base object type TABLE:"IX"."ORDERS_QUEUETABLE" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLESPACE_01" completed with 63 error(s) at 18:09:14

    Short of trying to then reverse-engineer the objects that are in the dump file (I believe Data Pump export files contain some XML representations of DDL in addition to various binary bits, making it potentially possible to try to scan the dump file for the object definitions), I would tend to assume that the export didn't include those type definitions.
    Since it looks like you're trying to set up the sample schemas, is there a reason that you wouldn't just run the sample schema setup scripts on the destination database? Why are you using Data Pump in the first place?
    Justin
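    For reference, a schema-mode export would normally carry the OE, PM and IX object type definitions along with their tables, unlike the tablespace-mode dump shown above; a minimal sketch (directory and dump file names are assumptions):
    expdp system/oracle schemas=OE,PM,IX directory=user_dir dumpfile=demo_schemas.dmp
    impdp system/oracle schemas=OE,PM,IX directory=user_dir dumpfile=demo_schemas.dmp remap_tablespace=example:example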

  • Setting the default permanent tablespace and default temporary tablespace

    Is there a way to set the default permanent tablespace and default temporary tablespace database-wide? I want to assign default and temporary tablespaces other than SYSTEM to all new users without having to explicitly name the tablespaces for each user.

    Hi,
    Well, I assume the OP is using 10g. As you said, Oracle 9i made it possible to specify a database-wide default temporary tablespace. Oracle 10g adds a database-wide default permanent tablespace, which is then used by default for all users. Before Oracle 10g, if no default permanent tablespace was specified when creating a user, the SYSTEM tablespace was used as the default.
    Cheers
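    A minimal sketch of the corresponding 10g statements (tablespace names are assumptions):
    ALTER DATABASE DEFAULT TABLESPACE users_data;
    ALTER DATABASE DEFAULT TEMPORARY TABLESPACE temp;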

  • So, my phone won't connect to my Mac: when I plug the USB cable into my computer and then into my phone, it won't charge or connect. How should I fix this?

    So, my phone won't connect to my Mac: when I plug the USB cable into my computer and then into my phone, it won't charge or connect. How should I fix this?

    Test with another computer and another cable to discover whether the problem is with the phone, the cable, or the computer.

  • Move large tables and indexes into their own tablespaces

    I currently manage a 100Gb 10.2.0.4 SE database on Windows.
    There is one data tablespace and one index tablespace.
    I have one table xxxHistory that is periodically cleared out, however, six months of data must be retained.
    The table is currently 17Gb and has 95 million rows, the corresponding nine or so indexes take another 47Gb.
    I am having a small problem with I/O waits on this table, so I want to move this table and its indexes to their own tablespaces, which I will create (xxxHistory_D and xxxHistory_I).
    I know the two methods: exp/imp (difficult due to foreign keys) and the preferred method of:
    alter table tbl move tablespace tblsp and
    alter index ind rebuild tablespace tblsp.
    I have no problems with the syntax etc, having used this method many times.
    My question is, does anyone have a better idea of how to approach this to minimise downtime?
    The system cannot be used if this table is not available.
    I am also going to migrate to 11.2x when available but can't find anything in the new features to help.
    Note, this is SE, so partitioning is not an option and once I have sorted this table out, I will unchain the rows of any other and reorganise the space.
    Disk space is not an issue.
    Thanks,

    BigPhil wrote:
    Note, this is SE, so partitioning is not an option and once I have sorted this table out, I will unchain the rows of any other and reorganise the space.
    Disk space is not an issue.
    Strategically this sounds as if you really do need partitioning; since you're on SE, you could consider the v7 "partition view" concept.
    Create one table per month, and index each table separately.
    Add a constraint to each table of the form: "movement date between to_date('...') and to_date('...')"
    Create a union all view of the tables.
    Getting rid of a single month means redefining the view.
    In theory any queries you do should benefit from predicate pushdown and thus eliminate redundant partitions, and the optimizer should also be able to push joins down into the union all view. But that's something you would have to test carefully.
    You could even create the tables as index organized tables - which may be the solution to your I/O wait problems - if your queries are about stock movement then all the movements for a given stock will be thinly scattered across the table, leading to one block I/O per row required. IOTs would give you an overhead on inserts, but eliminate waits on queries.
    Regards
    Jonathan Lewis
    http://jonathanlewis.wordpress.com
    http://www.jlcomp.demon.co.uk
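    Purely as an illustration of the partition view approach described above, a sketch with assumed table, column, and tablespace names:
    create table xxxhistory_2010_01
      tablespace xxxhistory_d
      as select * from xxxhistory where 1 = 0;
    alter table xxxhistory_2010_01 add constraint xxxhistory_2010_01_ck
      check (movement_date >= to_date('2010-01-01','yyyy-mm-dd')
         and movement_date <  to_date('2010-02-01','yyyy-mm-dd'));
    create index xxxhistory_2010_01_ix
      on xxxhistory_2010_01 (stock_id, movement_date)
      tablespace xxxhistory_i;
    -- repeat the three statements above for each month, then union them all:
    create or replace view xxxhistory_v as
      select * from xxxhistory_2010_01
      union all
      select * from xxxhistory_2010_02;
    Dropping a month then means dropping its table and recreating the view without it.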

  • Where should I start to import my table of data to FLA?

    Good Morning,
    I am a cartographer/graphics specialist by trade but am working to apply these skills to Flash. I've had great luck so far moving my projects out of Illy and making them interactive through AS3.
    I am at the point where I'd like to incorporate tables of attribute data specific to my movie clips, for example tables of demographic info specific to US counties. I'd like to make these data queryable by clicking on a specific county (mc). I've been reading some tutorials and it sounds like I should convert my data to XML (from a CSV or XLS), then import the XML to my stage and start learning the ins and outs of calling on its values.
    Am I starting down the right path? First off, I've not found a very straightforward way to get my data into XML. It originates in a DBF and I could easily convert it to tab-delimited text, CSV or anything else. Any suggestions?
    Also, should I rethink where I am starting with this and not use XML at all, maybe using something else?
    Thanks in advance...
    -josh

    For relatively simple data, using an XML file is the way to proceed. For more complex (or extensive) data, using a SQL database is the way to proceed.

  • Stored proc to pull data from INV/OM/ASCP and load into a user interface

    I'm trying to pull data from INV, OM and ASCP into a user interface. Please help me write the stored procedures: three procedures, with three custom tables for the three different source systems (OM/INV/ASCP), and also provide these three programs in a request set so that when the request set is run, it runs the programs one after the other.
    Please help me with a simple step-by-step process for doing this.
    Thanks in advance!
    Edited by: 1003397 on May 20, 2013 3:59 PM

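    No solution was posted in the thread; purely as an illustration, a skeleton of one such loader procedure might look like the following, where every table, view, and procedure name is hypothetical:
    create or replace procedure xx_load_inv_stg (errbuf out varchar2, retcode out varchar2) as
    begin
      -- xx_inv_stg and xx_inv_source_v are hypothetical staging and source objects
      insert into xx_inv_stg (item_code, onhand_qty, load_date)
      select item_code, onhand_qty, sysdate
      from xx_inv_source_v;
      commit;
      retcode := '0';   -- success, by concurrent program convention
    exception
      when others then
        errbuf := sqlerrm;
        retcode := '2'; -- error, by concurrent program convention
    end xx_load_inv_stg;
    /
    The three procedures would then be registered as concurrent programs and placed in a request set in the desired order.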

  • How to get duplicate rows in a dynamically generated data table [list items collection] and send emails in SharePoint 2010

    Hi,
    i have share point list like  below
    ID   name AdminEmail Useremail   URl   DueDate   UploadSatus
    1    ppp     [email protected]  [email protected]    url  some date    uploaded
    2    yyy       [email protected]   [email protected]   url somedate    empty
    3  xxx         [email protected]   [email protected]    url   somedate   empty
    4  jjj           [email protected]    [email protected]  url     somedate   emp
    AdminEmail and UserEmail are lookup columns.
    I am querying the list using a CAML query.
    Inside the foreach I am checking two conditions, as below.
    One is that the upload status is not empty.
    I need to send mail to the admin user; this part I have done by adding all list items to a data table and applying a group by, which works fine.
    In the second condition I am checking the difference between DueDate and the current date value
    if the value is =1 or -1
    if the value is i
    thank
    I am getting a table like the one below:
    ID   name AdminEmail Useremail   URl   DueDate   Upload
    2    yyy       [email protected]   [email protected]   url   somedate    empty
    3  xxx         [email protected]   [email protected]    url   somedate   empty
    4  jjj           [email protected]    [email protected]  url     somedate   empty
    My issue is: how can I get the dynamic table rows that have the same AdminEmail and UserEmail values as one set, and the distinct rows as another set?
    Sets with the same emails:
    3  xxx         [email protected]   [email protected]    url   somedate   empty
    4  jjj           [email protected]    [email protected]  url     somedate   empty
    set 2
    2    yyy       [email protected]   [email protected]   url   somedate    empty
    How can I separate these? Can anyone tell me? I need to send mail only one time to each user [admin and user], to avoid duplicate mails.
    Srinivas

    In your case it is better to use two data tables to store the data:
    DataTable dt = list.Items.GetDataTable();
    foreach (DataRow row in dt.Rows)
    { /* split the rows into the two data tables here */ }
  • Hide confidential data tables from SYS DBA and application owners

    Hello All,
    I have been developing an application which is very confidential, for higher-level managers. I would like my tables and views, in fact all the data, to be accessed only by my client form and not through SQL*Plus or any other tool. Is there any way of restricting users, including SYS and the application owner, from the tables which store this confidential data, using any database tool?
    Best Regards,
    Prashantha

    Hi,
    You can see [Creating Secure Application Roles to Control Access to Applications|http://download.oracle.com/docs/cd/B28359_01/network.111/b28531/app_devs.htm#i1006262]
    From [10 Keeping Your Oracle Database Secure|http://download-uk.oracle.com/docs/cd/B28359_01/network.111/b28531/guidelines.htm]:
    Use secure application roles to protect roles that are enabled by application code.
    Secure application roles allow you to define a set of conditions, within a PL/SQL package, that determine whether a user can log on to an application. Users do not need to use a password with secure application roles.
    Another approach to protecting roles from being enabled or disabled in an application is the use of role passwords. This approach prevents a user from directly accessing the database in SQL (rather than the application) to enable the privileges associated with the role. However, Oracle recommends that you use secure application roles instead, to avoid having to manage another set of passwords.
    Regards,
    Edited by: Walter Fernández on Jul 2, 2009 12:46 PM - Adding other url...
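    For illustration, a minimal secure application role sketch along the lines of the first link above (role, package, schema, and module names are all assumptions):
    create role conf_app_role identified using sec_owner.conf_app_role_check;
    create or replace package sec_owner.conf_app_role_check authid current_user as
      procedure enable_role;
    end conf_app_role_check;
    /
    create or replace package body sec_owner.conf_app_role_check as
      procedure enable_role is
      begin
        -- enable the role only for sessions coming from the client application
        if sys_context('userenv', 'module') = 'CONF_CLIENT_FORM' then
          dbms_session.set_role('conf_app_role');
        end if;
      end enable_role;
    end conf_app_role_check;
    /
    The role's privileges then become available only to sessions that pass the package's check.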

  • Import tables and indexes into separate tablespaces

    dear gurus,
    I created two tablespaces, testtbs and testtbs_idx. When I try to import the dump file, the complete file goes into one tablespace.
    Can you suggest, for a beginner, what best practice I should follow so that the tables and indexes go into separate tablespaces?
    There is a user test/test@orcl, and the file is test.dmp.
    I am not using a parameter file; it is a single command-line command. I am using RHEL 5.3 with 10.2.0.4 (10gR2).
    To confirm another thing: at user creation time I already gave user test a quota on tablespace test, but I have not given any quota on the index tablespace.
    Look at the statements below and confirm whether I have missed or got anything wrong; advise accordingly.
    CREATE USER test IDENTIFIED BY test DEFAULT TABLESPACE PREMIA QUOTA UNLIMITED ON test TEMPORARY TABLESPACE TEMP;
    GRANT CONNECT,RESOURCE,IMP_FULL_DATABASE,EXP_FULL_DATABASE TO test;
    REVOKE UNLIMITED TABLESPACE FROM test;
    ALTER USER test QUOTA UNLIMITED ON test_IDX;
    imp test/test@orcl file='D:/bkup_dump/test.dmp' log='D:/bkup_dump/test.log' ignore=y FULL=Y;
    regards
    salim

    hello,
    CREATE USER test IDENTIFIED BY test DEFAULT TABLESPACE PREMIA QUOTA UNLIMITED ON test TEMPORARY TABLESPACE TEMP;
    Is it really the case that you are creating user test with default tablespace PREMIA and a quota on test? You should give the user a quota on its default tablespace, so try
    CREATE USER test IDENTIFIED BY test DEFAULT TABLESPACE test QUOTA UNLIMITED ON test TEMPORARY TABLESPACE TEMP;
    before executing this:
    ALTER USER test QUOTA UNLIMITED ON test_IDX;
    And have you executed this?
    ALTER USER TEST DEFAULT TABLESPACE TEST;
    You should do that.
    Then import the complete dump into the user's default tablespace and rebuild all the indexes into the index tablespace; that will be the easiest way (experts please correct me):
    alter index <index_name> rebuild online tablespace <new_tablespace>;
    The import command looks fine:
    imp test/test@orcl file='D:/bkup_dump/test.dmp' log='D:/bkup_dump/test.log' ignore=y FULL=Y;
    You might like to include the fromuser and touser clauses if the exported user is different. And why are you using full=y? Are you importing a full database?
    HTH
    thanks and regards
    VD

  • Relation between temp tablespace and index creation

    Hi,
    I have my Oracle database (11gR1) on Windows 2008 Server R1 64-bit.
    This is my development database. I have one table which has more than 2 billion rows. The problem I am facing is that while creating an index on this table I get a temp segment error, while my temp tablespace size is 32 GB.
    My doubts are:
    1. What happens in the temp tablespace when an index is created? What is the relation between temp and index creation?
    2. How do I create an index on a huge table?
    3. What is the meaning of LOGGING and NOLOGGING in index creation?
    4. How can we overcome this kind of problem and manage the temp tablespace?
    Thanks & Regards,
    Vikash Chauradia

    Add another tempfile?
    1. What happens in the temp tablespace when an index is created? What is the relation between temp and index creation?
    Index creation needs a sort; how much temp space it needs depends on the size of the index.
    2. How do I create an index on a huge table?
    Create an interim (temporary? :)) huge temporary tablespace for the very purpose.
    http://docs.oracle.com/cd/B28359_01/server.111/b28310/indexes003.htm#i1006643
    3. What is the meaning of LOGGING and NOLOGGING in index creation?
    NOLOGGING means the index creation is not written to the redo logs, so if you need to recover, you cannot get the index back and must recreate it. In a production environment you might use NOLOGGING for performance during heavy DML such as a large index creation, and then take a backup afterwards. Common enough.
    4. How can we overcome this kind of problem and manage the temp tablespace?
    Current temp space size = X.
    Is X big enough? If yes, cup of tea; if no, make X bigger.
    It doesn't matter what X is.
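    A minimal sketch of that approach (file path, sizes, and object names are assumptions):
    create temporary tablespace temp_big
      tempfile 'D:\oradata\temp_big01.dbf' size 32g autoextend on next 1g maxsize unlimited;
    alter user vikash temporary tablespace temp_big;
    create index big_tab_ix on big_tab (col1) nologging parallel 8;
    alter index big_tab_ix noparallel;
    alter index big_tab_ix logging;
    -- once the index exists and is backed up, switch back and drop the interim temp tablespace
    alter user vikash temporary tablespace temp;
    drop tablespace temp_big including contents and datafiles;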

  • Explain locally managed tablespaces and dictionary managed tablespaces

    hi all,
    Kindly help me understand locally managed and dictionary managed tablespaces.
    I have read the search results and the Oracle documentation but am still unable to understand.
    What I know is that a locally managed tablespace is managed by a bitmap (no redo) and a dictionary managed tablespace is managed by the data dictionary (which generates redo). Please explain this as well,
    and also suggest some documentation.
    thanks
    Navin

    Navin,
    These are excerpts from Oracle documentation
    Dictionary Managed Tablespaces
    If you created your database with an earlier version of Oracle, then you could be using dictionary managed tablespaces. For a tablespace that uses the data dictionary to manage its extents, Oracle updates the appropriate tables in the data dictionary whenever an extent is allocated or freed for reuse. Oracle also stores rollback information about each update of the dictionary tables. Because dictionary tables and rollback segments are part of the database, the space that they occupy is subject to the same space management operations as all other data.
    Locally Managed Tablespaces
    A tablespace that manages its own extents maintains a bitmap in each datafile to keep track of the free or used status of blocks in that datafile. Each bit in the bitmap corresponds to a block or a group of blocks. When an extent is allocated or freed for reuse, Oracle changes the bitmap values to show the new status of the blocks. These changes do not generate rollback information because they do not update tables in the data dictionary (except for special cases such as tablespace quota information).
    Locally managed tablespaces have the following advantages over dictionary managed tablespaces:
    Local management of extents automatically tracks adjacent free space, eliminating the need to coalesce free extents.
    Local management of extents avoids recursive space management operations. Such recursive operations can occur in dictionary managed tablespaces if consuming or releasing space in an extent results in another operation that consumes or releases space in a data dictionary table or rollback segment.
    The sizes of extents that are managed locally can be determined automatically by the system. Alternatively, all extents can have the same size in a locally managed tablespace and override object storage options.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/physical.htm#sthref518
    Regards
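    For illustration, the management mode is chosen when the tablespace is created; a minimal sketch (file names and sizes are assumptions, and a dictionary managed tablespace can only be created while SYSTEM itself is dictionary managed):
    create tablespace data_lmt
      datafile '/u01/oradata/data_lmt01.dbf' size 100m
      extent management local autoallocate
      segment space management auto;
    create tablespace data_dmt
      datafile '/u01/oradata/data_dmt01.dbf' size 100m
      extent management dictionary
      default storage (initial 1m next 1m pctincrease 0);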
