"Out of memory" while importing AVI  type (AJA) file with AE CS3

With After Effects CS3 (version 8.0.2.27), when we import an AVI (AJA) file larger than 2 GB captured on another PC with an AJA Xena board, a message window appears: "After Effects: Out of memory. (945949K requested) (23 :: 40)". If we import the same file with After Effects CS2 (version 7.0.1), there is no problem. Importing an AVI (AJA) file smaller than 2 GB into AE CS3 is also fine, as is importing an AVI (Matrox) file larger than 2 GB, or a MOV (AJA) file larger than 2 GB. So to bypass this problem we are working with footage captured as MOV (AJA) files. The PC running AE CS3 has 4 GB of RAM and two quad-core Xeon CPUs.
Thanks
Marc

Most likely an issue with MediaCore trying to take over and then not getting it right... Have you checked with AJA? Maybe they know something about it.
Mylenium

Similar Messages

  • Getting an 'Out of Memory' error while importing the work repository in ODI 10g

    I exported the work repository from the topology of one ODI 10g installation and tried importing it into the topology of another ODI 10g installation on a different database. While importing, I got an 'Out of Memory' error.
    Can somebody suggest how to solve the heap-size out-of-memory issue while importing in ODI 10g?
    Thanks in advance.

    Hi,
    You should post your question in the ODI forum:
    Data Integrator
    Suresh
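    (Side note for anyone hitting the same heap error: the ODI 10g client tools read their JVM sizing from the odiparams script, so a common first step, sketched below with purely illustrative values, is to raise the heap limits there and retry the import. Verify the file name and variable names against your own installation.)
    REM odiparams.bat (Windows; use odiparams.sh on Unix) -- illustrative values only
    set ODI_INIT_HEAP=256m
    set ODI_MAX_HEAP=1024m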

  • iMac 500GB HDD occupied by 499.25 GB "other" files while all other types of files show "0 KB", what gives?

    I've tried cleaning up the HDD with the disk utility twice and pressed Command-Option-P-R while restarting the Mac afterwards. The available HDD space "improved" from 1.3 MB to 121 MB after the first restart and decreased to 119 MB after the second restart. I have kept very few AV files as well as document files, and my applications are just several GB. Is this a system problem? Do I have to reinstall the system?
    Much appreciate any advice!!
    BTW Apple customer service is on holiday these two days where I'm staying...

    You may find this discussion of interest and value:
    https://discussions.apple.com/message/18595469#18595469
    (from the "More Like This" list on the right).

  • While downloading any type of file, 'unknown time remaining' is shown instead of the file size?

    While downloading any type of file, 'unknown time remaining' is shown instead of the file size. Please suggest a solution to this problem.

    I followed all the above suggestions to no avail!
    When I start a download and then click Pause, it will not cancel and it will not resume; it just sits there stating "unknown time remaining", and I can't get rid of it!
    It would seem that something changed with the latest Firefox beta update that is causing this anomaly. Please fix this ASAP, as I find it very annoying.
    With any download in limbo, the Firefox taskbar icon displays a partial download progress that never changes, so I cannot tell what other downloads are doing. It also warns me that a number of downloads are still pending when I go to close Firefox, but as I stated, these downloads are doing nothing and can't be removed!
    HELP! - BJ

  • Out of memory error importing a JPA Entity in WebDynpro Project

    Hi All!
    We are having problems importing JPA entities into a Web Dynpro project. This is our scenario:
    We have two entities: entity A has a ManyToOne relationship with entity B, and entity B in turn has a OneToMany relationship with entity A. When we try to create a node in the controller context using a model binding to entity A, we get an Out of Memory error from NetWeaver Developer Studio. Trying to figure out the problem, we noticed that the imported model contains the following:
    Entity A
        Entity B
            Entity A
               Entity B
                   and so on, without end
    What are we doing wrong? Or how can we avoid this behavior?
    Regards,

    Hi Kaylan:
    Thanks for your reply. You are right about the error we are getting. This is our scenario:
    We have an EJB that uses the entity Lote in some of its methods, and this entity has a relationship with the entity Categoria. When we import the EJB using the model importer, NetWeaver imports the EJB's methods and the entities they use.
    So after doing this, besides the EJB's methods, we get these two entities, and they are the ones generating the error when we try to create a context node using the Categoria entity.
    import java.io.Serializable;
    import java.util.Set;
    import javax.persistence.*;

    @Entity
    @Table(name="TB_LOTE")
    public class Lote implements Serializable {
         @EmbeddedId
         private Lote.PK pk;

         @ManyToOne
         @JoinColumn(name="CO_CATEGORIALOTE")
         private Categoria coCategorialote;

         @Embeddable
         public static class PK implements Serializable {
              @Column(name="CO_LOTE")
              private String coLote;
              @Column(name="CO_ORDENFABRICACION")
              private String coOrdenfabricacion2;
              // equals()/hashCode() over coLote and coOrdenfabricacion2 omitted here
         }
    }

    @Entity
    @Table(name="TB_CATEGORIA")
    public class Categoria implements Serializable {
         @Id
         @Column(name="CO_CATEGORIA")
         private String coCategoria;

         @OneToMany(mappedBy="coCategorialote")
         private Set<Lote> tbLoteCollection;
    }
    Regards,
    Jose Arango
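    (For what it's worth, the cycle comes from the bidirectional pair: the @ManyToOne on Lote.coCategorialote and the @OneToMany(mappedBy="coCategorialote") on Categoria. One generic JPA workaround, sketched below under the assumption that the model only needs to navigate from Lote to Categoria, is to drop the inverse collection so the object graph becomes acyclic; whether that is acceptable for Web Dynpro model import would need to be verified.)
    // hypothetical variant of Categoria without the inverse side: navigating
    // Lote -> Categoria still works via the @ManyToOne above, but an importer
    // can no longer loop Categoria -> Lote -> Categoria.
    @Entity
    @Table(name="TB_CATEGORIA")
    public class Categoria implements Serializable {
         @Id
         @Column(name="CO_CATEGORIA")
         private String coCategoria;
         // the @OneToMany(mappedBy="coCategorialote") collection is intentionally removed
    }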

  • ORA-27201 Out of Memory while installing 8.1.6

    I just received a new Solaris 2.7 machine. I am trying to install Oracle Enterprise 8.1.6 (and also tried 8.1.7). I try just the simplest install, with the basic database. The database creation fails with 'Oracle not available'. If I ignore the problem and continue the install, I then can't start Oracle afterwards: in svrmgrl I run STARTUP and get error 27201 - Out of Memory. A check of swap and memory shows plenty available.

    Check your total SGA size, your total physical memory, and the configured swap space, then check the kernel parameters for shared memory: shmmax, shmseg, shmmni (all are important). By default the maximum SGA size available is 1.7 GB; you can alter that.
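    (A rough sketch of what to check: the query is standard, and the /etc/system entries below are the usual Solaris shared-memory parameters, with purely illustrative values that must be sized for your machine and take effect only after a reboot.)
    SQL> SELECT * FROM v$sga;    -- current SGA component sizes, in bytes

    * /etc/system (Solaris) -- illustrative values only
    set shmsys:shminfo_shmmax=4294967295
    set shmsys:shminfo_shmseg=10
    set shmsys:shminfo_shmmni=100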

  • Out of Memory while running report

    Hi All,
    DB Version :- 10.1.0.5.0
    Our application is used to generate reports, and one report has a limitation on the date range. This was due to an 'out of memory' error that occurred before, and the report used to take a huge amount of time to complete. We have since tuned the SQL and it now completes in minutes (performance improved by 80%). :-)
    Now we want to remove the 10-year limit and run over the past 30 years of data. When we remove the limitation, it throws the error message below:
    "when others Error ORA-04030: out of process memory when trying to allocate 1872664 bytes (PLS non-lib hp,DARWIN) with query in ....package"
    While the report was running I asked the DBA to check the usage, and he mentioned it was at its peak: this one report alone was using 42 GB of RAM.
    Can anyone suggest how to tackle the issue, or where to start investigating its cause?
    When the report runs, some dynamic SQL statements are generated, and then the report completes.
    Regards,
    Sunny

    Hi All,
    DB Version :- 10.1.0.5.0
    I captured the dynamic SQL statements and found that this piece of SQL is taking a huge amount of time: around 15 minutes to complete.
    Below is the SQL:
    SELECT X.TIME_PERIOD EXPOSURE_PERIOD,
           Y.TIME_PERIOD EVALUATION_PERIOD,
           B.BUSINESS_UNIT,
           DECODE(GROUPING(LOB_VALUE), 1, 'SEL', LOB_VALUE) BUSINESS_UNIT_LOB_ID_ACT,
           0 CALC_VALUE
    FROM   ACTUARIAL_REF_DATA.TIME_PERIOD_HIERARCHY X,
           ACTUARIAL_REF_DATA.TIME_PERIOD_HIERARCHY Y,
           ANALYSIS_BUSINESS_UNITS B,
           ANALYSIS_LOBS L
    WHERE  B.ANALYSIS_ID = L.ANALYSIS_ID
    AND    X.TIME_PERIOD BETWEEN TO_NUMBER('198001') AND TO_NUMBER('201006')
    AND    Y.TIME_PERIOD BETWEEN TO_NUMBER('198001') AND TO_NUMBER('201006')
    AND    B.BUSINESS_UNIT = '31003'
    AND    LOB_VALUE IN (SELECT TO_NUMBER(LOB_VALUE) FROM ANALYSIS_LOBS WHERE ANALYSIS_ID = TO_NUMBER('3979'))
    GROUP BY X.TIME_PERIOD, Y.TIME_PERIOD, BUSINESS_UNIT, CUBE(LOB_VALUE)
    PLAN_TABLE_OUTPUT
    Plan hash value: 929111431
    | Id  | Operation                    | Name                       | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT             |                            |    26M|   996M|       |   203K  (3)| 00:47:35 |
    |   1 |  SORT GROUP BY               |                            |    26M|   996M|       |   203K  (3)| 00:47:35 |
    |   2 |   GENERATE CUBE              |                            |    26M|   996M|       |   203K  (3)| 00:47:35 |
    |   3 |    SORT GROUP BY             |                            |    26M|   996M|  2868M|   203K  (3)| 00:47:35 |
    |*  4 |     HASH JOIN                |                            |    35M|  1324M|       |  6694  (17)| 00:01:34 |
    |*  5 |      INDEX RANGE SCAN        | ANALYSIS_LOBS_PK           |    48 |   480 |       |     2   (0)| 00:00:01 |
    |*  6 |      HASH JOIN               |                            |   148M|  4097M|       |  5619   (1)| 00:01:19 |
    |   7 |       TABLE ACCESS FULL      | ANALYSIS_LOBS              | 24264 |   236K|       |    12   (0)| 00:00:01 |
    |   8 |       MERGE JOIN CARTESIAN   |                            |  3068K|    55M|       |  5584   (1)| 00:01:19 |
    |   9 |        MERGE JOIN CARTESIAN  |                            |  8401 |   114K|       |    20   (0)| 00:00:01 |
    |* 10 |         INDEX FAST FULL SCAN | ANALYSIS_BUSINESS_UNITS_PK |    23 |   207 |       |     3   (0)| 00:00:01 |
    |  11 |         BUFFER SORT          |                            |   365 |  1825 |       |    17   (0)| 00:00:01 |
    |* 12 |          INDEX FAST FULL SCAN| TIME_PERIOD_HIERARCHY_PK   |   365 |  1825 |       |     1   (0)| 00:00:01 |
    |  13 |        BUFFER SORT           |                            |   365 |  1825 |       |  5583   (1)| 00:01:19 |
    |* 14 |         INDEX FAST FULL SCAN | TIME_PERIOD_HIERARCHY_PK   |   365 |  1825 |       |     1   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       4 - access(TO_NUMBER("LOB_VALUE")=TO_NUMBER("LOB_VALUE"))
       5 - access("ANALYSIS_ID"=3979)
       6 - access("B"."ANALYSIS_ID"="L"."ANALYSIS_ID")
      10 - filter("B"."BUSINESS_UNIT"='31003')
      12 - filter("X"."TIME_PERIOD">=198001 AND "X"."TIME_PERIOD"<=201006)
      14 - filter("Y"."TIME_PERIOD">=198001 AND "Y"."TIME_PERIOD"<=201006)
    Number of rows returned: 58,404,816
    Table descriptions:
    TABLE TIME_PERIOD_HIERARCHY
      TIME_PERIOD        NUMBER(6),
      QUARTER_DESC       VARCHAR2(6 CHAR),
      QUARTER_START      NUMBER(6),
      QUARTER_END        NUMBER(6),
      SEMI_ANNUAL_DESC   VARCHAR2(12 CHAR),
      SEMI_ANNUAL_START  NUMBER(6),
      SEMI_ANNUAL_END    NUMBER(6),
      YEAR               NUMBER(4),
      YEAR_START         NUMBER(6),
      YEAR_END           NUMBER(6)
    TABLE ANALYSIS_LOBS
      ANALYSIS_ID  NUMBER(10),
      LOB_TYPE_ID  NUMBER(1),
      LOB_VALUE    VARCHAR2(7 CHAR)
    TABLE ANALYSIS_BUSINESS_UNITS
      ANALYSIS_ID    NUMBER(10),
      BUSINESS_UNIT  VARCHAR2(5 CHAR)
    Kindly let me know if there is any way we can improve the performance.
    Regards,
    Sunny
    Edited by: k_17 on Nov 22, 2011 2:47 PM
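    (A note for anyone tuning this: since the CUBE is over a single column, CUBE(LOB_VALUE) expands to exactly two grouping sets - the per-LOB rows and the 'SEL' total rows - so the GROUP BY above can be restated equivalently as sketched below. This is only an equivalence sketch against the query as quoted, not a tested fix; per the plan, most of the cost is the ~35M-row join input feeding the SORT GROUP BY.)
    GROUP BY GROUPING SETS (
        (X.TIME_PERIOD, Y.TIME_PERIOD, BUSINESS_UNIT, LOB_VALUE),   -- detail rows
        (X.TIME_PERIOD, Y.TIME_PERIOD, BUSINESS_UNIT)               -- 'SEL' total rows
    )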

  • "General failure: Out of memory" when importing clip

    Hi,
    I am seeing the strangest problem:
    - I add the final clip to my project and double-click it to open it in the Viewer. The response is a dialog box saying "General failure", followed by one saying "Out of memory". I checked all the memory settings; disk space is abundant. I restarted the project (and the Mac Pro), re-rendered the clip, which comes from Camtasia, and even exported it from QuickTime Pro to get a NEW file with, most likely, no errors in it. I even switched the audio track between little- and big-endian.
    But this particular clip just won't work.
    Every other sequence I use works just fine. They are all exported from Camtasia Studio, a Windows screen-capture application, as QuickTime files using the Animation codec.
    My last suspicion is that the frame rate is slightly too low, so I am forcing it up to see if that helps. Godspeed.
    If anybody has a clue, or has seen this before, please help.
    Cheers and best regards,
    Edvin

    OK, so I found a solution that works, but I'd still consider this a bug...
    The source clip was originally at a frame rate around 0.5 fps (it was a screen dump of a PowerPoint presentation, with voiceover). I forced the frame rate up above 1.1 by fiddling with the FPS and keyframe frequency.
    For the first two seconds after import the behavior was the same; the clip did not get any Duration, In, or Out values in the media browser. Then they appeared after a few seconds, and FCP seems to have accepted the media file. Weird, but at least I got past it, and I am just leaving this remark here in case others come across the same thing: force up the playback FPS - somehow it seems to work.
    Cheers;
    Edv!n

  • Getting ORA-27102: out of memory while creating DB using DBCA

    Hi All,
    I am working with Oracle version 11.2.0.3 on Linux. I am trying to create a new database using DBCA and getting the error "ORA-27102: out of memory".
    Please find the DB version and OS-level parameter info below, and let me know what I need to do to overcome this issue.
    SQL> select * from v$version;
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE 11.2.0.3.0 Production
    TNS for Linux: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    $uname -a
    Linux greenlantern1a 2.6.18-92.1.17.0.1.el5 #1 SMP Tue Nov 4 17:10:53 EST 2008 x86_64 x86_64 x86_64 GNU/Linux
    $cat /etc/sysctl.conf
    kernel.shmall = 2097152
    kernel.shmmax = 4294967295
    kernel.shmmni = 4096
    kernel.sem = 250 32000 100 128
    net.core.rmem_default = 4194304
    net.core.wmem_default = 262144
    net.core.rmem_max = 4194304
    net.core.wmem_max = 1048576
    fs.file-max = 6815744
    fs.aio-max-nr = 1048576
    net.ipv4.ip_local_port_range = 9000 65500
    $free -g
                            total   used   free   shared   buffers   cached
    Mem:                       94     44     49        0         0       31
    -/+ buffers/cache:                12     81
    Swap:                     140      6    133
    $ulimit -l
    32
    $ipcs -lm
    ------ Shared Memory Limits --------
    max number of segments = 4096
    max seg size (kbytes) = 4194303
    max total shared memory (kbytes) = 8388608
    min seg size (bytes) = 1
    Please let me know for any other details.
    Thanks in advance.

    OK, first, let's set aside the issue of hugepages for a moment. (Personally, IMHO, if you're doing manual memory management and you're not using hugepages, you're doing it wrong.)
    Anyhow, looking at your SHM parameters:
    kernel.shmall = 2097152
    kernel.shmmax = 4294967295
    kernel.shmmni = 4096
    Let's take those in reverse order:
    1.) shmmni - This is the max number of shared memory segments you can have on your system, regardless of the size of each segment.
    2.) shmmax - Contrary to popular belief, this is NOT the max amount of shared memory you can allocate system wide! This is the max size, in bytes of a single shared memory segment. You currently have it set to 4GB-1. This is probably fine. Even if you wanted an SGA larger than 4GB, having shmmax set to this wouldn't hurt you. Oracle would simply allocate multiple shared memory segments, until it had allocated enough memory for the SGA. There's really no harm there, unless this parameter is set really low, causing a huge number of tiny shared memory segments to be allocated.
    3.) shmall - This is the real shared memory limit. This number is the total amount of shared memory you're permitted to allocate, system wide, expressed in pages. Pagesize here is the native OS pagesize, which is 4096 bytes, so this is 2097152 * 4096 = 8589934592, or 8GB. So, 8GB is the maximum amount of memory that can currently be allocated to shared memory on your machine. (See the quick arithmetic sketch just below.)
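    To make the page-size arithmetic concrete, here is the same calculation as shell commands (the 48 GB target at the end is purely an example, not a recommendation for this box):
    $ getconf PAGE_SIZE                           # native OS pagesize; 4096 on x86_64
    4096
    $ echo $(( 2097152 * 4096 ))                  # current shmall, expressed in bytes
    8589934592
    $ echo $(( 48 * 1024 * 1024 * 1024 / 4096 ))  # pages needed to allow 48 GB
    12582912                                      # i.e. kernel.shmall = 12582912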
    So, having said all that, you haven't mentioned how many other Oracle databases, if any, are running on the server, or their sizes. Secondly, we have no idea what memory sizing parameters you have set on the database that you're trying to create - the one that's getting the error.
    So, if you can provide more details, in terms of how many other databases are already on this server and their SGA sizes, and the parameters you've chosen for the database that's failing to create, perhaps we can help more.
    Finally, if you're not using SGA_TARGET or MEMORY_TARGET, you really need to take the time to configure hugepages. Particularly if you've got a server with as much memory as you do, and you're planning a non-trivially sized SGA (tens of GB), you really want to configure hugepages.
    Hope that helps,
    -Mark
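    (Following up on the hugepages point with a minimal sketch: the page counts below are illustrative only and must be derived from your actual SGA sizes. On x86_64 Linux each hugepage is 2 MB, and the oracle user must also be allowed to lock that much memory.)
    # /etc/sysctl.conf -- illustrative: ~20 GB worth of 2 MB hugepages, plus headroom
    vm.nr_hugepages = 10342
    # /etc/security/limits.conf -- memlock is in KB (10342 * 2048 = 21180416)
    oracle soft memlock 21180416
    oracle hard memlock 21180416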

  • ORA-27102: out of memory (while creating a DR site)

    Hi all,
    I am trying to create a DR site at a remote location, but while using the pfile from the primary server I am getting the error ORA-27102: out of memory. We are using Oracle 9.2 on RHEL. One more detail: the primary server is running Oracle 9.2.0.8 while the DR site is on 9.2.0.6 (our patch got corrupted, which is why the DR site is on 9.2.0.6). Could the difference in patch levels be causing the problem? I don't think so, but please correct me if I am wrong.
    SQL> conn sys/pwd as sysdba
    Connected to an idle instance.
    SQL> startup nomount pfile='/u01/initicai.ora';
    ORA-27102: out of memory
    SQL>
    We have 8 GB of memory in total, of which we are using 6 GB for Oracle, i.e.:
    [oracle@icdb u01]$ cat /proc/meminfo
    MemTotal:      8175080 kB
    MemFree:         39912 kB
    Buffers:         33116 kB
    Cached:        7780188 kB
    SwapCached:         32 kB
    Active:          78716 kB
    Inactive:      7761396 kB
    HighTotal:           0 kB
    HighFree:            0 kB
    LowTotal:      8175080 kB
    LowFree:         39912 kB
    SwapTotal:    16779884 kB
    SwapFree:     16779660 kB
    Dirty:              28 kB
    Writeback:           0 kB
    Mapped:          48356 kB
    Slab:           265028 kB
    CommitLimit:  20867424 kB
    Committed_AS:    61372 kB
    PageTables:       2300 kB
    VmallocTotal: 536870911 kB
    VmallocUsed:    271252 kB
    VmallocChunk: 536599163 kB
    HugePages_Total:     0
    HugePages_Free:      0
    Hugepagesize:     2048 kB
    and
    [oracle@icdb u01]$ cat /etc/sysctl.conf
    # Kernel sysctl configuration file for Red Hat Linux
    # For binary values, 0 is disabled, 1 is enabled.  See sysctl(8) and
    # sysctl.conf(5) for more details.
    # Controls IP packet forwarding
    net.ipv4.ip_forward = 0
    # Controls source route verification
    net.ipv4.conf.default.rp_filter = 1
    # Do not accept source routing
    net.ipv4.conf.default.accept_source_route = 0
    # Controls the System Request debugging functionality of the kernel
    kernel.sysrq = 0
    # Controls whether core dumps will append the PID to the core filename.
    # Useful for debugging multi-threaded applications.
    kernel.core_uses_pid = 1
    kernel.shmall=2097152
    kernel.shmmax=6187593113
    kernel.shmmni = 4096
    kernel.sem = 250 32000 100 128
    fs.file-max =65536
    net.ipv4.ip_local_port_range = 1024 65000
    [oracle@icdb u01]$
    And the bash profile is:
    PATH=$PATH:$HOME/bin
    ORACLE_BASE=/u01/app/oracle
    ORACLE_HOME=$ORACLE_BASE/product/9.2.0
    ORACLE_SID=ic
    LD_LIBRARY_PATH=$ORACLE_HOME/lib:$ORACLE_HOME/lib32
    PATH=$PATH:$ORACLE_HOME/bin
    export  ORACLE_BASE ORACLE_HOME ORACLE_SID LD_LIBRARY_PATH PATH
    export PATH
    unset USERNAME
    Please suggest...

    init file
    [oracle@icdb u01]$ cat initicai.ora
    *.aq_tm_processes=1
    *.background_dump_dest='/u01/app/oracle/admin/ic/bdump'
    *.compatible='9.2.0.0.0'
    *.control_files='/bkp/data/ctl/control03.ctl'
    *.core_dump_dest='/u01/app/oracle/admin/ic/cdump'
    *.db_block_size=8192
    *.db_cache_size=4294967296
    *.db_domain=''
    *.db_file_multiblock_read_count=16
    *.db_name='icai'
    *.dispatchers='(PROTOCOL=TCP) (SERVICE=icaiXDB)'
    *.fast_start_mttr_target=300
    *.hash_join_enabled=TRUE
    *.instance_name='icai'
    *.java_pool_size=157286400
    *.job_queue_processes=20
    *.large_pool_size=104857600
    *.open_cursors=300
    *.pga_aggregate_target=938860800
    *.processes=1000
    *.query_rewrite_enabled='FALSE'
    *.remote_login_passwordfile='EXCLUSIVE'
    *.shared_pool_size=818103808
    *.sort_area_size=524288
    *.star_transformation_enabled='FALSE'
    *.timed_statistics=TRUE
    *.undo_management='AUTO'
    *.undo_retention=10800
    *.undo_tablespace='UNDOTBS1'
    *.user_dump_dest='/u01/app/oracle/admin/ic/udump'
    #log_archive_dest='/bkp/arch/ic_'
    Log_archive_start=True
    sga_max_size=6444442450944
    log_archive_dest_1='location=/bkp/arch/ mandatory'
    log_archive_dest_2='service=prim optional reopen=15'
    log_archive_dest_state_1=enable
    remote_archive_enable=true
    standby_archive_dest='/arch/ic/ic_'
    standby_file_management=auto
    [oracle@icdb u01]$
    Edited by: user00726 on Nov 11, 2009 10:27 PM
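    (A unit-conversion note on the values quoted above, in case it helps whoever picks this up; the numbers as written do not line up, though whether that is a transcription typo is for the poster to confirm.)
    # values copied verbatim from the post above:
    kernel.shmmax = 6187593113       # bytes, ~5.76 GB  (largest single shared segment)
    sga_max_size  = 6444442450944    # bytes, ~5.86 TB  (as written in the pfile)
    MemTotal      = 8175080 kB       #        ~7.8 GB   (physical RAM)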

  • iPad since v5: out of memory while running applications

    Since installing v5 on the iPad (original, 3G, 64 GB), applications that ran OK now get an out-of-memory error: the screen goes dark, then returns to one of the icon screens.


  • Error while importing data into Oracle 11g R2 with ArcSDE 9.3.1

    I am getting an error while importing the data into Oracle 11g R2. We are using ArcSDE 9.3.1.
    It seems to be a problem with spatial index creation.
    Kindly help.
    IMP-00017: following statement failed with ORACLE error 29855:
    "CREATE INDEX "A3032_IX1" ON "DGN_POLYLINE_2D" ("SHAPE" ) INDEXTYPE IS "MDS"
    "YS"."SPATIAL_INDEX""
    IMP-00003: ORACLE error 29855 encountered
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfergm]
    ORA-13200: internal error [ROWID:AAAT5pAA9AACIy5AAQ] in spatial indexing.
    ORA-13206: internal error [] while creating the spatial index
    ORA-13033: Invalid data in the SDO_ELEM_INFO_ARRAY in SDO_GEOMETRY object
    ORA-06512: at "MDSYS

    Guys,
    I am also getting the same error, and in my case I cannot even work out which indexes the error relates to, since there is no index name before the error.
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/DOMAIN_INDEX/INDEX
    ORA-39083: Object type INDEX failed to create with error:
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfer]
    ORA-29400: data cartridge error
    ORA-12801: error signaled in parallel query server P000
    ORA-13249: Error in spatial index: [mdpridxtxfergm]
    ORA-13200: internal error [ROWID:AA
    ORA-39083: Object type INDEX failed to create with error:
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfer]
    ORA-29400: data cartridge error
    ORA-12801: error signaled in parallel query server P002
    ORA-13249: Error in spatial index: [mdpridxtxfergm]
    ORA-13200: internal error [ROWID:AA
    ORA-39083: Object type INDEX failed to create with error:
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfer]
    ORA-29400: data cartridge error
    stack cnt....
    How can I find for which indexes it is failing?
    Thank you,
    Myra

  • Error while importing data from an XML file to an Oracle database

    I am getting the following error while importing data
    *** atg.xml.XMLFileException: No XML files were found for "/Dynamusic/config/dynamusic/songs-data.xml". 
    The files must be located  under the name "/Dynamusic/config/dyna
    *** java.io.FileNotFoundException: D:\MyApp\ATG\ATG10.0.3 (Access is denied)
    java.io.FileNotFoundException: D:\MyApp\ATG\ATG10.0.3 (Access is denied)
            at java.io.FileInputStream.open(Native Method)
            at java.io.FileInputStream.<init>(FileInputStream.java:120)
            at atg.adapter.gsa.xml.TemplateParser.importFiles(TemplateParser.java:6675)
            at atg.adapter.gsa.xml.TemplateParser.runParser(TemplateParser.java:5795)
            at atg.adapter.gsa.xml.TemplateParser.main(TemplateParser.java:5241)
    I have placed the FakeXADataSource and JTDataSource properties files as required. In fact, I have been able to import data on all other machines; just this one machine is giving problems. I have also checked the access rights. Can someone help me, please?

    Confirm that you have access to D:\MyApp\ATG\ATG10.0.3 and check the existence of /Dynamusic/config/dynamusic/songs-data.xm

  • Error while importing a container dump file

    Hi,
    I am trying to import a container from one Oracle server to another server located at a different site. I exported the container from the first server, FTPed it to the second server, and tried to import it into the second server. While doing this, I got the following messages:
    Connected to: Oracle8i Enterprise Edition Release 8.1.7.0.0 - Production
    With the Partitioning option
    JServer Release 8.1.7.0.0 - Production
    Export file created by EXPORT:V08.01.07 via conventional path
    import done in WE8ISO8859P1 character set and WE8ISO8859P1 NCHAR character set
    . importing REPOS_MANAGER's objects into REPOS_MANAGER
    . . importing table "XTSYS_EXPORT_OBJECTS" 1 rows imported
    . . importing table "XTSYS_IMPORT_IRID_MAPPING"
    IMP-00009: abnormal end of export file
    IMP-00018: partial import of previous table completed: 1281 rows imported
    IMP-00033: Warning: Table "XTSYS_IMPORT_IVID_MAPPING" not found in export file
    IMP-00033: Warning: Table "XTSYS_TABS_EXPORTED" not found in export file
    IMP-00033: Warning: Table "XTSYS_RM$REPOSITORIES" not found in export file
    Import terminated successfully with warnings.
    I am able to import the same dump file into another server located at the same site as Server One without any issues.
    One thing I have observed: on Server 2, if a container is already present and the XTSYS_ tables are already populated, then when I tried to import this dump file it asked me "Temporary tables are already populated. Do you want to overwrite them? Otherwise the existing temporary tables will be used for this purpose." If I clicked No, it imported the container without any problems, BUT the new container's contents are not the same as the container contents on Server 1.
    Can somebody suggest what the problem could be and how I can import that dump file?
    Thanks a lot in advance.
    Regards,
    Vijay

    Vijay,
    There is a note <Note:160378.1> on the Oracle Support site: http://metalink.oracle.com.
    There are a number of possibilities; the most likely is that you are using the wrong import/export utilities for an older or newer database version.
    You don't mention which release of Designer you are using. Designer 9i uses the 9.0.1 export/import utilities by default, and you cannot use the 9.0.1 export/import utilities against an 8.1.7 database.
    If not already installed, install the 8.1.7 export/import utilities from the Oracle 8i Release 3 (8.1.7) Client CD into a new Oracle Home.
    In the Registry, change [hklm]\software\oracle\HOMEx\repos61\EXECUTE_EXPORT
    and EXECUTE_IMPORT to point to the 8.1.7 export/import utilities.
    (HOMEx corresponds to your Designer 9i Oracle Home)
    A typical value for them would be:
    EXECUTE_EXPORT=d:\des_817\bin\exp.exe
    EXECUTE_IMPORT=d:\des_817\bin\imp.exe
    or
    EXECUTE_EXPORT=d:\V817\bin\exp.exe
    EXECUTE_IMPORT=d:\V817\bin\imp.exe
    After changing the parameters in the Registry, verify that they are active by going into the Repository Administration Utility, choosing Check Requirements,
    Parameter Settings, and looking for EXECUTE_EXPORT and EXECUTE_IMPORT.
    Regards
    Sue

  • Error while importing data from a shape file in Map Builder

    Hi all,
    I get the following error while importing a shape file in Map Builder:
    Feb 03, 2015 2:01:52 PM oracle.mapviewer.builder.wizard.shapefile.ImportShapefileThread importFile
    WARNING: java.sql.BatchUpdateException: Internal Error: Overflow Exception trying to bind NaN
    Record #51 not converted.
    Feb 03, 2015 2:01:52 PM oracle.mapviewer.builder.wizard.shapefile.ImportShapefileThread importFile
    SEVERE: Exception while importing shapefile [D:\GIS_Data\OpenStreetMap\pakistan-latest\waterways]:
    null
    Any clue or solution, please?
    Regards.

    I would take a look at the attribute data associated with each object in your shape file. Make sure that everything has a value, and that nothing is NULL.
    ID   VAR1
    1    abc
         efg   <= this record will cause Mapbuilder to error when importing
    2          <= this record will cause Mapbuilder to error when importing
    3    hij
