Large Data file problem in Oracle 8.1.7 and RedHat 6.2EE

I've installed RedHat 6.2EE (Enterprise
Edition, optimized for Oracle8i) and Oracle
EE 8.1.7. I can create very large files
(> 2GB) using standard commands such as
'cat' and 'dd'. However, when I create a
large data file in Oracle, I get the
following error messages:
create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
extent management local autoallocate;
create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
ERROR at line 1:
ORA-19502: write error on file "/data/u1/db1/data1.dbf", blockno 231425
(blocksize=8192)
ORA-27069: skgfdisp: attempt to do I/O beyond the range of the file
Additional information: 231425
Additional information: 64
Additional information: 231425
Does anyone know what's wrong?
Thanks
david

I've finally solved it!
I downloaded the following jre from blackdown:
jre118_v3-glibc-2.1.3-DYNMOTIF.tar.bz2
It's the only one that seems to work (and god, have I tried them all!)
I've no idea what the DYNMOTIF means (apart from being something to do with Motif - but you don't have to be a Linux guru to work that out ;)) - but, hell, it works.
And after sitting in front of this machine for three days trying to deal with Oracle's frankly pathetic install, which is full of holes and bugs, that's all I care about.
The JRE bundled with Oracle 8.1.7 doesn't work with RedHat Linux 6.2EE.
Doesn't Oracle test their software?
Anyway I'm happy now, and I'm leaving this in case anybody else has the same problem.
Thanks for everyone's help.
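Coming back to the original ORA-27069 question: a quick way to separate an OS-level large-file problem from an Oracle one is to reproduce the >2GB write outside the database, much like the 'cat'/'dd' test above. The sketch below (Python, purely for illustration) creates a sparse file with one byte written just past the 2 GiB boundary; if this fails, the filesystem or C library, not Oracle, is the limit:

```python
import os
import tempfile

TWO_GIB = 2 * 1024 ** 3  # 2,147,483,648 bytes

def supports_large_files(directory: str) -> bool:
    """Try to create a sparse file one byte longer than 2 GiB."""
    fd, path = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.seek(TWO_GIB)   # jump past the 2 GiB boundary
            f.write(b"\0")    # write a single byte there
        return os.path.getsize(path) == TWO_GIB + 1
    except (OSError, OverflowError):
        return False
    finally:
        os.remove(path)

print(supports_large_files(tempfile.gettempdir()))
```

On systems of that era without large-file support the seek or write raises an error and the function returns False; the sparse file costs almost no real disk space either way.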

Similar Messages

  • Oracle 8.1.7 and Redhat 6.2EE SIGSEGV when running netasst

    I've managed to install this ok - at least the core server is fine.
    Unfortunately, when I attempt to run almost any of the graphical Java admin utilities (e.g. oemapp, dbasst, netasst, owm), they fail with something like:
    SIGSEGV received at befffac0 in /usr/X11R6/lib/libXt.so.6. Processing terminated
    Writing stack trace to javacore1268.txt ... OK
    The funny thing is that dbasst actually runs about 1% of the times I try it; the rest of the time it never does.
    I can run other applications that use X11 fine.
    I am using the JRE that came with the oracle install (IBM1.1.8)
    I have also tried other JREs.
    I have also tried unsetting LANG (as mentioned in other posts).
    Please help - I'm really stuck.


  • Oracle 11g unreasonably large data file that can't be shrunk

    I have Oracle 11g installed on my Windows 7 machine just for running a local application. In Enterprise Manager, the USERS tablespace shows over 12 GB allocated but only about 1 GB used (see the snippet below), i.e. only about 8% of the space is used. However, when I try to shrink the data file, even to half of its current size, both from the command line (RESIZE) and from Enterprise Manager, I get the error "Failed to commit: ORA-03297: file contains used data beyond requested RESIZE value". I was able to resize my TEMP tablespace successfully with the same command.
    Any insight on this? thanks a lot. I'm about to run out my hard drive space.
    Name: USERS
    Allocated Size (MB): 12,288.0
    Space Used (MB): 1,026.7
    Allocated Space Used (%): 8.4
    Auto Extend: YES
    Allocated Free Space (MB): 11,261.3
    Datafiles: 1
    Type: PERMANENT
    Extent Management: LOCAL
    Segment Management: AUTO

    Jonathan Lewis wrote:
    user1340127 wrote:
    However, when I tried to shrink the data file/space even to half of its current size, both from command line(resize) and Enterprise Manager, i got the error that says "Failed to commit: ORA-03297: file contains used data beyond requested RESIZE value ". I was able to resize my TEMP tablespace successfully with the same command.
    Any insight on this? thanks a lot. I'm about to run out my hard drive space.
    You have an object stored at the end of the datafile; you will need to move it down the file (or to another tablespace) before you can shrink the file. I thought OEM had a "tablespace map" feature to help you see which objects were located where, but if not, see:
    In EM (or the 10g dbconsole, at any rate), you select a tablespace, then use the drop-down menu Show Tablespace Contents; there is a tablespace map icon to expand the map. It can be slow. Then you can scroll down to the bottom, scroll back up, and look for where the last non-green extents are. You can also find the beginning of each datafile (if the tablespace has multiple data files) by watching for the purple header blocks. Hovering over the segments gives information, and clicking on them (or selecting segments in the contents list above) shows all their extents in yellow. I haven't tried the reorganize link...
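The constraint described here can be put in numbers: a datafile can only be resized down to the end of its highest-allocated extent, which a view like DBA_EXTENTS reports as a starting block plus a block count. A rough sketch of that calculation, in Python purely for illustration, with a made-up extent list in which one small segment sits near the end of the file:

```python
def min_resize_bytes(extents, block_size=8192):
    """Smallest RESIZE target (in bytes) that avoids ORA-03297:
    the end of the highest-allocated extent in the file.
    `extents` is a list of (starting_block, block_count) pairs,
    roughly what DBA_EXTENTS reports per datafile."""
    if not extents:
        return 0
    last_block = max(start + count - 1 for start, count in extents)
    return last_block * block_size

# Made-up extent list: one tiny segment parked near the end of the file
extents = [(9, 128), (137, 128), (1_500_000, 8)]
print(min_resize_bytes(extents))  # 12288057344 (~11.4 GiB)
```

This is why a file that is 92% empty can still refuse to shrink: one 8-block segment near block 1.5 million pins the minimum size at roughly 11.4 GiB until it is moved.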

  • Reading large data files from the client machine (urgent)

    Hi,
    I want to know the best way to read a large data file (about 75 MB) on the client machine and insert its contents into the database.
    Can anybody provide sample code for this?
    The file should be loaded on the client machine, and the inserts into the database should happen on the server side.
    How should i load the file?
    How should i transfer this file or data to server ?
    How should i insert into the database ?
    Thanks in advance.
    regards
    Kalyan

    Like I said before, you should be using your application server to serve files from the server off the filesystem. The database should not store files this big and should instead just have a reference to the file.
    I think you have not understood the problem correctly.
    Let me make it clear. The requirement is as follows:
    This is a J2EE-based application.
    The application server is Oracle Application Server.
    The database is Oracle9i.
    The client is a thick client (a Swing-based application).
    The user enters a data source like c:\turkey.data. This turkey.data file contains data such as:
    1@1@20050131@1@4306286000113@D00@32000002005511069941@@P@10@0@1@0@0@0@DK@70059420@4330654016574@1@51881100@51881100@@99@D@40235@0@0@1@430441800000000@@11@D@42389@20050201@28483@15@@@[email protected]@@20050208@20050307@0@@@@@@@@@0@@0@0@0@430443400000800@0@0@@0@@@29@0@@@EUR
    Likewise, the file may contain more than 300,000 (3 lakh) rows.
    We need to read this file and transfer its contents to the application server (EJBs). There, each row in the file becomes one row in a database table, so we end up inserting around 300,000 records.
    Using JDBC to insert the data is not a problem; the only problem is how to transfer the data to the server.
    One way I can do it (just as an example): read all the data into a StringBuffer and pass it to the server, then extract the data from the StringBuffer there and insert it into the database using JDBC.
    Done that way, it becomes a performance problem, takes a long time to insert into the database, and may even throw an out-of-memory exception.
    I am just looking for a better way of doing this that gives good performance.
    I hope you have understood the problem.
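Whatever the transport (EJB call, servlet upload, etc.), the fix for the StringBuffer approach is the same: stream the file and process it in bounded batches instead of materializing all 300,000 rows at once. A minimal sketch of the batching loop, in Python only for illustration, splitting on the '@' separator from the sample row (the insert call is a hypothetical stand-in for the real server-side JDBC batch insert):

```python
def stream_batches(lines, batch_size=1000):
    """Yield lists of parsed rows without holding the whole file in memory.
    Each line is split on '@', matching the sample record format above."""
    batch = []
    for line in lines:
        batch.append(line.rstrip("\n").split("@"))
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:          # flush the final, partial batch
        yield batch

# Usage sketch - a file object iterates lazily, one line at a time:
# with open(r"c:\turkey.data") as f:
#     for batch in stream_batches(f, batch_size=5000):
#         insert_rows(batch)   # hypothetical server-side batch insert
```

Memory use is now bounded by the batch size rather than the file size, and each batch maps naturally onto a JDBC addBatch/executeBatch cycle on the server.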

  • SQL Loader and foreign characters in the data file problem

    Hello,
    I have run into an issue which I can't find an answer for. When I run SQL Loader, one of my control files is used to get file content (LOBFILE) and one of the fields in the data file has a path to that file. The control file looks like:
    LOAD DATA
    INFILE 'PLACE_HOLDER.dat'
    INTO TABLE iceberg.rpt_document_core APPEND
    FIELDS TERMINATED BY ','
    (
    doc_core_id "iceberg.seq_rpt_document_core.nextval",
    -- created_date POSITION(1) date "yyyy-mm-dd:hh24:mi:ss",
    created_date date "yyyy-mm-dd:hh24:mi:ss",
    document_size,
    hash,
    body_format,
    is_generic_doc,
    is_legacy_doc,
    external_filename FILLER char(275) ENCLOSED by '"',
    body LOBFILE(external_filename) terminated by EOF
    )
    A sample data file looks like:
    0,2012-10-22:10:09:35,21,BB51344DD2127002118E286A197ECD4A,text,N,N,"E:\tmp\misc_files\index_testers\foreign\شیمیایی.txt"
    0,2012-10-22:10:09:35,17,CF85BE76B1E20704180534E19D363CF8,text,N,N,"E:\tmp\misc_files\index_testers\foreign\ลอบวางระเบิด.txt"
    0,2012-10-22:10:09:35,23552,47DB382558D69F170227AA18179FD0F0,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\leesburgis_á_ñ_é_í_ó_ú_¿_¡_ü_99.doc"
    0,2012-10-22:10:09:35,17,83FCA0377445B60CE422DE8994900A79,binary,N,N,"E:\tmp\misc_files\index_testers\foreign\làm thế nào bạn làm ngày hôm nay"
    The problem is that when I run this, SQL Loader throws an error that it can't find the file. It appears that it can't interpret the foreign characters in a way that allows it to find the path. I have tried adding a CHARACTERSET value (AL32UTF8 or UTF8) in the control file, but that only has some success with Western languages, not the ones listed above. Also, there is no fixed set of languages in the data file; it could essentially be any language.
    Does anyone know if there is a way to get SQL Loader to "understand" file system paths when a folder and/or file name could be in some other language?
    Thanks for any thoughts - Peter

    Thanks for the reply, Harry. If I try to open the file in various text editors (Wordpad, Notepad, GVIM, TextPad), they all display the foreign characters differently; only Notepad comes close to displaying them properly. I have a C# app that reads the file and displays the contents, and it renders them fine. If you look at the directory in Windows Explorer, the file names are all displayed properly. So .NET and Windows have some mechanism for understanding the characters in order to render them properly; other applications, again like Wordpad, do not.
    It would seem that whatever SQL Loader uses to "read" the data files also fails to interpret the characters properly, which prevents it from finding the directory path to the file. If I add "CHARACTERSET AL32UTF8" to the control file, all is fine for Western languages (e.g. German, Spanish) but not for Eastern ones (e.g. Thai, Chinese). So telling SQL Loader to use a character set seems to work, but not in all cases. AL32UTF8 is the character set the Oracle database was created with. I have not had any luck setting CHARACTERSET to the Thai character set, for example, and even if that worked, I can't target specific languages because the data could come from anywhere. It's as if I need some global "superset" character set. CHARACTERSET seems like the right track, but I am not sure, and even if it is, is there a way to handle all languages?
    Thanks - Peter
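The symptom described here, the same bytes rendering differently in different programs, is exactly what happens when readers assume different character sets. A small demonstration (Python, only for illustration) using names from the sample data file: UTF-8, the encoding behind AL32UTF8, round-trips every script, while a single-byte Western code page simply has no representation for Thai at all:

```python
thai = "ลอบวางระเบิด"   # Thai file name from the sample data
western = "á_ñ_é"        # Western file-name fragment

# UTF-8 encodes both and round-trips losslessly.
assert thai.encode("utf-8").decode("utf-8") == thai
assert western.encode("utf-8").decode("utf-8") == western

# A single-byte Western code page handles the Western name...
western.encode("latin-1")

# ...but has no byte sequence at all for Thai characters.
try:
    thai.encode("latin-1")
except UnicodeEncodeError:
    print("latin-1 cannot represent Thai")
```

This is why a Western character set "mostly works" and then fails on Thai or Arabic: the failure is in the encoding itself, not in the tool, and a Unicode superset encoding is the only general fix.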

  • Suggested data file size for Oracle 11

    Hi all,
    Creating a new system (SolMan 7.1) on AIX 6.1 running Oracle 11. 
    I have 4 logical volumes for data, sized at 100 GB each. During the installation I'm being asked to input the size for the data files. The default is 2000 MB (2 GB); is this acceptable for a system sized like mine, or should I double them to 4 GB each? I know the maximum is 32 GB per data file, but that seems a bit large to me. I just wanted to know whether there is a standard best practice for this, or a formula based on system sizing.
    I was not able to find any quick suggestions in the Best Practices guide on this unfortunately...
    Any help would be greatly appreciated.
    Thanks!

    Ben Daniels wrote:
    > I have 4 logical volumes for data sized at 100gb each.  During the installation I'm being asked to input the size for the data files. The default is "2000mb/2gb" ...
    Hi Ben,
    Check the note 129439 - Maximum file sizes with Oracle
    Best regards,
    Orkun Gedik
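Beyond the hard limits covered in that note, the main practical consequence of the per-file size is simply how many datafiles end up being managed. A back-of-the-envelope calculation (Python, for illustration) for the 400 GB described above (four 100 GB volumes), assuming the space eventually fills:

```python
import math

def datafiles_needed(total_gb, file_size_gb):
    """Number of datafiles required to hold `total_gb` of data."""
    return math.ceil(total_gb / file_size_gb)

total = 4 * 100  # four 100 GB logical volumes
for size in (2, 4, 32):
    print(f"{size} GB files -> {datafiles_needed(total, size)} datafiles")
# 2 GB files -> 200 datafiles
# 4 GB files -> 100 datafiles
# 32 GB files -> 13 datafiles
```

The trade-off is administrative: 2 GB files mean roughly 200 files to track, extend and back up, which is why many installations pick something larger than the default once the target database size is known.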

  • Limitation on data file size for Oracle 8i on window 2000

    What is the size limitation for each Oracle data file ?
    Oracle 8i
    Window 2000 server (32-bit)

    Hi,
    You can get the details from the documentation itself.
    Refer: http://www.taom.ru/docs/oradoc.817/server.817/a76961/ch43.htm#11789 (Oracle8i Reference Release 2 (8.1.6))
    Also check the 10g documentation: http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14237/limits002.htm (10g Release 2 (10.2))
    - Pavan Kumar N
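For reference, the ceiling in those documents comes from simple arithmetic: a standard (pre-bigfile) Oracle datafile can address at most 2^22 - 1 = 4,194,303 blocks, so the maximum file size scales with DB_BLOCK_SIZE. Illustrated in Python:

```python
MAX_BLOCKS = 2**22 - 1  # 4,194,303 addressable blocks per datafile

for block_size in (2048, 4096, 8192, 16384):
    max_gib = MAX_BLOCKS * block_size / 2**30
    print(f"{block_size:>5}-byte blocks -> max datafile ~{max_gib:.0f} GiB")
```

So an 8i database with 2K blocks tops out near 8 GiB per file and one with 8K blocks near 32 GiB; on 32-bit Windows 2000 the OS and filesystem may impose their own, lower limits on top of this.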

  • SQLPLUS spool to create .dat files - problems

    I am using SPOOL from SQL*Plus to generate .dat files for SQL*Loader. The problem is with numeric fields: I have SET NUMWIDTH and NUMFORMAT to no avail. Fields defined with a specific numeric size, NUMBER(n), always get spooled with a longer column width.
    Also, the COLSEP character does not appear after all columns. Any ideas out there?
    (I realize export/import is an easier solution.)

    Try using TO_CHAR, LPAD, SUBSTR and other character functions to achieve the same result. Also, if you have not SET HEADING OFF, do that, or use aliases for the formatted columns in case you need headings.
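The TO_CHAR/LPAD suggestion amounts to rendering each numeric column at an explicit fixed width instead of relying on the session-wide NUMWIDTH. The same technique, sketched in Python only because it is easy to demonstrate here (the widths are arbitrary):

```python
def fixed_width(value, width, decimals=2):
    """Render a number zero-padded to a fixed width, analogous to
    TO_CHAR with an explicit '0999...'-style format mask in the SELECT."""
    return f"{value:0{width}.{decimals}f}"

row = [42, 7.5, 12345]
print(",".join(fixed_width(v, 8) for v in row))  # 00042.00,00007.50,12345.00
```

Once every column is formatted to an explicit width in the SELECT itself, the spooled output is deterministic and a matching fixed-position or delimited control file can consume it reliably.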

  • What are the real data files of an Oracle installation?

    When I install an Oracle (XE) database then a lot of stuff is copied/setup:
    *.exe, jdbc driver, service tools, scripts, config files,....
    But which are the real data files (under WinXP)?
    I mean the files where all the individual table definitions, trigger code, and inserted data records are stored.
    As far as I've heard, there are only 9 such files.
    Is this true?
    What, if I copy just these 9 files to another Oracle XE installation on another computer and replace the 9 files over there?
    Is all the content available at the remote computer after restart?
    Peter

    user559463 wrote:
    When I install an Oracle (XE) database then a lot of stuff is copied/setup:
    *.exe, jdbc driver, service tools, scripts, config files,....
    But which are the real data files (under WinXP)?
    I mean the files where all the individual, personal table defintions, trigger code, inserted data records are stored?
    As far as I heard these are only 9 files.
    Is this true?
    What if I copy just these 9 files to another Oracle XE installation on another computer and replace the 9 files over there?
    No. Oracle is far more complex than that. The control file keeps information about the data files, including things like the last checkpoint SCN. If you just drop a new file onto a different installation, even if the file has the same name and is in the same path, it won't match the information in the control file.
    That's not to say there aren't ways to move a data file from one installation to another. But it is a lot more complex than simply dragging a file from one folder to another.
    >
    Is all the content available at the remote computer after restart?
    Peter

  • Data Files for new Oracle 11.2g install

    Hi,
    I have installed Oracle 11.2g on Windows 7 Professional 32 bit environment. I would like to download data files to run queries. Could you tell me where I can download data files/database files/scripts?
    Could you also tell me which Oracle manual can be used to practice Oracle database for Systems Admin.
    Thanks for your help.

    Follow the complete Oracle documentation library:
    Oracle Database Online Documentation 11g Release 2 (11.2)
    Thanks,
    <moderator edit - deleted link signature - see FAQ on top right>

  • Allocating enough memory to open a large data file

    Hello,
    I am currently running an experiment that requires me to collect data at a very high rate, creating enormous data files. At first I ran into the problem of LabVIEW not having enough memory to save all of the data to the hard disk at once (it seemed to have no problem with a huge array during data collection, only when it went to save it to a file), so I programmed LabVIEW to save only 100,000 samples at a time. This seemed to work fine, and I was able to collect a 550 MB data file this way.
    However, now that I would like to analyze that data, LabVIEW can't read the data from the file into an array, giving me the same insufficient-memory errors as before. My system has 3 GB of memory, so in theory LabVIEW should be able to get enough to open it, but Windows doesn't seem to want to allocate what it needs. Is there any way I can override Windows and allocate enough memory to LabVIEW for it to open this file and work on the data?

    BrockB wrote:
    The data is all in a 2xN tab-delimited array, and I'm using the 1d array output of the "read from spreadsheet" vi, as I only need the first of the two columns. What I meant to say earlier was that I would still have to read all of the data from the file first if I wanted to break it up into pieces to use later. Labview seems to get stuck on reading the data from the file, not actually on opening it. It also seems like breaking the data up would be a much bigger hassle than just allowing labview to use more of the 3GB that I have available (most of which is sitting unused anyway).
    First of all, any ASCII-formatted file is an extreme waste of space and comparatively expensive to write and read. It is only recommended for small files intended to be read later by humans.
    "Read From Spreadsheet File" must first internally read the entire file as a string, then chop it up looking for tab and linefeed characters, then scan the parts into DBL. If you want the first column (instead of the first row), it also needs to transpose the entire 2D array, requiring another copy in memory. As you can see, you'll easily have quite a few data copies plugging up your RAM. ("Read/Write From/To Spreadsheet File" are plain VIs; you can open their diagrams to see what's in there.)
    For datasets this size, you MUST use binary files. Try it and your problems will go away.
    Message Edited by altenbach on 11-11-2008 12:06 PM
    LabVIEW Champion. Do more with less code and in less time.
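The ASCII-versus-binary point above is easy to quantify: a double written as delimited text takes roughly two to three times the space of its raw 8-byte form, and reading it back means parsing every character. A small comparison (Python, standard library only; the sample data is synthetic):

```python
import random
import struct

random.seed(1)
samples = [random.uniform(-10, 10) for _ in range(100_000)]

# Text form: one value per line, the way a spreadsheet-style file stores it.
text = "\n".join(f"{x:.15g}" for x in samples).encode("ascii")

# Binary form: raw little-endian 64-bit doubles.
binary = struct.pack(f"<{len(samples)}d", *samples)

print(len(text) / len(binary))  # text is roughly 2-3x larger

# The binary form also round-trips exactly, with no parsing step:
assert struct.unpack(f"<{len(samples)}d", binary) == tuple(samples)
```

The binary file is not only smaller; it can also be read back in fixed-size chunks with no string chopping or scanning, which is exactly why binary files sidestep the memory blow-up described above.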

  • Large data update Problem

    I have a temporary table A with 5 million rows
    which needs to be appended to a 90-million-row table B.
    60% of the 5 million rows already exist in the big table,
    so I need to update/merge table A's data into table B.
    The Oracle version is 8.1.7.
    Please advise which method is the fastest.

    hi Raghu,
    this is the Portal Content Management forum. Please post your database-related question in the following forum:
    General Database Discussions
    That is the appropriate place for database-related questions.
    thanks,
    christian

  • Simple Problems of oracle 8i using Rman and so.................

    I am Shan, a DBA student working with Oracle 8i. I have a problem with a backup and recovery topic, namely RMAN. If you cannot help with this, please point me to someone who can.
    1- I created a user RMAN, granted him quota on the rman and rman_temp tablespaces, and granted him the DBA, SYSDBA and RECOVERY_CATALOG_OWNER roles. Then I connected to the target, created the catalog, registered the database, and typed the LIST commands (LIST COPY OF DATABASE and LIST BACKUP OF DATABASE); they showed me nothing, which is fine at that point.
    Then I take a whole backup of the database with the command:
    run {
    allocate channel c1 type disk;
    backup
    format 'd:\backup\whole_%d_%s_%p.bak'
    (database);
    }
    This command shows the channel being allocated and the backup being processed, then ends successfully, and the backup files physically appear at the path I specified. But when I then run the LIST command to see my backup details (LIST BACKUP OF DATABASE), it shows me nothing - no details about my backup - even though the file is physically there at the given path.
    It is the same story with the COPY command: it physically creates the file, but LIST COPY OF DATABASE does not show it.
    This is not the story of a single computer; I repeated the same steps on other machines with the same result. On one computer I did get the desired results, but the next day I hit another problem on it. When I type LIST BACKUP it shows the results: two backup pieces. I tried to delete them with the CHANGE command, allocating a channel of type disk for the delete and running CHANGE BACKUPSET 605 DELETE; (605 is the backup set number). RMAN reported the file deleted from its path, but LIST BACKUP still shows the same results as before the delete. Then I tried to delete the second set, which is still there; it would not let me, giving an error that the catalog has no information about that file, even though the LIST command shows both pieces. I ran RESYNC CATALOG manually, but the results did not change.
    Then I took another whole backup in a different location and queried for its information, but the LIST command does not show it; it shows the old results. Then I took a backup of a single data file and queried again, with the same outcome.
    I also tried to delete that data file copy with the CHANGE command, but it says the recovery catalog has no information about that file.
    2- If we allocate more than one channel for a backup on the same drive, what will the behaviour of those channels be? Will they not work in parallel? And is it true that channels only work in parallel when they are writing to different drives (meaning two physically separate hard disks)?
    The syntax of all the commands is as written; I tallied it against the book.
    3- What does O stand for in the STATUS column of the CROSSCHECK command output? A stands for Available and U stands for Unavailable, but what is O?
    4- Please tell me a single command to delete all tables of a user (a single command).
    We are using Oracle 8i (8.1.7.0.0) for practice.
    Please give the solution by mail:
    [email protected]

    I have briefly used sql*ldr on Linux and yes, it does support exit codes.

  • Large data moving from one schema to another schema and tablespace

    Dear DBAs,
    Oracle 11.2.0.1
    OS: Solaris 10
    We have 1 TB of data in schema A, and I want to move it into schema B for other testing purposes. Which method is best for taking an export/import of this much data? Kindly advise.
    Thanks and regards,
    SG

    Hi
    You can use expdp/impdp or Transportable Tablespaces. Please check the note below:
    Using Transportable Tablespaces for EBS Release 12 Using Database 11gR2 [ID 1311487.1]
    Regards,
    Helios

  • JDBC Problem with Oracle 8.1.7. and OC4J  Version 2

    I use the following connection statement with Oracle 8.1.7, JDeveloper 3 and the Apache server, and it works fine:
    String user = "scott/tiger";
    DriverManager.registerDriver(new oracle.jdbc.driver.OracleDriver());
    Connection conn = DriverManager.getConnection("jdbc:oracle:oci8:"+user+"@RDL");
    The same program does not obtain a connection with JDeveloper 9i, OC4J Version 2 and Oracle 8.1.7. I do not receive any error message; no exception is thrown.
    Please help me.

    Hi Avi,
    I had already tried the thin driver, but after your help I tried it again.
    With the thin driver, I received a SQLException, which helped me.
    As usual, there were a lot of different problems. The main one was that my default configuration used the main jdbc/lib rather than jdbc/lib817; each contains a classes12.jar, and they are apparently different.
    Finally, I obtained the connection with the "thin" driver.
    Thank you very much, Avi!
    Reni
