How to avoid sleepycat.je.log.ChecksumException

Hello,
We're building an online game application and are evaluating JE. We want to use JE as a persistent store (very similar to a cache: each online user's data is periodically synced to a remote JE server over a network socket). We're testing with thousands of online users, and the data size for each user is about 83 KB.
I ran into an EnvironmentFailureException while doing anomaly testing: I used "kill -TERM pid" to kill the JVM process and hit the exception below. I had to delete all the JE log files in order to restart JE. I'm wondering:
1. Is there anything wrong with my approach to anomaly testing? How can I avoid the exception if I want to continue this kind of testing?
2. Once the exception happens, is there any way of "rescuing" the existing data? As it stands, I had to delete all the JE log files in order to restart JE.
The error message is as follows:
<DaemonThread name="Cleaner-1"/> caught exception: com.sleepycat.je.EnvironmentFailureException: (JE 4.1.6) DBServ(1):dbhome com.sleepycat.je.log.ChecksumException: Read invalid log entry type: 0 LOG_CHECKSUM: Checksum invalid on read, log is likely invalid. Environment is invalid and must be closed. fetchTarget of 0x8a/0x59aaac parent IN=59 IN class=com.sleepycat.je.tree.BIN lastFullVersion=0xa4/0x7d1362 parent.getDirty()=true state=0
com.sleepycat.je.EnvironmentFailureException: (JE 4.1.6) DBServ(1):dbhome com.sleepycat.je.log.ChecksumException: Read invalid log entry type: 0 LOG_CHECKSUM: Checksum invalid on read, log is likely invalid. Environment is invalid and must be closed. fetchTarget of 0x8a/0x59aaac parent IN=59 IN class=com.sleepycat.je.tree.BIN lastFullVersion=0xa4/0x7d1362 parent.getDirty()=true state=0
at com.sleepycat.je.log.LogManager.getLogEntry(LogManager.java:784)
at com.sleepycat.je.log.LogManager.getLogEntryAllowInvisibleAtRecovery(LogManager.java:742)
at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1315)
at com.sleepycat.je.tree.BIN.fetchTarget(BIN.java:1367)
at com.sleepycat.je.tree.Tree.getParentBINForChildLN(Tree.java:1017)
at com.sleepycat.je.cleaner.FileProcessor.processLN(FileProcessor.java:678)
at com.sleepycat.je.cleaner.FileProcessor.processFile(FileProcessor.java:553)
at com.sleepycat.je.cleaner.FileProcessor.doClean(FileProcessor.java:241)
at com.sleepycat.je.cleaner.FileProcessor.onWakeup(FileProcessor.java:143)
at com.sleepycat.je.utilint.DaemonThread.run(DaemonThread.java:162)
at java.lang.Thread.run(Thread.java:619)
Caused by: com.sleepycat.je.log.ChecksumException: Read invalid log entry type: 0
at com.sleepycat.je.log.LogEntryHeader.<init>(LogEntryHeader.java:138)
at com.sleepycat.je.log.LogManager.getLogEntryFromLogSource(LogManager.java:861)
at com.sleepycat.je.log.LogManager.getLogEntry(LogManager.java:781)
Thanks,
Leeuan

Do you still have the offending log files? If so, save them away.
With a fresh environment, are you able to reproduce this reliably? That is, if you kill -9 a running system, can you reproduce the same problem?
Charles Lamb

Similar Messages

  • How to avoid Cost error log while confirming production order

    Hi,
    I don't want to post actual activity cost via production order activity confirmation, but I do want standard value keys for my production duration purposes. So I defined the activities (standard value key) in the work center without assigning a cost center to that work center. When I confirm, the system throws an error log saying the actual cost calculation contains errors, but still allows me to confirm the activities. I am also doing MB31 and all the CO settlement activities. But when I try to close the order, it says an error log exists, so closing the order is not possible. How can I overcome this problem? I don't want to capture any cost of activities via the production order; I only want the confirmation for production analysis.

    Sudhar,
    You can make the operation not relevant for costing by using a customized control key that is not relevant for costing, or else leave the "Costing relevancy" field blank in the operation details.
    Regards,
    Prasobh

  • Oracle script: How to avoid output in log file?

    Hi,
    I am trying to create an XML file using some SQL statements in a script.
    I don't want the output from these SQL statements to appear in the log file.
    I tried SET TERMOUT OFF, but I am still getting the output.

    I am getting the output even after using SET ECHO OFF.
    I have given the following SET options:
         SET heading off
         SET feedback off
         SET verify off
         SET wrap off
         SET pagesize 0
         SET linesize 3000
         SET server off
         SET echo off
    I do not want the output to appear in the log file when I run the concurrent request.
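    A minimal SQL*Plus sketch of one way to keep the statements quiet (this is an assumption on my part, not from the original post; the spool path and the XML query are placeholders). Note that TERMOUT OFF only takes effect for statements executed from a script file, and with SPOOL the rows go to the spool file rather than to the screen or request log:
    SET termout OFF
    SET echo OFF
    SET feedback OFF
    SET verify OFF
    SET heading OFF
    SET pagesize 0
    SET linesize 3000
    SET trimspool ON
    SPOOL /tmp/my_output.xml
    -- placeholder query; generate the real XML here
    SELECT XMLELEMENT("row", XMLFOREST(d.dummy)) FROM dual d;
    SPOOL OFF
    SET termout ON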

  • How to avoid creation of _(file.*) when copying from Tiger to PC?

    How can I avoid the creation of this file forever in a PC environment?
    Let me explain: when I copy any image file, it goes to a spooler directory, which instantly starts processing the file on an OPI server, separating the CMYK channels and converting it to high and low resolution...
    But when this _(file) is created (same name and extension, but 0 KB and with an underscore at the beginning of its name), the server starts processing it too, and since it does not really exist as an image, it crashes the OPI server, forcing us to restart it...
    And then you can imagine the mess, because we're a newspaper with more than 100 machines logged in across two cities simultaneously...
    Thanks to those who replied to the last topic I posted, about some software to delete these files! It really worked!
    But now, does anyone have a clue how to NOT create this file in the first place?
    thanks in advance!
    alex borba
    [email protected]

    I believe that command is supposed to stop the creation of .DS_Store files on servers. Also, I think the plist file gets written to the user ~/Library/Pref; it probably ought to exist both there and in the /Library/Pref folder to really work.
    I didn't think the Save for Web option would work for your purposes. I don't believe anything will work except stripping the resource forks off the files before you send them. There is an AppleScript that will do it, using a UNIX command in the script. You select a folder, run the AppleScript, and it strips the resource fork off all the files in the folder and creates a sub-folder called Stripped with resourceless copies of the originals:
    -- Strip resource fork and metadata in Tiger for one file.
    -- If a folder is selected creates an unresourced subfolder of the selected folder
    tell application "Finder"
         try
              set aFile to the selection as alias
              set aFolder to the container of aFile
         on error
              display dialog "Select a file"
              return
         end try
         set aFile to the quoted form of POSIX path of aFile
         do shell script "rsync -a " & aFile & " " & aFile & "Stripped; mv " & aFile & "Stripped " & aFile
         update aFolder
    end tell
    The "do shell script" line must be all a single line in Script Editor. It might wrap funny in the browser. Anyway, copy it, paste into Script Editor, hit compile, and save it as an application. I've only fooled with it a little tiny bit, so I'm not sure whether the rsync process would have any strange effects on complex Photoshop and EPS files or not.
    As far as I know the only way to send Mac files without resource forks to a server is to strip them from the files.
    Francine

  • How to avoid db file parallel read for nestloop?

    After upgrading to 11gR2, one job took more than twice as long as it did on 10g and 11gR1, with compatibility set to 10.2.0. Same hardware (see the AWR summary below). My analysis points to the nested loop doing an index range scan on the inner table's index segment and then using db file parallel read to read the data from the table segment; for reasons I don't know, the parallel read is very slow. The average wait is more than 300 ms. How can I influence the optimizer to choose db file sequential read to fetch the data blocks from the inner table by tweaking parameters? Thanks. YD
    Begin Snap: 13126 04-Mar-10 04:00:44 60 3.9
    End Snap: 13127 04-Mar-10 05:00:01 60 2.8
    Elapsed: 59.27 (mins)
    DB Time: 916.63 (mins)
    Report Summary
    Cache Sizes
    Begin End
    Buffer Cache: 4,112M 4,112M Std Block Size: 8K
    Shared Pool Size: 336M 336M Log Buffer: 37,808K
    Load Profile
                           Per Second  Per Transaction  Per Exec  Per Call
    DB Time(s):                  15.5             13.1      0.01      0.01
    DB CPU(s):                    3.8              3.2      0.00      0.00
    Redo size:              153,976.4        130,664.3
    Logical reads:           17,019.5         14,442.7
    Block changes:              848.6            720.1
    Physical reads:           4,149.0          3,520.9
    Physical writes:             16.0             13.6
    User calls:               1,544.7          1,310.9
    Parses:                     386.2            327.7
    Hard parses:                  0.1              0.1
    W/A MB processed:             1.8              1.5
    Logons:                       0.0              0.0
    Executes:                 1,110.9            942.7
    Rollbacks:                    0.2              0.2
    Transactions:                 1.2
    Instance Efficiency Percentages (Target 100%)
    Buffer Nowait %: 99.99 Redo NoWait %: 100.00
    Buffer Hit %: 75.62 In-memory Sort %: 100.00
    Library Hit %: 99.99 Soft Parse %: 99.96
    Execute to Parse %: 65.24 Latch Hit %: 99.95
    Parse CPU to Parse Elapsd %: 91.15 % Non-Parse CPU: 99.10
    Shared Pool Statistics
    Begin End
    Memory Usage %: 75.23 74.94
    % SQL with executions>1: 67.02 67.85
    % Memory for SQL w/exec>1: 71.13 72.64
    Top 5 Timed Foreground Events
    Event                       Waits      Time(s)  Avg wait (ms)  % DB time  Wait Class
    db file parallel read       106,008    34,368    324            62.49      User I/O
    DB CPU                                 13,558                   24.65
    db file sequential read     1,474,891  9,468     6              17.21      User I/O
    log file sync               3,751      22        6              0.04       Commit
    SQL*Net message to client   4,170,572  18        0              0.03       Network

    It's not possible to say anything just by looking at the events. You must understand that Statspack and AWR aggregate the data and then show the results. There may well be other areas that need to be looked at, rather than just focusing on one event.
    You have not mentioned any other information about the wait events, such as their timings. Please provide that too.
    And if I understood your question correctly, you said:
    How to avoid these wait events?
    What may be the cause?
    I am afraid it's not possible to discuss each of these wait events here in complete detail, or what to do when you see them. Please read the Performance Tuning guide, which describes these wait events and the corresponding actions.
    Please read and follow this link:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/instance_tune.htm#i18202
    Aman....
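    One additional low-risk experiment (my suggestion, not something from the thread): since the plan regressed right after the 11gR2 upgrade, run the job's session with the pre-upgrade optimizer behaviour and compare the wait profile. The release value below is only an assumption; use whatever release you were on before.
    -- Assumption: the pre-upgrade release was 10.2.0.4
    ALTER SESSION SET optimizer_features_enable = '10.2.0.4';
    -- Re-run the slow SQL in this session, then compare db file parallel read
    -- versus db file sequential read in AWR/ASH against a default 11gR2 run.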

  • Avoid creation of log file for external table

    Hi
    This script is creating a log file in the ext directory. How can I avoid that? Can you give the syntax?
    Thanks a lot.
    Bhaskar
    CREATE TABLE datfiles_list
    (file_name varchar2(255))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY ext_dir
    ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE)
    LOCATION ('datfiles_list.txt')
    );

    Example
    CREATE TABLE datfiles_list
    (file_name varchar2(255))
    ORGANIZATION EXTERNAL
    (TYPE ORACLE_LOADER
    DEFAULT DIRECTORY ext_dir
    ACCESS PARAMETERS (RECORDS DELIMITED BY NEWLINE NOLOGFILE)
    LOCATION ('datfiles_list.txt')
    );

  • How to avoid database logon dialog on every action of refreshing reports

    I am using Crystal Reports 2008 SP2 and have a database connection to an Oracle database. The connection is established and the report retrieves the desired data. The problem is that whenever I open this report (which has this connection set) and do a refresh, the LogOn dialog pops up.
    How can I avoid this, or save the logon info, so that the LogOn dialog does not pop up when I open the report and do a refresh (F5)?

    Hi Don,
    Thanks for the reply. To try your suggested approach, I need some clarifications.
    Problem summary, for reference:
    I am following the steps mentioned below to establish a database connection and fetch the data.
    1) I am using Oracle drivers to configure the ODBC data source. As part of the configuration I specify the Oracle server name.
    2) In Crystal Reports, using the "Database" menu, I open the "Database Expert" dialog and "Make New Connection" by specifying the data source mentioned in step 1). During this process, the (ODBC) logon information is filled in. The user ID and password used have all the rights to access the tables in the database. Logon is successful and the connection is established.
    3) Further, I place the database table from the "Database Expert" onto the report and refresh (F5) to fetch the data.
    After step 3, I close the report and reopen it. Now, when I try to refresh, the database logon dialog pops up. I need to avoid this pop-up.
    Requesting clarification:
    1) You specified "When you connect you have the option to use Trusted Authentication, check this option on". Where is this option? Is it in Crystal Reports, or is it available during data source configuration? Kindly specify.
    2) Does "Enterprise logon info that has been configured on the Oracle server" mean the Oracle database login information? If yes, those login details are available and I am using that logon information (i.e. user ID and password) for the database connection.
    If no, is "Enterprise logon info" something that should be specifically enabled by the DBA in the Oracle 11g configuration?
    Kindly suggest.
    Edited by: R Guru on Oct 17, 2009 7:32 PM

  • How to avoid output file and error lines when condition does not match

    Hi Experts
    A customer wants to send purchase order files to several vendors, and each vendor has its own requirements for content and file format. I have prepared scenarios for each vendor, and a template is (conditionally) called when the CardCode matches. It works fine: a PO file is generated when the CardCode matches, but error lines are written to SAP B1 Control Center each time the CardCode does not match the one defined in a scenario (100 or more times per day). The problem is that these error lines do not get deleted automatically. Conditional processing is done in this way:
    Result Message in SAP B1 Control Center when CardCode does not match:
    I tried to avoid an error line in SAP B1 Control Center when CardCode does not match in this way:
    Then no error lines are written to SAP B1 Control Center, but an output file is generated each time (an order file when the CardCode matches and an empty file when it does not).
    Any suggestions on how to avoid both error types (error lines in SAP B1 Control Center and empty output files)?
    Or can lines in SAP B1 Control Center be deleted automatically?
    Regards
    Steen

    Hi Steen,
    using <b1im_skip> should be the right approach, if you're working with B1 OUTBOUND.
    The following extract from the B1i help describes how to skip the outbound processing:
    1.1 Skipping Outbound Processing
    If you do not want to hand over the message to the receiver system, the scenario step can create a special tag that indicates to generic processing in the integration framework to skip the message processing.
    Add the following to the final transformation atom (atom0):
    <b1im_skip xmlns="" info="my info" msglog="true" msgout="yes">
    If the integration framework skips the message, it puts the message log information to the Filtered section, if the message log is switched on. The result message contains the Message skipped by vBIU logic information.
    info
    To display an individual message, define it using the info attribute.
    msglog
    If you want to avoid an entry in the message log, introduce the msglog attribute and set it to false.
    msgout
    To display the skip information in the Success section of the message log, use the msgout attribute and set it to yes.
    If you don't want any MsgLogEntry for a skipped message, enter the following in the final atom0:
    <xsl:template name="transform">
      <xsl:attribute name="pltype">xml</xsl:attribute>
      <xsl:choose>
        <xsl:when test="$msg/BOM/BO/Documents/row/CardCode=&apos;C20000&apos;">
          <xsl:call-template name="transform2"/>
        </xsl:when>
        <xsl:otherwise>
          <b1im_skip info="skipped" msglog="false"/>
        </xsl:otherwise>
      </xsl:choose>
    </xsl:template>
    Best regards
    Bastian

  • How to avoid Time out issues in Datapump?

    Hi All,
    I am loading one of our schemas from the stage server to the test server using Data Pump expdp and impdp. Its size is around 332 GB.
    My Oracle server instance is on the Unix server rwlq52l1, and I am connecting to Oracle from my client instance (rwxq04l1).
    I am running the expdp and impdp commands from the Oracle client using the commands below.
    expdp pa_venky/********@qdssih30 schemas=EVPO directory=PA_IMPORT_DUMP dumpfile=EVPO_Test.dmp CONTENT=all include=table
    impdp pa_venky/********@qdsrih30 schemas=EVPO directory=PA_IMPORT_DUMP dumpfile=EVPO_Test.dmp CONTENT=all include=table table_exists_action=replace
    Here the export completes, but the import gets stuck at the index build below. After some time I see the following timeout in the log files:
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX.
    Error:-
    VERSION INFORMATION:
    TNS for Linux: Version 11.1.0.7.0 - Production
    Unix Domain Socket IPC NT Protocol Adaptor for Linux: Version 11.1.0.7.0 - Production
    Oracle Bequeath NT Protocol Adapter for Linux: Version 11.1.0.7.0 - Production
    TCP/IP NT Protocol Adapter for Linux: Version 11.1.0.7.0 - Production
    Time: 13-JAN-2012 12:34:31
    Tracing not turned on.
    Tns error struct:
    ns main err code: 12535
    TNS-12535: TNS:operation timed out
    ns secondary err code: 12560
    nt main err code: 505
    TNS-00505: Operation timed out
    nt secondary err code: 110
    nt OS err code: 0
    Client address: (ADDRESS=(PROTOCOL=tcp)(HOST=170.217.82.86)(PORT=65069))
    The above IP address is my Unix client system's (rwxq04l1) IP.
    How can I see the Oracle client system's port number?
    Please suggest how to avoid these timeout issues. It seems the timeout is between the Oracle server and the client.
    Thanks,
    Venkat Vadlamudi.

    Don't run from the client ... run from the server
    or
    If running from a client, use the built-in DBMS_DATAPUMP package's API.
    http://www.morganslibrary.org/reference/pkgs/dbms_datapump.html
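    If you go the DBMS_DATAPUMP route, a minimal server-side sketch could look like the block below (untested; it reuses the schema, directory and dump file names from the post and assumes the dump file is visible to the target database server):
    DECLARE
      h       NUMBER;
      l_state VARCHAR2(30);
    BEGIN
      -- Create a schema-mode import job that runs entirely inside the database,
      -- so no client connection has to stay up while the indexes are built.
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'EVPO_Test.dmp',
                             directory => 'PA_IMPORT_DUMP');
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                    value => 'IN (''EVPO'')');
      DBMS_DATAPUMP.SET_PARAMETER(handle => h, name => 'TABLE_EXISTS_ACTION',
                                  value => 'REPLACE');
      DBMS_DATAPUMP.START_JOB(h);
      -- Optional: wait for completion; the job keeps running inside the database
      -- even if this session later disconnects before it finishes.
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, l_state);
      DBMS_OUTPUT.PUT_LINE('Import job finished with state: ' || l_state);
    END;
    /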

  • How to avoid password prompt in shell script for zip password protection

    Hi
    I am trying to set password protection for my Oracle database export backup. Once the backup completes, it should be compressed with password protection. That's the plan. Initially we were using gzip for the compression, then realized that gzip has no password protection, so we started using zip. I tried using
    zip -P <password> filename
    but it was throwing the error below:
    -bash-3.2$ zip -P expreports REPORTS_2013FEB14.dmp
    zip warning: missing end signature--probably not a zip file (did you
    zip warning: remember to use binary mode when you transferred it?)
    zip warning: (if you are trying to read a damaged archive try -F)
    zip error: Zip file structure invalid (REPORTS_2013FEB14.dmp)
    Not quite sure why.
    Then I used zip -e REPORTS_2013FEB14.dmp.zip REPORTS_2013FEB14.dmp
    But this prompts for the password. Since I am trying to put the command in a script, it will be a problem if it prompts for the password.
    I would like to know how to avoid the password prompt, either by saving the password somewhere or by writing the code differently. I tried using the expect feature of shell scripting; below is the code I tried. It didn't work.
    [oracle@SF40V6636 test]$ cat repexp.sh
    zip -e REPORTS_imp.log.zip REPORTS_imp.log
    expect "Enter password:"
    send "imprep"
    expect "Verify password:"
    send "imprep"
    So please help me avoid this password prompt, or let me know how the code should be changed.
    Thanks
    SHIYAS M

    How about using gpg, adding a secret key on top of the password requirement? No one should be able to decrypt your file by knowing only the password.
    1. Generate a public and private key pair:
    $ gpg --gen-key
    When it shows "We need to generate a lot of random bytes…", open another terminal session and type "dd if=/dev/sda of=/dev/null" to create traffic. Once the public and secret key are created and signed, you can Ctrl-C the dd command.
    To see what you have created:
    $ gpg --list-keys
    2. Encrypt and gzip your stuff:
    $ tar zcf stuff.tgz file_or_folder
    $ gpg --recipient "Some Name" --encrypt stuff.tgz
    $ rm -f stuff.tgz
    3. Decrypt and extract the archive:
    $ gpg --batch --yes --passphrase "password" -d stuff.tgz.gpg > stuff.tgz
    $ tar zxvf stuff.tgz
    Again, knowing the password alone will not let anybody decrypt your stuff.

  • How the Payload Message and Logs are stored in the B1i Database Table: BZSTDOC

    I would appreciate it if someone could provide any documentation regarding further maintenance of the B1i database.
    For example:
    I want to know how the payload message and logs are stored in the table BZSTDOC, and how we can retrieve the payload message directly from the column DOCDATA.
    As described in B1iSNGuide05, section 3.2 LogGarbageCollection:
    To avoid overloading the B1i database, I set the Backup Buffer to 90 days. This means message logs from the last 90 days will always be available, but is there some way we can save the older messages to disk so that I can retrieve the payload message at any time?
    In addition, let's assume the worst: the B1iSN server or the B1i database is damaged. Can we simply restore the B1i database from the latest backup, and will it work automatically once the B1iSN server is up and running again?
    BR/Jim

    Dear SAP,
    Two weeks have passed and I still haven't received any feedback from you.
    Could you please have a look at my question?
    How is this question going? Is it untouched, being solved, or reassigned?

  • How to manage my call logs on the AT&T Z10

    How do I manage my call logs on the AT&T Z10? I need to keep my call log for business purposes, and it disappeared today. I may have accidentally deleted it, but I need to find out how to save it and avoid deleting it in the future.

    Deleted call logs can't be restored.
    There is no special way to save them; just be careful not to tap the Clear Log option.
    If my post has helped you, give it a LIKE

  • How to avoid data lost in Dataguard failover ?

    Hi gurus,
    We are implementing Data Guard with manual failover (not with the Fast-Start Failover feature).
    How can we avoid or minimize loss of data if the primary server goes down before the last log is sent?
    Thank you for your help,
    xtanto

    Most important is the Data Guard protection mode; you have three choices: Maximum Performance, Maximum Availability, Maximum Protection. If you are interested in no data loss, choose Maximum Protection. This is the definition:
    Maximum Protection
    This protection mode ensures that zero data loss occurs if a primary database fails. To provide this level of protection, the redo data needed to recover a transaction must be written to both the online redo log and to at least one synchronized standby database before the transaction commits. To ensure that data loss cannot occur, the primary database will shut down, rather than continue processing transactions, if it cannot write its redo stream to at least one synchronized standby database.
    Because this data protection mode prioritizes data protection over primary database availability, Oracle recommends that a minimum of two standby databases be used to protect a primary database that runs in maximum protection mode, to prevent a single standby database failure from causing the primary database to shut down.
    For details see the documentation for your unknown database version.
    Werner
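    For reference, a hedged sketch of the statements involved (the service name standby_tns and DB_UNIQUE_NAME stby are placeholders, not from the thread; in older releases, switching to Maximum Protection also requires restarting the primary to the MOUNT state first):
    -- On the primary: ship redo synchronously and wait for the standby to
    -- acknowledge the write (SYNC AFFIRM) before each commit completes.
    ALTER SYSTEM SET log_archive_dest_2 =
      'SERVICE=standby_tns SYNC AFFIRM VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=stby'
      SCOPE=BOTH;
    -- Then raise the protection mode (Maximum Availability is the middle ground
    -- if you cannot accept the primary shutting down when the standby is unreachable).
    ALTER DATABASE SET STANDBY DATABASE TO MAXIMIZE PROTECTION;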

  • How to avoid the messaging server checking the confident domain name

    Hi!
    I have two questions that do not let me sleep.
    1. I am testing two different servers with Messaging Server. By default they check the "confidents domains" in DNS, so they do not accept inbound messages from each other. How can I adjust the relay configuration so that the servers accept messages from all domains?
    2. Is there any way, or any document, to implement several domains with Messaging Server schema 2 and Delegated Administrator, using only one server for all domains?
    I created several domains with Delegated Administrator, one.com and two.com. The one.com domain works perfectly. In the two.com domain I have created users, templates... but when I try to log into the Sun Java System Messenger Express webmail using a user of domain two.com, I can't, because I think it is the same instance as domain one.com.
    Thanks.

    > Hi!
    > I have two questions that do not let me sleep.
    > 1. I am testing two different servers with Messaging Server. By default they check the "confidents domains" in DNS,
    I don't understand what you mean by "confidents domains".
    > so they do not accept inbound messages from each other. How can I adjust the relay configuration so that the servers accept messages from all domains?
    I suspect that what you need to do is to have each server consider the other server "internal", so they are allowed to relay out.
    Add each server's IP address to the other server's "internal_ip" map in the mappings file, following the same format you see for 127.0.0.1. After making the file change, run:
    imsimta cnbuild
    imsimta restart dispatcher
    > 2. Is there any way, or any document, to implement several domains with Messaging Server schema 2 and Delegated Administrator, using only one server for all domains?
    > I created several domains with Delegated Administrator, one.com and two.com. The one.com domain works perfectly. In the two.com domain I have created users, templates... but when I try to log into the Sun Java System Messenger Express webmail using a user of domain two.com, I can't, because I think it is the same instance as domain one.com.
    It's not a matter of "instance", but of what domain you used in the URL that you're logging into.
    If you log into webmail like this:
    http://webmail.domain1.com
    you can log into domain1 with just your UID, but would need "[email protected]" for domain 2. If you log into domain 2 like this:
    http://webmail.domain2.com
    and your DNS points to the same machine, then you will be able to log into domain2 with just a bare UID.
    Actually, this is all covered in the documentation, though I admit it's hard to find.
    > Thanks.

  • How to avoid the duplicate values, I do not want the duplicate............

    I have one database table called "sms1"; it is updated on a daily basis and has the following fields in it:
    SQL> desc sms1;
    Name                       Null?  Type
    MOBILE                            NUMBER
    RCSTCNATCNATCNATCNAWTHER          VARCHAR2(39 CHAR)
    SNO                               NUMBER
    INDATE                            DATE
    From this table there is one column, "RCSTCNATCNATCNATCNAWTHER VARCHAR2(39 CHAR)", which I am splitting into different columns like this:
    SQL> desc smssplit;
    Name            Null?  Type
    R                      VARCHAR2(2 CHAR)
    C                      VARCHAR2(2 CHAR)
    S                      VARCHAR2(1 CHAR)
    TC                     VARCHAR2(3 CHAR)
    NA                     VARCHAR2(3 CHAR)
    TC2                    VARCHAR2(3 CHAR)
    NA2                    VARCHAR2(3 CHAR)
    TC3                    VARCHAR2(3 CHAR)
    NA3                    VARCHAR2(3 CHAR)
    TC4                    VARCHAR2(3 CHAR)
    NA4                    VARCHAR2(3 CHAR)
    WTHER                  VARCHAR2(10 CHAR)
    SNO                    NUMBER
    INSERTDATA             VARCHAR2(25 CHAR)
    Now I have written a procedure to insert the data from the "sms1" table into the smssplit table...
    CREATE OR REPLACE PROCEDURE SPLITSMS
    AS
    BEGIN
    INSERT INTO scott.SMSSPLIT ( R,C,S,TC,NA,TC2,NA2,TC3,NA3,TC4,NA4,WTHER,SNO)
    SELECT SUBSTR(RCSTCNATCNATCNATCNAWTHER,1,2) R,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,3,2) C,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,5,1) S,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,6,3) TC,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,9,3) NA,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,12,3) TC2,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,15,3) NA2,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,18,3) TC3,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,21,3) NA3,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,24,3) TC4,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,27,3) NA4,
    SUBSTR(RCSTCNATCNATCNATCNAWTHER,30,10) WTHER, SNO
    FROM scott.SMS1 where SNO=(select MAX (sno) from SMS1);
    END;
    Now, in order to update the second table with data from the first table on a regular basis, I have scheduled a job; I am using Oracle version 9.0...
    DECLARE
         X NUMBER;
         JobNumber NUMBER;
    BEGIN
         SYS.DBMS_JOB.SUBMIT
         ( job => X
         ,what => 'scott.SPLITSMS;'
         ,next_date => SYSDATE+1/1440
         ,interval => 'SYSDATE+1/1440 '
         ,no_parse => FALSE
         );
         JobNumber := to_char(X);
         COMMIT; -- DBMS_JOB submissions only start running after a commit
    END;
    Now this job is working properly and updating the data every minute, but it is also inserting duplicate values, for example:
    R  C  S TC  NA  TC2 NA2 TC3 NA3 TC4 NA4 WTHER  SNO INSERTDATA
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:49:16
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:49:16
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:50:17
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:50:17
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:51:19
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:51:19
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:52:20
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:52:20
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:53:22
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:53:22
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:54:45
    33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55  06-SEP-2012 03:54:45
    Now I do not want the duplicate values to be inserted; I want them to be ignored.
    Please, I need help with this query: how can I avoid the duplicate values?

    Look at the posts closely: might not be needed if formatted ;)
    create or replace procedure splitsms as
    begin
      insert into scott.smssplit (r,c,s,tc,na,tc2,na2,tc3,na3,tc4,na4,wther,sno)
      select substr(rcstcnatcnatcnatcnawther,1,2) r,
             substr(rcstcnatcnatcnatcnawther,3,2) c,
             substr(rcstcnatcnatcnatcnawther,5,1) s,
             substr(rcstcnatcnatcnatcnawther,6,3) tc,
             substr(rcstcnatcnatcnatcnawther,9,3) na,
             substr(rcstcnatcnatcnatcnawther,12,3) tc2,
             substr(rcstcnatcnatcnatcnawther,15,3) na2,
             substr(rcstcnatcnatcnatcnawther,18,3) tc3,
             substr(rcstcnatcnatcnatcnawther,21,3) na3,
             substr(rcstcnatcnatcnatcnawther,24,3) tc4,
             substr(rcstcnatcnatcnatcnawther,27,3) na4,
             substr(rcstcnatcnatcnatcnawther,30,10) wther,
             sno
        from scott.sms1 a
       where sno = (select max(sno)
                      from sms1
                     where sno != a.sno
                   ); ---------------> added where clause with table alias.
    end;
    Regards
    Etbin
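    An alternative guard (my suggestion, not from the thread): let the database enforce uniqueness on SNO and make the insert skip values that are already present, so re-running the job every minute can never create duplicates.
    -- One-time safety net (only after removing the duplicates already inserted):
    alter table scott.smssplit add constraint smssplit_sno_uk unique (sno);
    -- Inside the procedure, only pick up rows whose SNO has not been copied yet:
    insert into scott.smssplit (r, c, s, tc, na, tc2, na2, tc3, na3, tc4, na4, wther, sno)
    select substr(rcstcnatcnatcnatcnawther,1,2),
           substr(rcstcnatcnatcnatcnawther,3,2),
           substr(rcstcnatcnatcnatcnawther,5,1),
           substr(rcstcnatcnatcnatcnawther,6,3),
           substr(rcstcnatcnatcnatcnawther,9,3),
           substr(rcstcnatcnatcnatcnawther,12,3),
           substr(rcstcnatcnatcnatcnawther,15,3),
           substr(rcstcnatcnatcnatcnawther,18,3),
           substr(rcstcnatcnatcnatcnawther,21,3),
           substr(rcstcnatcnatcnatcnawther,24,3),
           substr(rcstcnatcnatcnatcnawther,27,3),
           substr(rcstcnatcnatcnatcnawther,30,10),
           sno
      from scott.sms1 a
     where not exists (select 1
                         from scott.smssplit t
                        where t.sno = a.sno);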
