ORA-06502: PL/SQL: numeric or value error: number precision too large

I have hit the above error, although the code ran successfully before a minor change in the portion I have bolded. I would appreciate your comments and help.
The code is attached below:
CREATE TABLE RPT1120B_CHANNEL_new
(SUBSCRIBER_NO NUMBER (9),
FP_CHANNEL VARCHAR2(255),
FP_DATE DATE,
FP_DYNASTY CHAR(1),
FP_MOVIE CHAR(1),
FP_FUN CHAR(1),
FP_LEARNING CHAR(1),
FP_NEWS CHAR(1),
FP_SPORTS CHAR(1),
FP_VARIETY CHAR(1),
FP_EMPEROR CHAR(1),
FP_SX1 CHAR(1),
FP_CEL CHAR(1),
LP_CHANNEL VARCHAR2(255),
LP_DATE DATE,
LP_DYNASTY CHAR(1),
LP_MOVIE CHAR(1),
LP_FUN CHAR(1),
LP_LEARNING CHAR(1),
LP_NEWS CHAR(1),
LP_SPORTS CHAR(1),
LP_VARIETY CHAR(1),
LP_EMPEROR CHAR(1),
LP_SX1 CHAR(1),
LP_CEL CHAR(1));
--truncate table RPT1120B_CHANNEL;
create or replace PROCEDURE sp_rpt1120b_new
AS
FP_CHANNEL VARCHAR2(255);
FP_DATE DATE;
LP_CHANNEL VARCHAR2(255);
LP_DATE DATE;
REC_COUNT NUMBER(3);
TYPE REC_SA IS RECORD
(AGREEMENT_NO NUMBER (9));
TYPE REC_CHANNEL IS RECORD
(CHANNEL VARCHAR2(3),
AGREEMENT_NO NUMBER (9));
BEGIN
FOR REC_SA IN (SELECT DISTINCT SUBSCRIBER_NO FROM RPT1120B_T1_new) LOOP
     FP_CHANNEL := '';
     LP_CHANNEL := '';
     REC_COUNT := 0;
     FP_DATE := '';
     LP_DATE := '';
     FOR REC_CHANNEL IN      (SELECT distinct decode(SOC,
                                   29990,'N',
                                   29991,'V',
                                   29993,'M',
                                   29988,'F',
                                   29989,'L',
                                   29992,'S',
                                   29994,'D',
                                   29995,'E',
                                   30277,'C',
                                   30293,'C',
                                   30319,'C',
                                   30359,'C',
                                   30276,'X',
                                   30331,'X',
                                   30299,'X',
                                   30380,'X')      
                    AS CHANNEL,SA.EFFECTIVE_DATE as soc_sts_date
               FROM      SERVICE_AGREEMENT SA
               WHERE      SA.SOC in (          29990,
                                   29991,
                                   29993,
                                   29988,
                                   29989,
                                   29992,
                                   29994,
                                   29995,
                                   30277,
                                   30293,
                                   30319,
                                   30359,
                                   30276,
                                   30331,
                                   30299,
                                   30380) AND
                    SA.AGREEMENT_NO = REC_SA.SUBSCRIBER_NO AND
                    TRUNC(SA.EFFECTIVE_DATE) <> TRUNC(NVL(SA.EXPIRATION_DATE,SYSDATE)) AND
                    SA.EFFECTIVE_DATE = (SELECT MIN(SA1.EFFECTIVE_DATE) FROM SERVICE_AGREEMENT SA1
                              WHERE SA1.AGREEMENT_NO = SA.AGREEMENT_NO AND
                              sa1.soc in (
29990,
                                   29991,
                                   29993,
                                   29988,
                                   29989,
                                   29992,
                                   29994,
                                   29995,
                                   30277,
                                   30293,
                                   30319,
                                   30359,
                                   30276,
                                   30331,
                                   30299,
                                   30380))
                    order by DECODE(channel,'D',1,'M',2,'E',3,'C',4,'X',5,'F',6,'L',7,'N',8,'S',9,'V',10)) LOOP
               REC_COUNT := REC_COUNT + 1;
               if REC_COUNT < 254 then
                    FP_CHANNEL := FP_CHANNEL || REC_CHANNEL.CHANNEL;
               end if;
               FP_DATE := REC_CHANNEL.soc_sts_date;
     END LOOP;
     REC_COUNT := 0;
     FOR REC_CHANNEL IN      (SELECT distinct decode(sa.SOC,
                                   29990,'N',
                                   29991,'V',
                                   29993,'M',
                                   29988,'F',
                                   29989,'L',
                                   29992,'S',
                                   29994,'D',
                                   29995,'E',
                                   30277,'C',
                                   30293,'C',
                                   30319,'C',
                                   30359,'C',
                                   30276,'X',
                                   30331,'X',
                                   30299,'X',
                                   30380,'X')     
                    AS CHANNEL,SA.soc_status_date as soc_sts_date
               FROM      SERVICE_AGREEMENT SA,
                    (SELECT MAX(SA1.soc_status_DATE) as soc_date, agreement_no, soc
                    FROM SERVICE_AGREEMENT SA1
                    WHERE soc in(          29990,
                                   29991,
                                   29993,
                                   29988,
                                   29989,
                                   29992,
                                   29994,
                                   29995,
                                   30277,
                                   30293,
                                   30319,
                                   30359,
                                   30276,
                                   30331,
                                   30299,
                                   30380)
                    GROUP BY agreement_no, soc) sa1
               WHERE      SA.SOC in (      29990,
                                   29991,
                                   29993,
                                   29988,
                                   29989,
                                   29992,
                                   29994,
                                   29995,
                                   30277,
                                   30293,
                                   30319,
                                   30359,
                                   30276,
                                   30331,
                                   30299,
                                   30380) AND
                    SA.soc_status_date = sa1.soc_date AND
                    TRUNC(SA.SOC_STATUS_DATE) <> TRUNC(NVL(SA.EXPIRATION_DATE,SYSDATE)) AND
                    SA.SOC_STATUS = (SELECT MIN(SA2.SOC_STATUS) FROM SERVICE_AGREEMENT SA2
                              WHERE SA2.AGREEMENT_NO = SA.AGREEMENT_NO AND sa2.soc = sa.soc
                              and SA2.soc_status_date = SA.soc_status_date)
                    order by DECODE(channel,'D',1,'M',2,'E',3,'C',4,'X',5,'F',6,'L',7,'N',8,'S',9,'V',10)) LOOP
               REC_COUNT := REC_COUNT + 1;
               if REC_COUNT < 254 then               
                    LP_CHANNEL := LP_CHANNEL || REC_CHANNEL.CHANNEL;
               end if;
               LP_DATE := REC_CHANNEL.soc_sts_date;
     END LOOP;
     INSERT INTO RPT1120B_CHANNEL_new values
          (REC_SA.SUBSCRIBER_NO,
          substr(FP_CHANNEL,1

The error message has an important hint: "number precision too large"
Change REC_COUNT's type from NUMBER(3) to NUMBER and see what happens.
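To see the mechanism in isolation, here is a minimal, self-contained sketch (variable names are illustrative, not taken from the procedure above): a counter declared as NUMBER(3) raises ORA-06502 as soon as it needs a fourth digit, while an unconstrained NUMBER keeps counting.

DECLARE
     small_count NUMBER(3) := 0;   -- precision 3: holds values up to 999 only
     safe_count  NUMBER    := 0;   -- unconstrained, no precision limit
BEGIN
     FOR i IN 1 .. 1000 LOOP
          safe_count  := safe_count + 1;    -- always succeeds
          small_count := small_count + 1;   -- raises ORA-06502 when it reaches 1000
     END LOOP;
EXCEPTION
     WHEN VALUE_ERROR THEN
          DBMS_OUTPUT.PUT_LINE('ORA-06502 raised when the counter reached ' || safe_count);
END;
/

Declaring REC_COUNT as plain NUMBER (or with a wider precision) removes that ceiling.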

Similar Messages

  • ORA-01841 Error when value for date col is NULL in .dat (using SQL Loader)

    Hello Gurus,
    I have some data in a .dat file which needs to be loaded into an Oracle table, and I am using SQL*Loader to do the job. Although "NULLIF col_name=BLANKS" works for character datatypes, when the value for the date column is NULL I get an ORA-01841 error. I need the date column to be NULL for all rows that have no value for it.
    Early reply will be highly appreciated
    Farooq

    Hi,
    Maybe the problem is not with the NULLIF; the value for the date column may not be in the proper date format.
    create table:
    create table kk (empno number, ename varchar2(20), deptno number, hiredate date)
    Control file:
    LOAD DATA
    INFILE 'd:\kk\empdata.dat'
    insert into TABLE kk ( empno position (1:2) integer external,
    ename position(4:5) char NULLIF ename=BLANKS,
    deptno position (7:8) integer external NULLIF deptno=BLANKS,
    hiredate position (10:20) date NULLIF hiredate=BLANKS)
    data file:
    10 KK 01-jan-2005
    20 10
    SELECT * FROM KK;
    EMPNO ENAME DEPTNO HIREDATE
    10 KK 01-JAN-05
    20 10
    Verify the data file.
    Hope it will help
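    If the dates in the .dat file do not match the session's default date format, an explicit mask on the date field avoids ORA-01841 while NULLIF still blanks out the empty rows. A sketch based on the control file above (the "DD-MON-YYYY" mask is only an assumption about the data file's layout; adjust it to the actual format):
    LOAD DATA
    INFILE 'd:\kk\empdata.dat'
    INSERT INTO TABLE kk
    ( empno    POSITION(1:2)   INTEGER EXTERNAL,
      ename    POSITION(4:5)   CHAR NULLIF ename=BLANKS,
      deptno   POSITION(7:8)   INTEGER EXTERNAL NULLIF deptno=BLANKS,
      hiredate POSITION(10:20) DATE "DD-MON-YYYY" NULLIF hiredate=BLANKS )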

  • Listener Error - character string buffer too small (ORA-06502)

    I am running into a very strange problem with the APEX listener on seemingly random pages. I can hit every page in my application just fine, but as soon as I hit this specific one, Glassfish throws an HTTP 500 error. If I click Debug, the page seems to load fine with no indications of trouble, but as soon as I turn debug back off, it goes back to the HTTP 500 error. I don't see anything special about the page that makes this happen. It is pretty simple and has 4 regions. I have noticed that if I set any 1 of the 4 regions to "Never display", the page loads fine. It's like having all 4 of them enabled at once are causing some overload, even though it's actually a smaller amount of data than most of my other pages.
    Update: I just discovered this only happens if I'm logged into the workspace first and then try to run the page! If I log out of APEX and then hit the application as a normal user, the page loads error-free. This is still an annoying problem, but at least it seems I have a decent workaround, since regular users never see it.
    After a couple days, I noticed the exact same problem on a second APEX application I'm using. Again, it is on some random page whereas all the other pages work fine.
    Both applications and pages in question worked without issue in APEX 3.x. I am trying to get an APEX application up and running on the latest version.
    I have tried both Glassfish server and simply downloading the latest listener (version 1.1.1) and running it in standalone mode. I get the problem both ways, which is why it seems it's a listener issue.
    I have seen a few other threads of people having this problem, but I never did find anyone with a solution, and most of the posts stopped back in December.
    Some details on my environment:
    Database version: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bi
    APEX version: 4.0.2
    Webserver: Glassfish 3.1
    Here is the log entry from Glassfish when the HTTP 500 error displays as I try to load one of the bugged pages.
    [#|2011-05-10T21:14:22.967-0500|INFO|oracle-glassfish3.1|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=111;_ThreadName=Thread-1;|MaxConnectionReuseCount=50000|#]
    [#|2011-05-10T21:14:46.431-0500|SEVERE|oracle-glassfish3.1|javax.enterprise.system.std.com.sun.enterprise.server.logging|_ThreadID=112;_ThreadName=Thread-1;|
    ***********ERROR***********
    init: # headers=46
    declare nm owa.vc_arr := ?;
    vl owa.vc_arr := ?;
    begin
    owa.init_cgi_env( ?, nm, vl );
    htp.init; htp.HTBUF_LEN := 63;
    ? := sys_context('USERENV','SID');
    end;
    SID:1242
    CALL:
    begin
    f(p=>?);
    commit;
    end;
    BINDS
    p:100:2:220529248574492::NOPAGE CALL:
    declare
    nlns number := 999999;
    l_clob CLOB;
    lines htp.htbuf_arr;
    l_buff varchar2(32767);
    l_clob_init boolean:= false;
    l_file varchar2(5);
    l_doc_info varchar2(1000);
    begin
    OWA.GET_PAGE(lines, nlns);
    if (nlns > 1) then
    for i in 1..nlns loop
    if ( length(lines(i)) > 0 ) then
    if ( ( lengthb(l_buff) + lengthb(lines(i))) > 32767) then
    if (NOT l_clob_init) then
    dbms_lob.createtemporary(l_clob, TRUE);
    dbms_lob.open(l_clob, dbms_lob.lob_readwrite);
    l_clob_init:=true;
    end if;
    dbms_lob.writeappend(l_clob,length(l_buff),l_buff);
    l_buff := lines(i);
    else
    l_buff := l_buff || lines(i);
    end if;
    end if;
    end loop;
    end if;
    if (l_clob_init) then
    dbms_lob.writeappend(l_clob,length(l_buff),l_buff);
    l_buff := '';
    end if;
    ? := l_clob;
    ? := l_buff;
    if (wpg_docload.is_file_download) then l_file:='TRUE'; wpg_docload.get_download_file(l_doc_info); else l_file := 'FALSE'; end if; ? := l_file;
    ? := l_doc_info;
    end;
    get_page FAILED:ORA-06502: PL/SQL: numeric or value error: character string buffer too small
    ORA-06512: at line 33
    Edited by: BrianB on May 11, 2011 7:50 AM
    Edited by: BrianB on May 11, 2011 8:01 AM

    Brian,
    this post is going to get a bit longer, so the summary comes first.
    h6. Summary
    1. I could reproduce the problem on my system using the APEX Listener in standalone mode.
    2. I don't think the problem is content-related in the sense that you have any issue in your page or database contents.
    3. I have a workaround for your problem.
    h6. Error message
    Having that error in my environment made me start to think. Not only could I disable some item and get the page to work, I could also add something to achieve the same effect.
    So I came to think this really is something deep down. The error message doesn't seem very helpful at first sight, but when you start to follow what's happening there, things get clearer:
    APEX generates pages dynamically, replacing substitution strings and other tokens to get the actual page definition, which then has to be read by the requesting client. The use of a VARCHAR2 buffer introduces a limit of 32767 bytes, beyond which the content is handled as a CLOB instead.
    h6. Analysis
    Obviously, there are cases where the "estimation" fails. Of course, this may only be relevant in rare cases, because:
    1. If a page would exceed the maximum without some charset interpretation problem, the buffer would be switched to a CLOB.
    2. If a page stays small enough to remain below 32767 even with some characters that are actually larger than expected, the buffer isn't busted.
    To find out if yours could be one of these rare cases, I investigated the HTTP headers, focusing on X-DB-Content-length, and made an odd observation.
    Test case 1: "Go" button disabled, so the page runs fine with APEX Listener
    1. It has *31968 bytes* when coming from APEX Listener with a Go button disabled.
    2. It has *31938 bytes* according to the header set by EPG - for exactly the same page.
    That makes a difference of 30 bytes for what is expected to be the same contents.
    Test case 2: The button is enabled again
    1. This causes the page load to crash in APEX Listener.
    2. EPG transports *32341 bytes* according to that header.
    So we are pretty close to the hard limit for the VARCHAR2 buffer.
    For some reason, APEX Listener seems to cause a false calculation of the actual page size. Whether this is due to some charset problem or due to some other problem with response handling, I don't know. The 30 bytes difference may result from the odd header "X-ORACLE-IGNORE" with value "IGNORE, IGNORE, IGNORE, IGNORE" sent by the APEX Listener. This value has exactly 30 bytes in length, but this could be coincidence, as there are more differences in headers. If I add the size of all headers, we are even closer to the buffer limit and probably exceed it when some items need more bytes than expected.
    This could even be as simple as a line break, as your page has about 424 lines when I disabled the button... Adding 1 byte per line to the 32341 bytes of the EPG, I get 32765 bytes. Now add that button (403 bytes difference on EPG) and you exceed the limit. Reduce that value by the line count again and you are still below.
    Could be coincidence as well, but makes me wonder.
    h6. Workaround
    To make sure that we were actually hitting that limit, I now introduce my suggestion for a workaround.
    Test case 3: Add a hidden item
    1. APEX Listener loads the page, stating the size to be *32876 bytes*
    2. EPG sees 30 bytes less and transmits the header with *32846 bytes*
    h6. Conclusion
    I can't give you a real solution for that problem, nor do I have a definitive answer on what is the root cause for it. It seems, only one of the developers may find it. But I can offer you a workaround, which is to just add some hidden item to your page so it exceeds the limit for the VARCHAR2 buffer and gets handled as clob.
    Note that this may occur outside the app builder as well. The app builder just renders some additional items when starting that page, so it has a different size from its productive representation. On the other hand, I may start counting the size of that additional page section - I wouldn't be surprised if that results in a value around 400 bytes, and this is the forgotten part...
    Unfortunately, if it actually is happening outside of the app builder, this workaround isn't very handy: Dynamic contents can't be calculated that easy all the time, so you may have cases where you just don't know in advance if you are close to the limit and have to add some item to exceed it or if you've already exceeded it or if you are far below, or close enough to actually hit it when adding just one byte...
    -Udo
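    As a side note on the charset angle discussed above (an illustrative query, not from the original thread): byte length and character length diverge as soon as multibyte characters are involved, which is exactly the kind of thing that lets a size estimate drift by a few dozen bytes when a page sits close to the 32767-byte limit.
    -- LENGTH counts characters, LENGTHB counts bytes;
    -- in an AL32UTF8 database the two multibyte characters below take 2 bytes each.
    SELECT LENGTH('Grüße')  AS char_count,   -- 5
           LENGTHB('Grüße') AS byte_count    -- 7
    FROM   dual;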

  • The value returned from the load function is not of type numeric  errors after migration to Coldfusion 11

    I am currently testing our website with CF11. It is currently working with CF8 however after migrating it to a new server running CF11 I have encountered the following error.
    The value returned from the load function is not of type numeric.
    The error occurred in
    D:/Applications/CFusion/CustomTags/nec/com/objects/address.cfc: line 263
    Called from D:/Applications/CFusion/CustomTags/nec/com/objects/contact.cfc: line 331
    Called from D:/Applications/CFusion/CustomTags/nec/com/objects/user.cfc: line 510
    Called from D:/Applications/CFusion/CustomTags/nec/com/objects/user.cfc: line 1675
    Called from D:/website/NECPhase2/action.validate.cfm: line 54
    261 : <cfif isNumeric(get.idCountry)>
    262 : <cfset rc = this.objCountry.setID(get.idCountry)>
    263 : <cfset rc = this.objCountry.load()>
    264 : </cfif>
    265 : <cfset this.sPostcode = get.sPostcode>
    Have there been any changes between CF8 and CF11 that could  cause this error?
    Does anyone have ideas?

    This is the code in file object file country.cfc (nec.com.objects.country):
    <cfcomponent displayname="Country object" hint="This is a Country object, it allows you to access and set values in the Country.">
    <!---
    // Construct this object
    --->
    <cfset this.objFunctions = CreateObject( 'component', 'nec.com.system.functions' )>
    <cfscript>
      this.idCountryID = 0;
      this.sCountryName = "";
      this.sISOCode = "";
      this.sDHLCode = "";
      this.iErrorID = "";
    </cfscript>
    <!---
    // The following functions are the setters and getters. offering us a better way to get
    // at the contents of the object
    --->
    <!---
    // Getters
    --->
    <cffunction name="getID" displayname="Get ID" returntype="numeric" output="false" hint="This returns the ID of the current item.">
      <cfreturn this.idCountryID>
    </cffunction>
    <cffunction name="getsCountryName" displayname="Get sCountryName" returntype="string" output="false" hint="This gets the sCountryName value of this item.">
      <cfreturn this.sCountryName>
    </cffunction>
    <cffunction name="getsISOCode" displayname="Get sISOCode" returntype="string" output="false" hint="This gets the sISOCode value of this item.">
      <cfreturn this.sISOCode>
    </cffunction>
    <cffunction name="getsDHLCode" displayname="Get sDHLCode" returntype="string" output="false" hint="This gets the sDHLCode value of this item.">
      <cfreturn this.sDHLCode>
    </cffunction>
    <cffunction name="iError" displayname="Get iError" returntype="numeric" output="false" hint="This returns the iError of the current item.">
      <cfreturn this.iError>
    </cffunction>
    <!---
    // Setters
    --->
    <cffunction name="setID" displayname="Set ID" returntype="boolean" output="false" hint="This sets the ID value of this item.">
      <cfargument name="idCountryID" required="true" type="numeric" displayname="ID" hint="The ID to use.">
      <cfset this.idCountryID = arguments.idCountryID>
      <cfreturn true>
    </cffunction>
    <cffunction name="setsCountryName" displayname="Set sCountryName" returntype="boolean" output="false" hint="This sets the sCountryName value of this item.">
      <cfargument name="sCountryName" required="true" type="string" displayname="sCountryName" hint="The sCountryName to use.">
      <cfset this.sCountryName = arguments.sCountryName>
      <cfreturn true>
    </cffunction>
    <cffunction name="setsISOCode" displayname="Set sISOCode" returntype="boolean" output="false" hint="This sets the sISOCode value of this item.">
      <cfargument name="sISOCode" required="true" type="string" displayname="sISOCode" hint="The sISOCode to use.">
      <cfset this.sISOCode = arguments.sISOCode>
      <cfreturn true>
    </cffunction>
    <cffunction name="setsDHLCode" displayname="Set sDHLCode" returntype="boolean" output="false" hint="This sets the sDHLCode value of this item.">
      <cfargument name="sDHLCode" required="true" type="string" displayname="sDHLCode" hint="The sDHLCode to use.">
      <cfset this.sDHLCode = arguments.sDHLCode>
      <cfreturn true>
    </cffunction>
    <!---
    // Clear, to empty out the contents of this object
    --->
    <cffunction name="clear" displayname="Clear items Details" returntype="boolean" output="false" hint="Clears out all of the items details.">
      <cfscript>
       this.sCountryName = "";
       this.sISOCode = "";
       this.sDHLCode = "";
       this.iErrorID = "";
      </cfscript>
      <cfreturn true>
    </cffunction>
    <!---
    // The following functions deal with the load, save and deleting of objects
    --->
    <!---
    // Load
    --->
    <cffunction name="load" displayname="Load items details" returntype="numeric" output="false" hint="This loads in all the information about an item.">
      <cfset rc = this.clear()>
      <!---
      // First of all we need to get the name of the data source we are going to be using
      --->
      <cfscript>
      objDS = CreateObject("component","nec.com.system.settings");
      sDatasource = objDS.getDatasource();
    </cfscript>
      <!---
      // Check to see if it exists
      --->
      <cftry>
       <cfquery name="checkID" datasource="#sDatasource#">
        SELECT idCountryID
        FROM tblCountry
        WHERE idCountryID = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("load: checkID: '#this.idCountryID#' #cfcatch.detail#");
        </cfscript>
        <cfset this.iErrorID = iErrorID>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfif not checkID.recordCount>
       <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         if(isDefined("session.afr")){
          whichOne = "#session.afr.getsAFRNumber()#";
         } else {
          whichOne = "";
         iErrorID = objError.addError("A Country with that id doesn't exists.[#this.idCountryID#][#whichOne#]");
        </cfscript>
       <cfset this.iErrorID = iErrorID>
       <cfreturn iErrorID>
      </cfif>
      <!---
      // If we got past all then then load in the details
      --->
      <cftry>
       <cfquery name="get" datasource="#sDatasource#">
        SELECT idCountryID, RTRIM(sCountryName) as sCountryName, RTRIM(sISOCode) as sISOCode, RTRIM(sDHLCode) as sDHLCode
        FROM tblCountry
        WHERE idCountryID = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("load: get: #cfcatch.detail#");
        </cfscript>
        <cfset this.iErrorID = iErrorID>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfset this.idCountryID = get.idCountryID>
      <cfset this.sCountryName = get.sCountryName>
      <cfset this.sISOCode = get.sISOCode>
      <cfset this.sDHLCode = get.sDHLCode>
      <cfset this.iErrorID = "">
      <cfreturn true>
    </cffunction>
    <!---
    // Save
    --->
    <cffunction name="save" displayname="Save items Details" returntype="numeric" output="false" hint="Saves (to some source) the current details for the ID of the item.">
      <!---
      // First of all we need to get the name of the data source we are going to be using
      --->
      <cfscript>
      objDS = CreateObject("component","nec.com.system.settings");
      sDatasource = objDS.getDatasource();
    </cfscript>
      <!---
      // Now check to see if ithat ID exists
      --->
      <cftry>
       <cfquery name="checkID" datasource="#sDatasource#">
        SELECT idCountryID
        FROM tblCountry
        WHERE idCountryID = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("save: checkID: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <!---
      // If it doesn't exist, then add the record, otherwise update the record
      --->
      <cfif not checkID.recordCount>
       <cfreturn this.add()>
      <cfelse>
       <cfreturn this.update()>
      </cfif>
    </cffunction>
    <!---
    // Add
    --->
    <cffunction name="add" displayname="Add Country" returntype="numeric" output="false" hint="This adds a Country.">
      <!---
      // Check to see if that a different item isn't already using the same unique details
      --->
      <cftry>
       <cfquery name="checkUnique" datasource="#sDatasource#">
        SELECT idCountryID
        FROM tblCountry
        WHERE sCountryName = '#this.objFunctions.scrubText(this.sCountryName)#'
        OR sISOCOde = '#this.objFunctions.scrubText(this.sISOcode)#'
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("add: checkUnique: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfif checkUnique.recordCount>
       <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("A Country with that name or ISO code already exists. idCountryID=#checkUnique.idCountryID#");
        </cfscript>
       <cfreturn iErrorID>
      </cfif>
      <cftry>
       <cfquery name="add" datasource="#sDatasource#">
        SET nocount on
        INSERT INTO tblCountry(sCountryName, sISOCode, sDHLCode)
    VALUES('#this.objFunctions.scrubText(this.sCountryName)#','#this.objFunctions.scrubText(this.sISOCode)#','#this.objFunctions.scrubText(this.sDHLCode)#')
        SELECT @@identity as autoID
        SET nocount off  
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("add: add: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfreturn add.autoID>
    </cffunction>
    <!---
    // Update
    --->
    <cffunction name="update" displayname="Update Country" returntype="numeric" output="false" hint="This updates a Country record.">
      <!---
      // Check to see if that a different item isn't already using the same unique details
      --->
      <cftry>
       <cfquery name="checkUnique" datasource="#sDatasource#">
        SELECT idCountryID
        FROM tblCountry
        WHERE (sCountryName = '#this.objFunctions.scrubText(this.sCountryName)#'
        OR sISOCOde = '#this.objFunctions.scrubText(this.sISOcode)#')
        AND idCountryID <> #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("update: checkUnique: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfif checkUnique.recordCount>
       <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("Another Country with that name already exists. idCountryID=#checkUnique.idCountryID#");
        </cfscript>
       <cfreturn iErrorID>
      </cfif>
      <!---
      // Attempt to update the record to the datasource
      // if this fails for any reason then we submit an error message
      // to the error component and return the ID of the error
      --->
      <cftry>
       <cfquery name="update" datasource="#sDatasource#">
        UPDATE tblCountry
        SET sCountryName = '#this.objFunctions.scrubText(this.sCountryName)#',
        sISOCode = '#this.objFunctions.scrubText(this.sISOCode)#',
        sDHLCode = '#this.objFunctions.scrubText(this.sDHLCode)#'
        WHERE idCountryID = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("update: update: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfreturn this.idCountryID> 
    </cffunction>
    <!---
    // Delete
    --->
    <cffunction name="delete" displayname="Delete Country" returntype="numeric" output="false" hint="This deletes a Country record.">
      <!---
      // First of all we need to get the name of the data source we are going to be using
      --->
      <cfscript>
      objDS = CreateObject("component","nec.com.system.settings");
      sDatasource = objDS.getDatasource();
    </cfscript>
      <!---
      // Now check to see if ithat ID exists
      --->
      <cftry>
       <cfquery name="checkID" datasource="#sDatasource#">
        SELECT idCountryID
        FROM tblCountry
        WHERE idCountryID = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("delete: checkID: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfif not checkID.recordCount>
       <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("A Country with that id doesn't exists, delete failed.");
        </cfscript>
       <cfreturn iErrorID>
      </cfif>
      <!---
      // Now check to see if there are any dependancies, if so we can't delete the item
      --->
      <cftry>
       <cfquery name="checkDependancies" datasource="#sDatasource#">
        SELECT idCountry
        FROM tblAddress
        WHERE idCountry = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("delete: checkDependancies: idCountry: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfif checkDependancies.recordCount>
       <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("That Country is being used by an address, delete failed.");
        </cfscript>
       <cfreturn iErrorID>
      </cfif>
      <!---
      // Now attempt to remove the record.
      // if this fails for any reason then we submit an error message
      // to the error component and return the ID of the error
      --->
      <cftry>
       <cfquery name="delete" datasource="#sDatasource#">
        DELETE FROM tblCountry
        WHERE idCountryID = #this.idCountryID#
       </cfquery>
       <cfcatch>
        <cfscript>
         objError = CreateObject("component","nec.com.system.errors");
         iErrorID = objError.addError("delete: delete: #cfcatch.detail#");
        </cfscript>
        <cfreturn iErrorID>
       </cfcatch>
      </cftry>
      <cfreturn this.idCountryID>
    </cffunction>
    </cfcomponent>

  • Error showing in Login as : ORA-00604:error occured at recursive SQL level

    Hi,
    I am facing a problem logging in to the user TEST:
    I have created a trigger and built it in the SYS environment:
    CREATE OR REPLACE TRIGGER TEST_LOGON
    AFTER LOGON ON TEST.SCHEMA
    DECLARE
    num INTEGER;
    v_grant VARCHAR2(32767);
    l_username VARCHAR2(30) := 'TEST';
    BEGIN
    IF USER=l_username THEN
    num:=0;
    FOR obj IN (SELECT TABLE_NAME FROM DBA_TABLES
    WHERE TABLE_NAME LIKE 'BS_%') LOOP
    v_grant:='GRANT ALL ON '||obj.TABLE_NAME || ' TO ' || USER;
    EXECUTE IMMEDIATE 'GRANT ALL ON' || obj.TABLE_NAME || ' TO ' || USER;
    num := num + 1;
    END LOOP;
    END IF;
    END;
    The CREATE TRIGGER statement executed successfully in the SYS environment.
    But when I log in as user TEST, it shows the error:
    ORA-00604:error occured at recursive SQL level 1
    ORA-00990:missing or invalid privilege
    ORA-06512:at line 15
    Any help would be appreciated.
    Thanks and Regards

    user598986 wrote:
    Now it's giving an error:
    ORA-00942: table or view does not exist
    Well, first of all, dynamic grants are not a good idea. Secondly, it is not a good idea to create objects in the SYS schema. Now about your trigger. A trigger is always created with definer rights, so in your case the trigger was created by SYS and will be executed on behalf of SYS. And since the trigger is created on TEST.SCHEMA, it will be called only when user TEST logs in. So there is no need for:
    l_username VARCHAR2(30) := 'TEST';
    BEGIN
    IF USER=l_username THEN
    Now, you think the FOR loop selects tables that start with BS_. Keep in mind that _ is a wildcard for LIKE - it matches any single character - so the FOR loop will also select tables that start with BSA, for example. But this is not all. The FOR loop selects matching tables in the whole database, so the tables can belong to any user. At the same time, EXECUTE IMMEDIATE does not specify the table owner. Therefore, since the trigger is owned by SYS and, as I already mentioned, executes on behalf of SYS, the table owner in the GRANT statement will default to SYS, not to the actual table owner. That is why you get ORA-00942.
    Now the "bad" part. Even if you fix it and provide both the owner and the table name, it will still fail with ORA-30511: invalid DDL operation in system triggers. Why? Check the ORA-30511 details:
    ORA-30511: invalid DDL operation in system triggers
    Cause: An attempt was made to perform an invalid DDL operation in a system trigger. Most DDL operations currently are not supported in system triggers. The only currently supported DDL operations are table operations and ALTER COMPILE operations.
    Action: Remove invalid DDL operations in system triggers.
    SY.
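    As an aside on the wildcard and owner points above (a sketch only, not part of SY's reply): escaping the underscore and qualifying the owner would look roughly like the block below. Note that running DDL from a logon trigger still raises ORA-30511, so a grant loop like this belongs in a standalone script or a scheduler job instead.
    BEGIN
      FOR obj IN (SELECT owner, table_name
                    FROM dba_tables
                   WHERE table_name LIKE 'BS\_%' ESCAPE '\') LOOP
        -- Owner-qualified, so the grant no longer defaults to SYS-owned objects
        EXECUTE IMMEDIATE 'GRANT ALL ON ' || obj.owner || '.' || obj.table_name || ' TO TEST';
      END LOOP;
    END;
    /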

  • Cannot Update Tabular form Values Error in mru internal routine: ORA-20001:

    I have created some master-detail records through a manual tabular form. I am getting the following error when I try to update the values through another built-in tabular form:
    Error in mru internal routine: ORA-20001: Error in MRU: row= 1, ORA-20001: ORA-20001: Current version of data in database has changed since user initiated update process. current checksum = "9067F4C5EF14529F831CB42B5567C288", item checksum = "07865E78639EB6477FB5DFB8B02EA047".,

    Hi
    Hopefully my response to your "Error in mru internal routine: ORA-20001: no data found in tabular form" thread will help. In this thread's example, there is no error message that displays the required column names, but the principle would be the same: make sure that all fields drawn from the table are included as editable or hidden columns on the report (so that there is one instance of each field shown with a tick in the Edit column on the report's Report Attributes page).
    Andy

  • File: e:\pt849-905-R1-retail\peopletools\SRC\PSRED\psred.cppSQL error. Stmt #: 1849  Error Position: 24  Return: 3106 - ORA-03106: fatal two-task communication protocol error   Failed SQL stmt:SELECT PROJECTNAME FROM PSPROJECTDEFN ORDER

    File:
    e:\pt849-905-R1-retail\peopletools\SRC\PSRED\psred.cppSQL error. Stmt #:
    1849  Error Position: 24  Return: 3106 - ORA-03106: fatal two-task
    communication protocol error
    Failed SQL stmt:SELECT PROJECTNAME FROM PSPROJECTDEFN ORDER
    BY PROJECTNAME
    I got this error when opening PeopleTools Application Designer 8.49. It works fine on the server itself, but not from the client's machine.
    We are still able to connect to the database and the URL.
    Please help by shedding some light on this.
    Thanks,
    Sarathy K

    Looks like a SQL error. Are you able to connect to the database with SQL*Plus? Probably the Oracle client is badly configured.
    Nicolas.

  • Error while passing parameter(quoted string parameter ) to sql script

    Hi all,
    I have a master script, insert_attribute_single.sql, which takes 6 parameters. When I run it at the SQL prompt:
    SQL>@@INSERT_ATTRIBUTE_SINGLE.SQL 'LEED PROJECT START DATE' 7 'N' 27265185 '7'22'008' NULL;
    it gives an error for the 5th parameter, and I need to pass the value '7'22'008' as that parameter.
    In the master script it raises ORA-06550 here:
    dbms_output.put_line('Processing attribute : &1 Project : &4 Char value : &5 Numeric Value : &6 ' ) ;
    Can you please help me resolve this issue with single quotes in the string?
    Thanks in advance.
    regards
    shyam~

    Here is my sql file:
    declare
    a_Var VARCHAR2(10) := '&1';
    begin
    dbms_output.put_line(a_var||','||'&2');
    end;
    /
    Here is how I am calling the SQL file, with parameter values containing quotes within them:
    SQL> @@d:\a.sql '12''''23''''23' '123'
    old   2: a_Var VARCHAR2(10) := '&1';
    new   2: a_Var VARCHAR2(10) := '12''23''23';
    old   4: dbms_output.put_line(a_var||','||'&2');
    new   4: dbms_output.put_line(a_var||','||'123');
    12'23'23,123
    PL/SQL procedure successfully completed.
    SQL>

  • Error while executing the sp ORA-21779: duration not active

    Hi there,
    I am using Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 and am facing a peculiar error.
    Please find the steps below to reproduce it.
    Basically, I pass a comma-separated list of values and the function returns a piped table with each separated value as a new record.
    ex :1,2,3,4
    1
    2
    3
    4
    Types created:
    1) Create Type TPOBJ_Return as Object (tnames varchar2 (2000));
    2) Create Type TPObjT_ReturnColl as Table Of TPOBJ_Return;
    Function created:
    CREATE OR REPLACE FUNCTION WB_FN_ReturnTable
    tNameString IN VARCHAR2
    RETURN TPObjT_ReturnColl
    PIPELINED
    AS
    iOptionSel INT;
    tOptionSel VARCHAR2 (9);
    iLen INT;
    tName VARCHAR2 (50);
    tTempChar CHAR (1);
    ptNameString VARCHAR2(2000);
    BEGIN
    ptNameString:=tNameString;
    iLen := LENGTH(TRIM(ptNameString));
    iOptionSel := 1;
    tName := '';
    WHILE iOptionSel <= iLen
    LOOP
    tTempChar := SUBSTR(ptNameString, iOptionSel, 1);
    IF tTempChar = ',' THEN
    IF LENGTH(TRIM(tName)) > 0 THEN
    PIPE ROW(TPOBJ_Return(tName));
    END IF;
    tName := '';
    ELSE
    tName := tName || tTempChar;
    END IF;
    iOptionSel := iOptionSel + 1;
    END LOOP;
    IF LENGTH(TRIM(tName)) > 0 THEN
    PIPE ROW(TPOBJ_Return(tName));
    END IF;
    return;
    END;
    Table created:
    Create Table test (id number(16))
    Insert into test values (1)
    Please insert from 1 to 10.
    Stored procedure created:
    Create or replace procedure Sptest
    As
    Titems Varchar2(255);
    pvalue Number(16);
    Begin
    Titems :='5,4,3';
    Select MIN(id) into pvalue from test where id not in
    (select tnames from table(WB_FN_ReturnTable(Titems )));
    End;
    Note:
    While executing the SP for the first time I do not get any error;
    only when making repeated calls to it do I get the errors specified below:
    ORA-21779: duration not active
    ORA-03113: end-of-file on communication channel
    ORA-03114: not connected to ORACLE
    Can anyone help me with this issue?

    Why a pipelined table function? I would not say that a tokeniser function is something that typically requires working in the SQL engine, piping rows. It can be a very straightforward PL/SQL function that returns a collection of strings.
    E.g.
    SQL> CREATE OR REPLACE function tokenise( line varchar2, separator varchar2 DEFAULT ',' ) return TStrings AUTHID CURRENT_USER is
    2 strList TStrings;
    3 str varchar2(4000);
    4 i integer;
    5 l integer;
    6
    7 procedure AddString( s varchar2 ) is
    8 begin
    9 strList.Extend(1);
    10 strList( strList.Count ) := s;
    11 end;
    12
    13 begin
    14 strList := new TStrings();
    15
    16 str := line;
    17 loop
    18 l := LENGTH( str );
    19 i := INSTR( str, separator );
    20
    21 if i = 0 then
    22 AddString( str );
    23 else
    24 AddString( SUBSTR( str, 1, i-1 ) );
    25 str := SUBSTR( str, i+1 );
    26 end if;
    27
    28 -- if the separator was on the last char of the line, there is
    29 -- a trailing null column which we need to add manually
    30 if i = l then
    31 AddString( null );
    32 end if;
    33
    34 exit when str is NULL;
    35 exit when i = 0;
    36 end loop;
    37
    38 return( strList );
    39 end;
    40 /
    Function created.
    SQL>
    SQL> select * from TABLE(Tokenise('col1,col2,col3,,col5,and so on'));
    COLUMN_VALUE
    col1
    col2
    col3
    col5
    and so on
    6 rows selected.
    SQL>
    PS. The TStrings SQL user type is declared as a table of varchar2(4000).
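    For completeness, that declaration would look like the one-liner below (inferred from the PS above, not shown in the original reply):
    CREATE OR REPLACE TYPE TStrings AS TABLE OF VARCHAR2(4000);
    /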

  • ORA-48108: invalid value given for the diagnostic_dest init.ora parameter

    Hi All,
    I am trying to start my oracle 11g database on windows 7 PC and i am getting below exception
    SQL> startup mount
    ORA-48108: invalid value given for the diagnostic_dest init.ora parameter
    ORA-48140: the specified ADR Base directory does not exist [d:\oracle\app\product\11.2.0\dbhome_1\database\<oracle_base>]
    ORA-48187: specified directory does not exist
    OSD-00002: additional error information
    O/S-Error: (OS 123) The filename, directory name, or volume label syntax is incorrect.
    SQL>
    Earlier it was working fine. For learning purposes, I created an spfile from the pfile, and after that I got this issue.
    Please help.
    Regards,
    Sunil

    sunil907 wrote:
    Hi,
    I have provided the diagnostic_dest folder location (a physical path). Now I am getting a different kind of error on startup.
    SQL> startup
    ORACLE instance started.
    Total System Global Area 1068937216 bytes
    Fixed Size                  2182592 bytes
    Variable Size             616563264 bytes
    Database Buffers          444596224 bytes
    Redo Buffers                5595136 bytes
    ORA-00205: error in identifying control file, check alert log for more info
    Please help
    What does your own research of 'ORA-00205' indicate?
    The text of the error message is pretty self-explanatory: it couldn't find the control file.
    The control files are specified by the "control_files" initialization parameter. When you get this error, the instance has started but was unable to mount the control file. Since the init file (spfile) was processed and the instance started, you can easily see what it thinks the control files are.
    oracle:fubar$ sqlplus / as sysdba
    SQL*Plus: Release 11.2.0.1.0 Production on Tue Jul 16 12:51:37 2013
    Copyright (c) 1982, 2009, Oracle.  All rights reserved.
    Connected to an idle instance.
    SQL> startup
    ORACLE instance started.
    Total System Global Area  835104768 bytes
    Fixed Size                  2217952 bytes
    Variable Size             490735648 bytes
    Database Buffers          339738624 bytes
    Redo Buffers                2412544 bytes
    ORA-00205: error in identifying control file, check alert log for more info
    SQL> show parameter control
    NAME                                 TYPE        VALUE
    control_file_record_keep_time        integer     7
    control_files                        string      /u01/app/oracle/oradata/FUBAR/
                                                     controlfile/o1_mf_8ybx4t7w_.ct
                                                     x, /u01/app/oracle/flash_recov
                                                     ery_area/FUBAR/controlfile/o1_
                                                     mf_8ybx4tom_.ctl
    control_management_pack_access       string      NONE
    SQL>
    So what did you do in fixing your original problem that caused your control_files parameter to go south?
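    A minimal recovery sketch, with placeholder paths (adjust them to the real locations on this PC), assuming the bad values came from the hand-edited pfile: correct the pfile first, rebuild the spfile while the instance is down, then verify the control file locations after startup.
    -- All paths below are placeholders, not taken from this thread.
    -- 1. Edit the pfile so diagnostic_dest and control_files point at
    --    directories and files that actually exist, then rebuild the spfile:
    CREATE SPFILE FROM PFILE = 'D:\oracle\app\admin\orcl\pfile\init.ora';
    -- 2. Start the instance and confirm what it now thinks the control files are:
    STARTUP
    SHOW PARAMETER control_files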

  • Error in mru internal routine: ORA-20001: Error in MRU: row= 1, ORA-01400:

    my application on apex.oracle.com.
    http://apex.oracle.com/pls/otn/f?p=19391:1:931174149200985:::::
    login : guest pwd : 123
    To get the error
    1. click on edit button on daily entry.
    2. click on add button on daily_entry_detail
    3. enter party name
    4. enter qty and click on add row to add a 2nd row
    The following error appears:
    Error in mru internal routine: ORA-20001: Error in MRU: row= 1, ORA-01400: cannot insert NULL into ("PTEST"."DAILY_ENTRY_DETAILS"."ENTRY_DATE"), insert into "PTEST"."DAILY_ENTRY_DETAILS" ( "ENTRY_DATE", "PARTY_NAME", "QTY") values ( :b1, :b2, :b3)     Error     Unable to process update

    Hi
    I am going to assume that ENTRY_DATE should be the date that the user created the record - otherwise, you will have to add this field to your form to get the user to fill it in.
    You need to check how your DAILY_ENTRY_DETAILS table is set up.
    Go to SQL Workshop, Object Browser and in the list of tables, select DAILY_ENTRY_DETAILS, then click on the SQL option above the table definition and you should see something like:
    CREATE TABLE "A_LOV1"
       ("LOV1" NUMBER,
        "LOV1_NAME" VARCHAR2(100),
        CONSTRAINT "A_LOV1_PK" PRIMARY KEY ("LOV1") ENABLE
       )
    /
    CREATE OR REPLACE TRIGGER "BI_A_LOV1"
      before insert on "A_LOV1"
      for each row
    begin
      if :NEW."LOV1" is null then
        select "A_LOV1_SEQ".nextval into :NEW."LOV1" from dual;
      end if;
    end;
    /
    ALTER TRIGGER "BI_A_LOV1" ENABLE
    /
    (This is based on one of my tables, so the names etc. will be different for you.)
    In the first block, the CONSTRAINT line tells you whether or not you have a primary key on the table - in the example above, the primary key field is LOV1. If this is not set up, you must do this - click on the Constraints option at the top of the page and click Create then follow the prompts.
    The next block is a TRIGGER. This is set to run whenever a record is inserted into the table. A record being inserted is referred to as :NEW. You can see from the example that, in this case, when the record is being inserted, the value in the Primary Key is checked. If it is null, then a new sequence number is generated (see below) and the record is updated with this number before the record is saved.
    The final block just switches the trigger on
    You will see that this trigger says "select A_LOV1_SEQ.nextval...". This refers to a "sequence" object. Go back to the Object Browser and select Sequences. In my example, there is a sequence called A_LOV_SEQ which, if I select it and look at the sql, it is defined as:
    CREATE SEQUENCE "A_LOV1_SEQ"  MINVALUE 1 MAXVALUE 99999999999999999999999 INCREMENT BY 1 START WITH 1 NOCACHE NOORDER NOCYCLEThis is a counter that is used by the trigger to get the next Primary Key value for the record.
    If you don't have the sequence, you will need to set it up. This can be done by clicking the Create button at the top right and following the prompts. Or you can copy the above, change the sequence name as required and then run this in the SQL Commands window
    If you don't have the trigger, you can copy the code above and edit it to suit your table needs.
    Now for the ENTRY_DATE issue. If this is to be automatically set to the current date, you can handle this in the trigger. For example - assume I had that field on my table, then my trigger would be:
    CREATE OR REPLACE TRIGGER  "BI_A_LOV1"
      before insert on "A_LOV1"
      for each row
    begin
      if :NEW."LOV1" is null then
        select "A_LOV1_SEQ".nextval into :NEW."LOV1" from dual;
      end if;
      if :NEW."ENTRY_DATE" is null then
        :NEW."ENTRY_DATE" := SYDATE;
      end if;
    end;
    /
    ALTER TRIGGER "BI_A_LOV1" ENABLE
    /
    SYSDATE just means the current date/time.
    The CREATE and ALTER statements must be run separately in the SQL Commands window
    Once you have that set up correctly, your form should work
    Andy

  • How to fix error in mru internal routine: ORA-20001?

    Hi,
    I created a tabular form to manage resources.
    But I'm not able to add new resource.
    There is a sequence as below.
    CREATE SEQUENCE "PM_PERSON_SEQ" MINVALUE 1 MAXVALUE 999999999999999999999999999 INCREMENT BY 1 START WITH 21 CACHE 20 NOORDER NOCYCLE /
    There is a trigger as below.
    create or replace trigger "BI_PM_PERSON"
    before insert on "PM_PERSON"
    for each row
    begin
    if :NEW."ID" is null then
    select "PM_PERSON_SEQ".nextval into :NEW."ID" from dual;
    end if;
    end;
    Whenever I try to add new resource with his name and email id, I get an error message as below.
    "Error in mru internal routine: ORA-20001: Error in MRU: row= 1, ORA-00001: unique constraint (SDS.PM_PERSON_PK) violated, insert into "SDS"."PM_PERSON" ( "ID", "NAME", "EMAIL_ID") values ( :b1, :b2, :b3)"
    I made a change as follows:
    Processes: ApplyMRU > Condition Type: Value of Item in Expression 1 Is NOT NULL
    This suppressed the error message, but it still did not add the new resource.
    How can I solve this problem?
    Thanks,
    Guy
    h5. FYI, I'm very new to SQL, PL/SQL and APEX. Would appreciate little more explanation and full path (eg. Shared Components > Edit Security Attributes > VPD)

    Hi Guy,
    You seem to have everything set up.
    Have you checked whether there's an overlap between the current data in the table and PM_PERSON_SEQ's next value?
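    A quick way to check for such an overlap, and one way to move the sequence past the existing keys (object names are taken from your post; recreating the sequence is just one of several options):
    -- Compare the highest key already in the table ...
    SELECT MAX(id) FROM PM_PERSON;
    -- ... with the value the trigger would use next.
    SELECT PM_PERSON_SEQ.NEXTVAL FROM dual;
    -- If MAX(id) >= NEXTVAL, recreate the sequence above the existing data:
    DROP SEQUENCE PM_PERSON_SEQ;
    CREATE SEQUENCE PM_PERSON_SEQ
      START WITH 1001   -- any value greater than MAX(id)
      INCREMENT BY 1 CACHE 20 NOORDER NOCYCLE;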
    Cheers,

  • Error in asfn(rs[[i]]) : need explicit units for numeric conversion

    Hi
    I am using the sqldf package in R to do some joins of my data table as follows:
    p1<-sqldf("SELECT t3.* FROM p as t3 JOIN (SELECT userid, MAX(type) as LastActivityType FROM p GROUP BY userid) t4 ON t3.userid = t4.userid AND t3.type = t4.LastActivityType")
    However, I am getting the error mentioned in the subject line.
    The error doesn't appear when I change the column names from t3.* and limit them to a subset of column names (t3.userid, t3.LastActivity...).
    Surprisingly though, I have another similar sql script in the same code block which runs just fine.
    My code runs fine in RStudio, so I guess there's some incompatibility with the Azure environment somewhere. For reference, the sqldf package and its dependencies (imported separately as a zip) are compatible with R 3.1.2.
    Can someone help me solve this? 
    Thanks in advance!!

    Hi AK
    Thanks for the quick response 
    So I checked the following:
    1) Downloaded the dataset from AML from the step just previous to where my SQL script appears, fed it into RStudio with stringsAsFactors=FALSE and then ran the SQL script. It works without any issues. It also works fine if I don't set stringsAsFactors to FALSE.
    2) Checked the CRAN documentation on the sqldf package, it mentions that the package is valid for versions>=R3.1.0
    3) All 4 dependencies were included. Only one was preinstalled. The output log says:
    warning: package 'proto' is in use and will not be installed
    so I removed the package 'proto' just in case it was causing trouble. Nothing works :/
    I am pasting the output log; could you have a look and see if I am missing anything obvious?
    Thanks a ton!
    Upasana
    Output Log:
    Record Starts at UTC 11/27/2014 07:38:27:
    Run the job:"/dll "ExecuteRScript, Version=5.1.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca;Microsoft.MetaAnalytics.RDataSupport.ExecuteRScript;Run" /Output0 "..\..\Result Dataset\Result Dataset.dataset" /Output1 "..\..\R Device\R Device.dataset" /dataset1 "..\..\Dataset1\Dataset1.csv" /bundlePath "..\..\Script Bundle\Script Bundle.zip" /rStreamReader "script.R" "
    Starting process 'C:\Resources\directory\275cc759e61f4dbf889cde5e5cba0835.SingleNodeRuntimeCompute.Packages\AFx\5.1\DllModuleHost.exe' with arguments ' /dll "ExecuteRScript, Version=5.1.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca;Microsoft.MetaAnalytics.RDataSupport.ExecuteRScript;Run" /Output0 "..\..\Result Dataset\Result Dataset.dataset" /Output1 "..\..\R Device\R Device.dataset" /dataset1 "..\..\Dataset1\Dataset1.csv" /bundlePath "..\..\Script Bundle\Script Bundle.zip" /rStreamReader "script.R" '
    [ModuleOutput] DllModuleHost Start: 1 : Program::Main
    [ModuleOutput] DllModuleHost Start: 1 : DataLabModuleDescriptionParser::ParseModuleDescriptionString
    [ModuleOutput] DllModuleHost Stop: 1 : DataLabModuleDescriptionParser::ParseModuleDescriptionString. Duration: 00:00:00.0050545
    [ModuleOutput] DllModuleHost Start: 1 : DllModuleMethod::DllModuleMethod
    [ModuleOutput] DllModuleHost Stop: 1 : DllModuleMethod::DllModuleMethod. Duration: 00:00:00.0000572
    [ModuleOutput] DllModuleHost Start: 1 : DllModuleMethod::Execute
    [ModuleOutput] DllModuleHost Start: 1 : DataLabModuleBinder::BindModuleMethod
    [ModuleOutput] DllModuleHost Verbose: 1 : moduleMethodDescription ExecuteRScript, Version=5.1.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca;Microsoft.MetaAnalytics.RDataSupport.ExecuteRScript;Run
    [ModuleOutput] DllModuleHost Verbose: 1 : assemblyFullName ExecuteRScript, Version=5.1.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca
    [ModuleOutput] DllModuleHost Start: 1 : DataLabModuleBinder::LoadModuleAssembly
    [ModuleOutput] DllModuleHost Verbose: 1 : Trying to resolve assembly : ExecuteRScript, Version=5.1.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca
    [ModuleOutput] DllModuleHost Verbose: 1 : Loaded moduleAssembly ExecuteRScript, Version=5.1.0.0, Culture=neutral, PublicKeyToken=69c3241e6f0468ca
    [ModuleOutput] DllModuleHost Stop: 1 : DataLabModuleBinder::LoadModuleAssembly. Duration: 00:00:00.0067974
    [ModuleOutput] DllModuleHost Verbose: 1 : moduleTypeName Microsoft.MetaAnalytics.RDataSupport.ExecuteRScript
    [ModuleOutput] DllModuleHost Verbose: 1 : moduleMethodName Run
    [ModuleOutput] DllModuleHost Information: 1 : Module FriendlyName : Execute R Script
    [ModuleOutput] DllModuleHost Information: 1 : Module Release Status : Release
    [ModuleOutput] DllModuleHost Stop: 1 : DataLabModuleBinder::BindModuleMethod. Duration: 00:00:00.0106972
    [ModuleOutput] DllModuleHost Start: 1 : ParameterArgumentBinder::InitializeParameterValues
    [ModuleOutput] DllModuleHost Verbose: 1 : parameterInfos count = 5
    [ModuleOutput] DllModuleHost Verbose: 1 : parameterInfos[0] name = dataset1 , type = Microsoft.Numerics.Data.Local.DataTable
    [ModuleOutput] DllModuleHost Start: 1 : DataTableCsvHandler::HandleArgumentString
    [ModuleOutput] DllModuleHost Stop: 1 : DataTableCsvHandler::HandleArgumentString. Duration: 00:00:13.1522942
    [ModuleOutput] DllModuleHost Verbose: 1 : parameterInfos[1] name = dataset2 , type = Microsoft.Numerics.Data.Local.DataTable
    [ModuleOutput] DllModuleHost Verbose: 1 : Set optional parameter dataset2 value to NULL
    [ModuleOutput] DllModuleHost Verbose: 1 : parameterInfos[2] name = bundlePath , type = System.String
    [ModuleOutput] DllModuleHost Verbose: 1 : parameterInfos[3] name = rStreamReader , type = System.IO.StreamReader
    [ModuleOutput] DllModuleHost Verbose: 1 : parameterInfos[4] name = seed , type = System.Nullable`1[System.Int32]
    [ModuleOutput] DllModuleHost Verbose: 1 : Set optional parameter seed value to NULL
    [ModuleOutput] DllModuleHost Stop: 1 : ParameterArgumentBinder::InitializeParameterValues. Duration: 00:00:13.1845731
    [ModuleOutput] DllModuleHost Verbose: 1 : Begin invoking method Run ...
    [ModuleOutput] Microsoft Drawbridge Console Host [Version 1.0.2108.0]
    [ModuleOutput] [1] 56000
    [ModuleOutput]
    [ModuleOutput] The following files have been unzipped for sourcing in path=["src"]:
    [ModuleOutput]
    [ModuleOutput] Name Length Date
    [ModuleOutput]
    [ModuleOutput] 1 sqldf_0.4-10.zip 71667 2014-11-21 11:38:00
    [ModuleOutput]
    [ModuleOutput] 2 chron_2.3-45.zip 107752 2014-11-21 11:38:00
    [ModuleOutput]
    [ModuleOutput] 3 DBI_0.3.1.zip 153831 2014-11-21 11:38:00
    [ModuleOutput]
    [ModuleOutput] 4 gsubfn_0.6-6.zip 348505 2014-11-21 11:38:00
    [ModuleOutput]
    [ModuleOutput] 5 proto_0.3-10.zip 458519 2014-11-21 11:38:00
    [ModuleOutput]
    [ModuleOutput] 6 RSQLite_1.0.0.zip 1211130 2014-11-21 11:38:00
    [ModuleOutput]
    [ModuleOutput] Loading objects:
    [ModuleOutput]
    [ModuleOutput] port1
    [ModuleOutput]
    [ModuleOutput] [1] "Loading variable port1..."
    [ModuleOutput]
    [ModuleOutput] package 'gsubfn' successfully unpacked and MD5 sums checked
    [ModuleOutput]
    [ModuleOutput] Loading required package: proto
    [ModuleOutput]
    [ModuleOutput] [1] TRUE
    [ModuleOutput]
    [ModuleOutput] package 'DBI' successfully unpacked and MD5 sums checked
    [ModuleOutput]
    [ModuleOutput] [1] TRUE
    [ModuleOutput]
    [ModuleOutput] package 'RSQLite' successfully unpacked and MD5 sums checked
    [ModuleOutput]
    [ModuleOutput] [1] TRUE
    [ModuleOutput]
    [ModuleOutput] package 'sqldf' successfully unpacked and MD5 sums checked
    [ModuleOutput]
    [ModuleOutput] [1] TRUE
    [ModuleOutput]
    [ModuleOutput] Loading required package: tcltk
    [ModuleOutput]
    [ModuleOutput] Error in asfn(rs[[i]]) : need explicit units for numeric conversion
    [ModuleOutput]
    [ModuleOutput] In addition: Warning messages:
    [ModuleOutput]
    [ModuleOutput] 1: In strptime(x, format, tz = tz) :
    [ModuleOutput]
    [ModuleOutput] unable to identify current timezone 'C':
    [ModuleOutput]
    [ModuleOutput] please set environment variable 'TZ'
    [ModuleOutput]
    [ModuleOutput] 2: In strptime(x, format, tz = tz) : unknown timezone 'localtime'
    [ModuleOutput]
    [ModuleOutput] 3: package 'gsubfn' was built under R version 3.1.2
    [ModuleOutput]
    [ModuleOutput] 4: package 'DBI' was built under R version 3.1.2
    [ModuleOutput]
    [ModuleOutput] 5: package 'RSQLite' was built under R version 3.1.2
    [ModuleOutput]
    [ModuleOutput] 6: package 'sqldf' was built under R version 3.1.2
    [ModuleOutput]
    [ModuleOutput] 7: In strptime(xx, f <- "%Y-%m-%d %H:%M:%OS", tz = tz) :
    [ModuleOutput]
    [ModuleOutput] unknown timezone 'localtime'
    [ModuleOutput]
    [ModuleOutput] DllModuleHost Stop: 1 : DllModuleMethod::Execute. Duration: 00:02:41.0341394
    [ModuleOutput] DllModuleHost Error: 1 : Program::Main encountered fatal exception: Microsoft.Analytics.Exceptions.ErrorMapping+ModuleException: Error 0063: The following error occurred during evaluation of R script:
    [ModuleOutput] ---------- Start of error message from R ----------
    [ModuleOutput] R script execution failed. Please click on "View Output Log" in the properties pane for full details.
    [ModuleOutput] ----------- End of error message from R -----------
    Module finished after a runtime of 00:02:41.1766291 with exit code -2
    Module failed due to negative exit code of -2
    Record Ends at UTC 11/27/2014 07:41:12.
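    The fatal line in the log above, "Error in asfn(rs[[i]]) : need explicit units for numeric conversion", is the message R's as.difftime() raises when it is handed a plain number without a units= argument, which suggests sqldf is trying to coerce a numeric result column back into a difftime/date-time class on the way out of SQLite. If that guess is right, one thing to try (purely a sketch; df, elapsed_time and created_at are placeholder names, not taken from the experiment) is to alias or cast the time-like columns on the SQLite side so no coercion back to the original class is attempted:
    -- placeholder query for sqldf: return the time-like columns under new names and as plain numeric/text
    SELECT id,
           CAST(elapsed_time AS REAL) AS elapsed_seconds,
           strftime('%Y-%m-%d', created_at) AS created_at_txt
    FROM df;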

  • Error while connecting to forms10g (ORA-00604)

    Hi all,
    I installed forms10g and reports10g.
    When I try to connect to the database I get ORA-00604: error occurred at recursive SQL level 1.
    Operating system: Windows Vista Home Basic
    Thanks & Regards
    venkat

    This is a database message.
    You may have to increase shared_pool_size in the init%sid%.ora parameter file of the database.
    What is the current value of this parameter in your database? You can obtain it with SQL*Plus:
    select value from v$parameter where name = 'shared_pool_size';
    You have to restart (shutdown / startup) your instance after changing this value in the parameter file.
    If your database version is 10 or above (possibly already with 9i), you can set this parameter online with SQL*Plus:
    alter system set shared_pool_size = 10000000;
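    As a consolidated sketch of the steps above (the 100M figure is only an example size, and SCOPE=BOTH assumes the instance was started with an spfile; with a plain init%sid%.ora you would edit the file and restart instead):
    -- check how the pool is currently sized (0 usually means it is auto-tuned via sga_target)
    select name, value from v$parameter where name in ('shared_pool_size', 'sga_target');
    -- on 10g and later the pool can also be grown online
    alter system set shared_pool_size = 100M scope=both;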

  • Msg 8631 Internal error: Server stack limit has been reached on SQL Server 2012 from T-SQL script that runs on SQL Server 2008 R2

    I have a script, mostly generated by SSMS, which works without issue on SQL Server 2008 R2, but when I attempt to run it on a fresh install of SQL Server 2012 I get Msg 8631, Internal error: Server stack limit has been reached. Please look for potentially deep nesting in your query, and try to simplify it.
    The script itself doesn't seem to be all that deeply nested. It is large, around 2,600 lines, and when I remove the bulk of those lines it does run on SQL Server 2012. I'm just really baffled why something that SQL Server generated, with very few additions/changes, AND that WORKS without issue on SQL Server 2008 R2, would suddenly be invalid on SQL Server 2012.
    I need to know why my script, which works great on our current SQL Server 2008 R2 servers, suddenly fails and won't run on a new SQL Server 2012 server. This script is used to create 'bulk' replications on a large number of DBs, saving a tremendous amount of time compared to doing it the manual way.
    Below is a condensed version of the script that fails. I have removed around 2,550 lines of specific sp_addarticle statements, which are mostly copied and pasted from what SQL Server Management Studio scripted for me when I went through the Replication Wizard and told it to save to script.
    declare @dbname varchar(MAX), @SQL nvarchar(MAX)
    declare c_dblist cursor for
    select name from sys.databases WHERE name like 'dbone[_]%' order by name;
    open c_dblist
    fetch next from c_dblist into @dbname
    while @@fetch_status = 0
    begin
    print @dbname
    SET @SQL = 'DECLARE @dbname NVARCHAR(MAX); SET @dbname = ''' + @dbname + ''';
    use ['+@dbname+']
    exec sp_replicationdboption @dbname = N'''+@dbname+''', @optname = N''publish'', @value = N''true''
    use ['+@dbname+']
    exec ['+@dbname+'].sys.sp_addlogreader_agent @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1, @job_name = null
    -- Adding the transactional publication
    use ['+@dbname+']
    exec sp_addpublication @publication = N'''+@dbname+' Replication'', @description = N''Transactional publication of database
    '''''+@dbname+''''' from Publisher ''''MSSQLSRV\INSTANCE''''.'', @sync_method = N''concurrent'', @retention = 0, @allow_push = N''true'', @allow_pull = N''true'', @allow_anonymous = N''false'', @enabled_for_internet
    = N''false'', @snapshot_in_defaultfolder = N''true'', @compress_snapshot = N''false'', @ftp_port = 21, @allow_subscription_copy = N''false'', @add_to_active_directory = N''false'', @repl_freq = N''continuous'', @status = N''active'', @independent_agent = N''true'',
    @immediate_sync = N''true'', @allow_sync_tran = N''false'', @allow_queued_tran = N''false'', @allow_dts = N''false'', @replicate_ddl = 1, @allow_initialize_from_backup = N''true'', @enabled_for_p2p = N''false'', @enabled_for_het_sub = N''false''
    exec sp_addpublication_snapshot @publication = N'''+@dbname+' Replication'', @frequency_type = 1, @frequency_interval = 1, @frequency_relative_interval = 1, @frequency_recurrence_factor = 0, @frequency_subday = 8,
    @frequency_subday_interval = 1, @active_start_time_of_day = 0, @active_end_time_of_day = 235959, @active_start_date = 0, @active_end_date = 0, @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1
    -- There are around 2400 lines roughly the same as this only difference is the tablename repeated below this one
    use ['+@dbname+']
    exec sp_addarticle @publication = N'''+@dbname+' Replication'', @article = N''TABLE_ONE'', @source_owner = N''dbo'', @source_object = N''TABLE_ONE'', @type = N''logbased'', @description = null, @creation_script =
    null, @pre_creation_cmd = N''drop'', @schema_option = 0x000000000803509F, @identityrangemanagementoption = N''manual'', @destination_table = N''TABLE_ONE'', @destination_owner = N''dbo'', @vertical_partition = N''false'', @ins_cmd = N''CALL sp_MSins_dboTABLE_ONE'',
    @del_cmd = N''CALL sp_MSdel_dboTABLE_ONE'', @upd_cmd = N''SCALL sp_MSupd_dboTABLE_ONE''
    EXEC sp_executesql @SQL
    SET @dbname = REPLACE(@dbname, 'dbone_', 'dbtwo_');
    print @dbname
    SET @SQL = 'DECLARE @dbname NVARCHAR(MAX); SET @dbname = ''' + @dbname + ''';
    use ['+@dbname+']
    exec sp_replicationdboption @dbname = N'''+@dbname+''', @optname = N''publish'', @value = N''true''
    use ['+@dbname+']
    exec ['+@dbname+'].sys.sp_addlogreader_agent @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1, @job_name = null
    -- Adding the transactional publication
    use ['+@dbname+']
    exec sp_addpublication @publication = N'''+@dbname+' Replication'', @description = N''Transactional publication of database
    '''''+@dbname+''''' from Publisher ''''MSSQLSRV\INSTANCE''''.'', @sync_method = N''concurrent'', @retention = 0, @allow_push = N''true'', @allow_pull = N''true'', @allow_anonymous = N''false'', @enabled_for_internet
    = N''false'', @snapshot_in_defaultfolder = N''true'', @compress_snapshot = N''false'', @ftp_port = 21, @allow_subscription_copy = N''false'', @add_to_active_directory = N''false'', @repl_freq = N''continuous'', @status = N''active'', @independent_agent = N''true'',
    @immediate_sync = N''true'', @allow_sync_tran = N''false'', @allow_queued_tran = N''false'', @allow_dts = N''false'', @replicate_ddl = 1, @allow_initialize_from_backup = N''true'', @enabled_for_p2p = N''false'', @enabled_for_het_sub = N''false''
    exec sp_addpublication_snapshot @publication = N'''+@dbname+' Replication'', @frequency_type = 1, @frequency_interval = 1, @frequency_relative_interval = 1, @frequency_recurrence_factor = 0, @frequency_subday = 8,
    @frequency_subday_interval = 1, @active_start_time_of_day = 0, @active_end_time_of_day = 235959, @active_start_date = 0, @active_end_date = 0, @job_login = N''DOMAIN\DBServiceAccount'', @job_password = N''secret'', @publisher_security_mode = 1
    -- There are around 140 lines roughly the same as this only difference is the tablename repeated below this one
    use ['+@dbname+']
    exec sp_addarticle @publication = N'''+@dbname+' Replication'', @article = N''DB_TWO_TABLE_ONE'', @source_owner = N''dbo'', @source_object = N''DB_TWO_TABLE_ONE'', @type = N''logbased'', @description = null, @creation_script
    = null, @pre_creation_cmd = N''drop'', @schema_option = 0x000000000803509D, @identityrangemanagementoption = N''manual'', @destination_table = N''DB_TWO_TABLE_ONE'', @destination_owner = N''dbo'', @vertical_partition = N''false''
    EXEC sp_executesql @SQL
    fetch next from c_dblist into @dbname
    end
    close c_dblist
    deallocate c_dblist
    George P Botuwell, Programmer

    Hi George,
    Thank you for your question. 
    I am trying to involve someone more familiar with this topic to take a further look at this issue. Some delay might be expected while it is transferred. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    If you have any feedback on our support, please click here.
    Allen Li
    TechNet Community Support
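    For what it's worth, one mitigation sometimes suggested for Msg 8631 when a single statement concatenates thousands of string fragments is to append to the variable in many small SET statements, so no single statement builds a deep tree of '+' operators. This is only a sketch under that assumption, not the poster's script; the database name is a placeholder:
    declare @dbname sysname = N'dbone_example';  -- placeholder database name
    declare @SQL nvarchar(max) = N'';
    -- build the dynamic batch in small pieces instead of one giant concatenation expression
    set @SQL = @SQL + N'use [' + @dbname + N']; ';
    set @SQL = @SQL + N'exec sp_replicationdboption @dbname = N''' + @dbname + N''', @optname = N''publish'', @value = N''true''; ';
    -- ...one SET @SQL = @SQL + N'exec sp_addarticle ...' statement per article...
    exec sp_executesql @SQL;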
