DRG-11427: valid gist level values are S, P

When I try to create a gist, this error appears, but I don't understand what it means.
I've found this in the documentation:
"Specify a valid gist level"...but what does that mean?
SQL> begin
2 Ctx_Doc.Gist (
3 index_name => 'realistic_docs_text',
4 textkey => 2,
5 restab => 'ctx_gists',
6 query_id => 2,
7 pov => 'GENERIC',
8 glevel => 'paragraph',
9 numparagraphs => 2,
10 maxpercent => 100 );
11 end;
12 /
begin
ERROR at line 1:
ORA-20000: Oracle Text error:
DRG-11425: gist level paragraph is invalid
DRG-11427: valid gist level values are S, P
ORA-06512: at "CTXSYS.DRUE", line 160
ORA-06512: at "CTXSYS.CTX_DOC", line 401
ORA-06512: at line 2

OK, I've solved it!
Solution:
glevel => 'paragraph' is wrong!!!
glevel => 'P' is OK!
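
For reference, here is the corrected call: it is the same block as above, only with glevel changed to the single-character code ('P' requests a paragraph-level gist, 'S' a sentence-level one).

begin
  Ctx_Doc.Gist (
    index_name    => 'realistic_docs_text',
    textkey       => 2,
    restab        => 'ctx_gists',
    query_id      => 2,
    pov           => 'GENERIC',
    glevel        => 'P',           -- 'P' = paragraph-level, 'S' = sentence-level
    numparagraphs => 2,
    maxpercent    => 100 );
end;
/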

Similar Messages

  • Payscale group and levels values are not appearing in IT0008

    Hi Friends,
    I created the pay scale group & level in V_T510. Even so, these values are not appearing as F4 (drop-down) values in IT0008.
    Please advise where I have missed the configuration.
    Thanks

    Hi..
    ESG/CAP and its groupings in V_T510 have already been maintained; however, the values are still not appearing in the drop-down.
    Please suggest how to resolve this.
    Thanks

  • Is there a way to fix particular percent of slice area to each level value

    Is there a way to fix a particular percentage of slice area to each level value in a flash pie chart?
    I need a pie chart for Distribution of Escalations by Status; I have the following status records in my table:
    5 record for Status: Reopen
    2 record for Status: Escalated
    2015 record for Status: Closed
    12 record for Status: Open
    1000 record for Status: WIP
    So I am not able to see the data in the pie chart for the statuses Reopen, Escalated, and Open, because the small slices overlap each other.
    So I want to fix a particular percentage of slice area to each level value (Reopen, Escalated, Closed, WIP and Open).
    For example, I want to show a 15% slice area for Escalated, 10% for Reopen, 20% for Open, 25% for WIP and 30% for Closed, so that I can see records for all statuses, including those that have less data compared to the others.
    I am using the following query for the pie chart:
    SELECT
    ''javascript:dhtml_GetReport_r2(0,0,0,0,0,''''''||''''||STATUS||''''||'''''',0)'' link,
    nvl(STATUS,''Unknown'') label,
    COUNT(ISSUE_ID) Escalations
    FROM XYZ_ITR_MAIN
    WHERE 1=1 GROUP BY STATUS;
    Thanks
    Rathore
    Edited by: Rathore on 01-Apr-2010 02:37

    Your requirement makes sense from the point of view of visibility, but it doesn't make sense once you keep the facts in mind: 2 out of 3034 is always roughly 0.066%, no matter how you look at it, and a pie chart is always going to show you exactly that. The only thing you can do is manipulate your data, but then what a user sees will no longer match reality. So the only solution I see is to make a bigger chart. Exploding a pie chart is also an alternative, but I don't think you can currently do that with the flash charts.
    Denes Kubicek
    http://deneskubicek.blogspot.com/
    http://www.opal-consulting.de/training
    http://apex.oracle.com/pls/otn/f?p=31517:1
    http://www.amazon.de/Oracle-APEX-XE-Praxis/dp/3826655494
    -------------------------------------------------------------------
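
    If you do decide to manipulate the data as described above (accepting that the slice sizes will then no longer match the real distribution), one possible sketch is to chart a fixed weight per status and keep the real count in the label. The weights below simply mirror the percentages requested in the question, the drill-down link is omitted for brevity, and the table/column names are the ones from the original query:

    SELECT NULL link,
           nvl(STATUS,'Unknown') || ' (' || COUNT(ISSUE_ID) || ')' label,
           CASE nvl(STATUS,'Unknown')
             WHEN 'Closed'    THEN 30
             WHEN 'WIP'       THEN 25
             WHEN 'Open'      THEN 20
             WHEN 'Escalated' THEN 15
             WHEN 'Reopen'    THEN 10
             ELSE 5
           END Escalations
    FROM XYZ_ITR_MAIN
    WHERE 1=1
    GROUP BY STATUS;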

  • User entered values are lost on EO validation.

    Hi ,
    Subject: user-entered values are lost on EO validation.
    We have an EO which has 4 attributes, all of them not null (mandatory).
    There is a VO based on this EO.
    We have added 4 more attributes using a join in the VO query in expert mode.
    On the page there is a table that has 8 columns, all mapped to these 8 attributes.
    The first column is an LOV that populates 4 column attributes (2 EO-based and 2 added through expert mode).
    When the user submits the page without entering the remaining two EO mandatory attributes, the EO validation is raised and the 4 fields populated by the LOV are cleared.
    Can you please tell me what I am doing wrong?
    thanks
    Chaitanya

    Observation
    All the table columns that were losing information were "MessageStyledText", and those not losing data were "MessageTextInput".
    Various approaches
    So we converted the MessageStyledText columns into MessageTextInput and the data was retained properly.
    But we wanted these columns to be read-only, and when we made them MessageTextInput - ReadOnly, the data was no longer getting populated by the LOV.
    Working solution - the FormValue 'glue'
    We created formValues corresponding to the columns and populated the FVs along with the columns (both FV and column have the same view attribute and view instance), and found that in this case the formValue 'glues' the VO attribute value to the page so it does not get lost on refresh.
    A strange solution, but it works; I thought I would share it with all.
    thanks
    Chaitanya

  • By default in BPC values are sum in node level, It does not make sense to sum averages on parent nodes.

    HI Guys
    We have a business requirement to store data as average figures.
    The values are entered at the base member level, but in reports we need to show average values at the parent level instead of the sum
    (by default in BPC values are summed at the node level); it does not make sense to sum averages on parent nodes.
    We managed it with Excel formulas, but we are facing a problem with dimensions which are in context.
    (I tried using dimension member formulas, but the system does not accept the AVG keyword in member formulas.)
    In the end we managed the report by using formulas, but I am worried about how we will manage if there are no formulas.
    Please suggest a solution to show average values at the node level in specific reports.

    Hi Krao,
    Please define what you mean by average: arithmetic average or weighted average (which makes more business sense to my mind).
    Also, please provide some samples explaining the dimensions used in the average calculation.
    In our case we use dimension member formulas to show average prices, discounts etc. in the reports.
    B.R. Vadim
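
    To make the distinction concrete, here is a small illustrative SQL sketch (a hypothetical base_members table with price/qty columns, not BPC member-formula syntax) showing how the two averages differ when base members are rolled up to a parent node:

    SELECT parent_node,
           AVG(price)                  AS arithmetic_avg,  -- simple mean of the member prices
           SUM(price * qty) / SUM(qty) AS weighted_avg     -- mean weighted by quantity
    FROM   base_members
    GROUP  BY parent_node;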

  • Content Organizer bug - PDF files do not get routed correctly if autodeclaration is on and library-level default values are set

    It looks like whenever one specifies column default values at a library level, the content organizer routing goes a bit awry SPECIFICALLY FOR NON-OFFICE FILES [e.g. PDF]. Below are the observations and issues.
    1. Column-level default value set on a record library with auto declaration of records turned on: the content organizer routes the document to the library but also keeps a copy of the document in the drop off library; it does not remove it from the drop off library. The instant we clear the default value settings at the library level of the target library, the content organizer works as expected again.
    2. If default value settings are specified on a column in the target library, the PDF file gets routed to the document library but all the metadata is blanked out. The copy of the file that remains in the drop off library has all the correct metadata, but the target library has blanked-out metadata.
    Are the 2 observations described above by design, or are they bugs? If so, is there any documentation available that proves this? This does not make logical sense, and proving it to a client in the absence of any documentation is a challenge.
    The problem goes away if we shift the default values to the site columns directly at the site collection level. It's just the library-level defaults that the PDF files do not seem to agree with.

    Hi Lisa,
    Thanks for responding. This can be replicated in any environment, but only for a specific combination of content organizer settings. The combination of settings I am referring to can be seen in the screenshot below. If you turn off "redirect users to the drop off library for direct uploads to libraries" and turn on SharePoint versioning, then you should be able to replicate the issue. Also, we are using managed metadata site columns. I simplified this use case to a custom content type with just 2 custom managed metadata columns and can still replicate the issue in several environments. Also note the issue does not occur if the default values are set at the site or site collection level; it only occurs if you set the column default value at a library level. I was able to replicate this on a completely vanilla Enterprise records site collection freshly created just to test this. Also note that the issue is not that the file does not reach the destination library. The issue is that the document does not get removed from the drop off library after it is transferred to the destination library, even though it should have been.

  • Updated (VORowImpl ) values are not reflected in inline POPUP table

    Hi Expert,
    Currently I am getting an exception when I try to update my iterator binding.
    Here is my use case:
    I have a view object displayed inside an inline popup as a table. When I modify one of the cells, the value change listener fires and calls a method on the ViewObjectRowImpl class. Inside that method I do some computation (I execute a DB query to get back some value). After the query execution I update the other columns' data based on the changes.
    First I PPR the table and check the updated values. Unfortunately the changed values are not reflected.
    Then I used the following code:
    DCIteratorBinding itrBinding = findIteratorBinding("EmployeeVO1Iterator");
    if (itrBinding != null) {
        itrBinding.executeQuery();
        itrBinding.refresh(DCIteratorBinding.RANGESIZE_UNLIMITED);
    }
    Now I am getting "JBO-25003: Object EmployeeMgmtAM of type ApplicationModule is not found."
    May I know what went wrong? Am I missing anything? How can I refresh the UI table based on the updated VORowImpl? Looking forward to hints.
    Thanks

    Thanks Frank. Actually I am accessing the client method as follows:
    DCIteratorBinding itrBinding = findIteratorBinding("EmployeeVO1Iterator");
    if (itrBinding != null) {
        EmployeeVORowImpl employeeVO = (EmployeeVORowImpl) itrBinding.getCurrentRow();
        employeeVO.methodName();
    }
    May I know what the difference is between accessing the method as shown above and accessing it via the MethodBinding?
    How about accessing an attribute of the view object using the above approach, i.e. employeeVO.getName() and the like?
    Also, I have configured the ADF logger to the finest level and found that the primary reason for "Application module not found" was some entity validation. But this does not always happen; the same test case sometimes works fine and sometimes fails. I am clueless. Could you please throw some light on this?
    Thanks
    Edited by: user1022639 on Feb 7, 2012 5:24 PM

  • Error: Creating a Connection Pool: issue with valid transaction levels

    Server: SunOS 5.8 Generic_117350-27 sun4u sparc SUNW,UltraAX-MP
    App Server: Sun Java System Application Server 8.2
    Jar: ojdbc14.jar
    Datasource Classname: oracle.jdbc.pool.OracleConnectionPoolDataSource
    Resource Type: javax.sql.ConnectionPoolDataSource
    [Issue]
    Using /SUNWappserver/lib/ojdbc14.jar, when I create my connection pool in Sun Java System Application Server 8.2, I get the following error each time I try to start the domain:
    [#|2006-07-28T14:53:56.169-0500|WARNING|sun-appserver-pe8.2|javax.enterprise.resource.resourceadapter|_ThreadID=11;|RAR5117 : Failed to obtain/create connection. Reason : The isolation level could not be set: READ_COMMITTED and SERIALIZABLE are the only valid transaction levels|#]
    When I change the value to "SERIALIZABLE" in my domain.xml file, I receive the following error in my server.log:
    Caused by: org.xml.sax.SAXParseException: Attribute "transaction-isolation-level" with value "SERIALIZABLE" must have a value from the list "read-uncommitted read-committed repeatable-read serializable ".
    So, I'm stuck!
    I can't use the values READ_COMMITTED or SERIALIZABLE, since the valid values are "read-uncommitted read-committed repeatable-read serializable" and case sensitivity matters.
    Any thoughts or help would be greatly appreciated.
    Thanks,
    --Todd

    The Oracle JDBC driver accepts the TRANSACTION_READ_COMMITTED and TRANSACTION_SERIALIZABLE isolation levels.
    Example:
    connAttr.setProperty("TRANSACTION_ISOLATION", "TRANSACTION_SERIALIZABLE");
    I'd check with the Apps server folks.
    Kuassi - blog http://db360.blogspot.com/
    ------ book http://www.amazon.com/gp/product/1555583296/
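
    For reference, on the database side the same two levels can be set with standard Oracle SQL, independent of how the application server spells them in domain.xml:

    -- per transaction
    SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
    -- or for the whole session
    ALTER SESSION SET ISOLATION_LEVEL = SERIALIZABLE;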

  • Reporting Issue   "Opening Balance values are going to Not Assigned Values"

    Hi Friends,
    Closing balance values are showing perfectly, but opening balance values are going to 'Not Assigned'. I am not able to figure out why.
    Our query is built on the PCA Daily MultiProvider.
    Please help.
    Thanks
    Asim

    Hi Ravish,
    Here are the details
    Stock values and quantities in the PCA cube of BW never show correctly at an article level due to an opening balance issue. This reduces our ability to report on stocks at moving average price, i.e. anything that actually matches the financial values in the system. The issue arises because stock at the beginning of the year (quantity and value) for any site is only shown against an article of "Not assigned". Stock movements during the year are shown against the appropriate article. (If you have a new site, it will have stock assigned against an article up until 31 December, and then the 31 December values are shown against "not assigned" for the next year.)
    Thanks
    Asim

  • Dynamic SQL returns no data when multiple values are passed.

    Hi,
    While executing the dynamic SQL below in the procedure, no data is returned when it has multiple input values.
    When the input is EMPID := '1'; the procedure works fine and returns data. Any suggestion why the procedure doesn't work when EMPID := '1'',''2'; is passed as the parameter?
    =======================================================
    create or replace PROCEDURE TEST(EMPID IN VARCHAR2, rc OUT sys_refcursor)
    IS
    stmt VARCHAR2(9272);
    V_EMPID VARCHAR2(100);
    BEGIN
    V_EMPID :=EMPID;
    stmt := 'select * from TEST123 where Empid is NOT NULL';
    IF V_EMPID <> '-1' THEN
    stmt := stmt || ' and Empid in (:1)';
    ELSE
    stmt := stmt || ' and -1 = :1';
    END IF;
    OPEN rc FOR stmt USING V_EMPID;
    END Z_TEST;
    ============================================================
    Script for create table
    ==================================================================
    CREATE TABLE TEST123 (
      EMPID VARCHAR2(10 BYTE),
      DEPT NUMBER(3,0)
    );
    ===========================================
    Insert into PDEVUSER.TEST123 (EMPID,DEPT) values ('1',20);
    Insert into PDEVUSER.TEST123 (EMPID,DEPT) values ('2',10);
    Insert into PDEVUSER.TEST123 (EMPID,DEPT) values ('3',30);
    Insert into PDEVUSER.TEST123 (EMPID,DEPT) values ('3',30);
    Insert into PDEVUSER.TEST123 (EMPID,DEPT) values ('2',10);
    =============================================
    Select * from TEST123 where Empid in (1,2,3)
    EMPID DEPT
    1     20
    2     10
    3     30
    3     30
    2     10
    ===================================================================
    Any suggestion why the procedure doesn't work when the input is EMPID := '1'',''2';?
    Thank you,

    The whole scenario is a little strange. When I tried to compile your procedure it couldn't compile, but I added the missing info and was able to get it compiled.
    create or replace PROCEDURE TEST (EMPID IN VARCHAR2, rc OUT sys_refcursor)
    IS
      stmt        VARCHAR2 (9272);
      V_EMPID     VARCHAR2 (100);
    BEGIN
      V_EMPID := EMPID;
      stmt := 'select * from TEST123 where Empid is NOT NULL';
      IF V_EMPID = '-1' THEN
        stmt := stmt || ' and Empid in (:1)';
      ELSE
        stmt := stmt || ' and -1 = :1';
      END IF;
      OPEN rc FOR stmt USING V_EMPID;
    END;
    If you pass in 1 as a parameter, it is going to execute, because the statement that it is building is:
    select * from TEST123 where Empid is NOT NULL and -1 = 1
    Although the syntax is valid, -1 will never equal 1, so you will never get any data.
    If you pass in 1,2 as a parameter, then it is basically building the following:
    select * from TEST123 where Empid is NOT NULL and -1 = 1,2
    This will cause an invalid-number error, because it is trying to check where -1 = 1,2.
    You could always change your code to:
    PROCEDURE TEST (EMPID IN VARCHAR2, rc OUT sys_refcursor)
    IS
      stmt        VARCHAR2 (9272);
      V_EMPID     VARCHAR2 (100);
    BEGIN
      V_EMPID := EMPID;
      stmt := 'select * from TEST123 where Empid is NOT NULL';
      stmt := stmt || ' and Empid in (:1)';
      OPEN rc FOR stmt USING V_EMPID;
    END;
    and forget the IF v_empid = '-1' check. If you pass in 1 it will work, and if you pass in 1,2 it will work, but don't pass the values in with any tick marks.
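
    Note that a single bind variable inside IN (:1) is still compared as one value, so passing '1,2' executes but only matches a row whose Empid is literally the string '1,2'. A common workaround is to split the comma-separated parameter inside the query; here is a sketch (the procedure name TEST_MULTI and the parameter name p_empid_list are only illustrative, and it assumes the caller passes a plain comma-separated string such as '1,2'):

    create or replace PROCEDURE TEST_MULTI (p_empid_list IN VARCHAR2, rc OUT sys_refcursor)
    IS
    BEGIN
      -- split p_empid_list ('1,2,3') into individual values and use them in the IN clause
      OPEN rc FOR
        SELECT t.*
        FROM   TEST123 t
        WHERE  t.Empid IS NOT NULL
        AND    t.Empid IN (SELECT regexp_substr(p_empid_list, '[^,]+', 1, LEVEL)
                           FROM   dual
                           CONNECT BY regexp_substr(p_empid_list, '[^,]+', 1, LEVEL) IS NOT NULL);
    END TEST_MULTI;
    /
    -- call it, for example, with p_empid_list => '1,2' (no embedded quote marks)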

  • OBIEE 11g Report-When drilling down to lower level, totals are not matching

    Hi All,
    I am creating a report in Analytics 7.9.6.3, OBIEE 11g; let's say Budget Cost / Actual Cost based on the Date dimension.
    I have selected the Fiscal Year dimension, Fact Budget Cost and Fact Actual Cost:
    FY BCost ACost
    2011 100 120
    2012 150 140
    Total 250 260
    But when I drill down from Year to Quarter and Period, the values do not match the totals:
    FY FY Qrt BCost ACost
    2011 2011 Q1 80 100
    2011 Q2 100 90
    2011 Q3 110 120
    2011 Q4 90 130
    Total 380 440
    Fiscal Year and Fiscal Quarter Budget Cost and Actual Cost are not returning correct results. When selecting costs by fiscal year, or filtering on a specific fiscal year, the amount returned does not equal the fiscal year total;
    when I drill down to the Quarter and Period level, the BCost & ACost values are inflated as shown above and do not match the totals.
    Any suggestions?

    Check the query in both cases. Execute it against the database, sum the data for quarter/period, and compare it with the year data.
    Try to determine whether it is the report total that is wrong or the data stored in the database table itself.
    Mark as helpful if it helps.
    Regards,
    Veeresh Rayan
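
    A hypothetical SQL check along these lines (the fact table and column names are illustrative, not from the actual repository) would be:

    -- sum the quarter-level rows and compare the result with the year-level total shown in the report
    SELECT fiscal_year,
           SUM(budget_cost) AS bcost,
           SUM(actual_cost) AS acost
    FROM   fact_costs
    WHERE  fiscal_year = 2011
    GROUP  BY fiscal_year;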

  • These i7 4770K Temperatures And Voltage Values Are Normal With Z87-G43?

    Hello everyone,
    I recently upgraded my good old Core2Duo rig and bought a new CPU-motherboard-RAM trio. My new specs are as follows:
    - i7 4770K @ stock speed + Coolermaster 212Evo cpu cooler
    - MSI z87 g43 motherboard
    - 2x4 GB G-Skill Ripjaws X 1866Mhz
    - Case: Corsair Carbide 400R
    The problem is I wasn't aware that the new Haswell CPUs run slightly hotter, and now I'm a little bit worried about my temperatures. Since I was also planning to overclock my CPU a bit and to find a point that doesn't need a large voltage increase, I'm losing sleep over this situation at the moment.
    Anyway, I'm using software like HWMonitor, Coretemp, RealTemp and the Intel Extreme Tuning Utility. Turbo Boost is also active, so the CPU goes up to 3.9 GHz. Other than that I'm on stock speeds and the motherboard's default values. Here are my temperatures:
    Ambient room temp: Varies between 23-25 °C
    Idle: 28-30 °C
    While playing demanding games like Battlefield 4: Max 58-65°C
    With Intel Extreme Tuning's stress test for 15 mins: max 65-70 °C
    With Prime 95 Blend and OCCT burn tests for 15 mins: max 78-82 °C
    I also ran RealTemp's sensor test and the values are identical, since it uses Prime95 too.
    I also noticed that Prime95 and OCCT increase my CPU voltage from 1.156 V to 1.21 V, while the Intel Extreme Tuning stress test and BurnInTest use 1.156 V. All these tests load 100% of the cores. I couldn't understand why there's a voltage increase on certain tests, which makes my temps go even higher. Will I encounter these kinds of random voltage increases during normal tasks, like playing games, rendering some stuff, etc.?
    On the other hand, I tried the motherboard's OC Genie feature to see what happens. It overclocked the CPU automatically to 4.0 GHz @ 1.10 V. With this setting I've seen a max of 70 °C for a second and mostly 65-68 °C under the OCCT stress test, and my voltage didn't increase at all and sat at 1.10 V. I'm a bit confused about these values, since with default settings I'm getting hotter values and my voltage goes up to 1.21 with Turbo Boost under Prime95/OCCT burn tests. I also found out that my BIOS is v1.0. I don't really have a performance or stability issue for now except this voltage thing. Would a BIOS update help with this situation? I don't really like touching something that is already working OK and ending up with a dead board.
    Also, I'm wondering if my temperature values are normal with the CPU cooler I have (Cooler Master 212 Evo)?
    I could also buy some extra fans for my case (1 exhaust on top & 1 intake on the side) and maybe a second fan for the CPU cooler, if you guys think these would help a bit.
    Sorry for my English by the way. I'm not a native speaker.
    Thanks for all your comments and suggestions already.

    Thanks for the reply Nichrome,
    I will follow your suggestions for the fans. Currently I don't have any fans on top, but I'm considering buying some fans for the top and side. So you get better results with the top fans being exhaust, right?
    Also, which fan are you using as the second fan for the CPU heatsink? I would buy one of these as well, since we have the same heatsink.
    I'm also using the default/auto voltages and settings at the moment. Just Turbo Boost is enabled, and when it kicks in the voltage goes up to 1.156 V, which seems normal and doesn't produce a dangerous level of heat. The thing is, if I start running Prime95 or OCCT, the voltage goes up to 1.21+ at the same Turbo Boost speed (3.9 GHz), and that produces a lot more heat than usual. But if I use BurnInTest or the Intel Extreme Tuning Utility stress test, the voltage sits at 1.156 V under full load on all cores. I'm wondering what the reason for this difference is and whether it is software or motherboard related. Even with the OC Genie @ 4.1 GHz, temperatures and voltages seem lower than with the stock/auto settings (idle 35-38 °C, stress test with OCCT 70 °C max, gaming 60-65 °C max). I'm not sure if a BIOS update would fix this, since the whole BIOS flashing process is creeping me out. I don't like to bother with something that is already working OK. Don't want to end up with a dead board in the end :P Maybe I'm becoming a bit paranoid, though, since this is a really hard-earned upgrade after 6 years. :P

  • AET Generated field values are not saved.

    Hi Gurus,
    I have created two AET fields on the screen (marked below) that should store values in table CRMD_CUSTOMER_H. When I create a new service request, enter the values and save, the AET field values are not saved. However, when I edit the same service request, enter the values and save, the AET field values are saved to the database.
    While debugging I found that the relationship BTHeaderCustExt does not exist the first time, and from the second time onward it appears. Due to this, data is not being saved the first time (line no 27: current is empty).
    When I tried to create the relationship using create_related_entity, it throws the exception cx_crm_genil_model_error.
    Please advise me on a solution for this.
    Regards,
    Anand

    There should be a context node at your view level. Please check whether the ON_NEW_FOCUS method is implemented or not.
    If not, you can implement that method with the code below.
        DATA: lv_collection TYPE REF TO if_bol_bo_col,
              entity        TYPE REF TO cl_crm_bol_entity.
    *   get collection of dependent nodes
        entity ?= focus_bo.
        TRY.
            lv_collection = entity->get_related_entities(
                   iv_relation_name = 'BTHeaderCustExt' ).
            IF lv_collection IS NOT BOUND or lv_collection->size( ) = 0.
              IF entity->is_changeable( ) = ABAP_TRUE.
                TRY.
                    entity = entity->create_related_entity(
                     iv_relation_name = 'BTHeaderCustExt' ).
                  CATCH cx_crm_genil_model_error cx_crm_genil_duplicate_rel.
    *               should never happen
                ENDTRY.
                IF entity IS BOUND.
                  CREATE OBJECT lv_collection TYPE cl_crm_bol_bo_col.
                  lv_collection->add( entity ).
                ENDIF.
              ENDIF.
            ENDIF.
          CATCH cx_crm_genil_model_error.
    *       should never happen
            EXIT.
          CATCH cx_sy_ref_is_initial.
        ENDTRY.
        me->set_collection( lv_collection ).

  • Populating level values in Demantra.

    Hi Gurus,
    We have a requirement to create a new level on top of the item level in the item hierarchy. I took one of the existing levels (t_ep_p2) and changed its level title. The change got reflected in Configure Levels. I upgraded the data model and used the level value and level value association dat files to populate the level members. The request went through without any errors, but I could not see the members in the staging tables, and the level members are also not getting reflected in the worksheets.
    Has anyone faced this issue? Am I missing anything here?
    Thanks
    Ram

    Ram,
    Why are you using the level value and level value association dat files in Demantra? As far as I know, they were meant for ODP, not for Demantra.
    I recommend using the custom_hook procedure (if you are using the EBS-Demantra integration) OR directly populating the staging table (standalone Demantra).
    Follow the steps mentioned in the implementation guide for adding a new level.
    Thanks,
    Amit
