Should I increase PGA when the PROCESSES parameter is increased?

DB version: 10gR2
Currently our PROCESSES parameter is set to 500 and PGA_AGGREGATE_TARGET is set to 700 MB. We are going to increase the init.ora parameter PROCESSES to 1000. Should I be increasing PGA_AGGREGATE_TARGET as well?

user659394 wrote:
DB version: 10gR2
Currently our PROCESSES parameter is set to 500 and PGA_AGGREGATE_TARGET is set to 700 MB. We are going to increase the init.ora parameter PROCESSES to 1000. Should I be increasing PGA_AGGREGATE_TARGET as well?
Probably - but where are you going to get that memory from?
When you set pga_aggregate_target, you are saying two things to Oracle. One viewpoint is that you are saying that on average each process needs about N MB of memory (in your case about 1.4 MB, which is a reasonable figure for an OLTP system). The alternative viewpoint is that you are saying to Oracle: after startup and all other activity, this is the amount of memory available for Oracle processes, so please ration it carefully.
If your viewpoint is the former, you need to set your pga_aggregate_target to something getting on for twice your current value. If your viewpoint is the latter, you can't change the pga_aggregate_target unless you get some more memory from somewhere else (e.g. the db_cache_size or shared_pool_size).
Regards
Jonathan Lewis
http://jonathanlewis.wordpress.com
http://www.jlcomp.demon.co.uk
"The temptation to form premature theories upon insufficient data is the bane of our profession."
Sherlock Holmes (Sir Arthur Conan Doyle) in "The Valley of Fear".
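A quick sanity check before changing anything (a sketch, assuming SYSDBA access on the 10gR2 instance; v$pgastat, v$parameter and v$process are standard dynamic views): compare the PGA target with what the instance actually allocates, and work out the rough average per process.

select name, round(value/1024/1024) as mbytes
from v$pgastat
where name in ('aggregate PGA target parameter',
               'total PGA allocated',
               'maximum PGA allocated');

-- rough average per current process: target divided by the process count
select round(to_number(p.value) / c.cnt / 1024 / 1024, 2) as avg_mb_per_process
from v$parameter p,
     (select count(*) as cnt from v$process) c
where p.name = 'pga_aggregate_target';

If the maximum allocated is already close to the target at 500 processes, doubling the process count without raising the target (or freeing memory elsewhere) means each process will be rationed roughly half as much.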

Similar Messages

  • Need system to throw a message when processing an invoice

    Hi,
        I need the system to throw a warning message when an incoming invoice is processed via MIRO, FB60, or F-43. Is this possible through validation?
    For example - if the vendor master shows that the vendor is blocked for payment (indicator 'A', i.e. blocked for payment), then when we process an invoice document through any of the above t-codes, it should display a warning message saying that the vendor is blocked for payment.
    Please let me know whether we can provide the above message through a standard validation rule for all the above transaction codes.
    With regards,
    Rajesh Jain.

    Hi,
    Use transaction code OBA5 with application area F5. Go to new entries, select message 671 with 'online' set to W (warning) and 'batch' set to W (warning), and save it. Now go to the transaction codes you mentioned and check; it will work for you.
    Assign points if it works for you.
    Regards
    Vinay Bhaskar
    Edited by: vinay bhaskar on Apr 10, 2008 11:57 AM

  • Way to increase "MAXIMUM UPTIME PROCESSES" during preprocessing phase

    Hi All,
    It has always been frustrating when your ACT_UPG and SHADOW_IMPORT_INC phases run very slowly because of a low number of MAX UPTIME PROCESSES entered earlier.
    I had been looking for a method to increase MAX UPTIME PROCESSES during the upgrade, or during shadow instance creation, but to no avail; the gurus here said there is no way to amend the uptime processes after you input the value during the preparation phase, and you just have to wait until it finishes.
    However, I found out this can be done by changing the value of the parameter mainimp_proc in the *.TPP files (located under EHPI\abap\bin). Which *.TPP files to change can be determined from the *.ECO files located under EHPI\abap\tmp.
    For instance:
    you'll see the following command in the TPAPP.ECO file:
    EXECUTING D:\usr\sap\EHPI\abap\exe\tp.exe pf=D:\usr\sap\EHPI\abap\bin\SHDUPGIMP1.TPP
    Therefore, you can increase the number of UPTIME processes by changing the value of mainimp_proc in SHDUPGIMP1.TPP.
    In order to change this, you first need to kill the R3trans processes, and SAPehpi will prompt you with an error message. Take this opportunity to change the parameter. Make a backup before you change any files.
    To maintain consistency, I think it is good to amend the parameter mainimp_proc in every *.TPP file, based on the time they were created when you launched the EHPI, because SAPehpi generates the value you entered into the *.TPP files from the global template. Also, check every *.ECO file in /tmp for each phase to determine which *.TPP file is being read.
    Examples of *.TPP files:
    SHDUPGIMP1.TPP, SHADOW.TPP, DEFAULT.TPP, TOOLIMPM.TPP, TOOLIMPI.TPP, etc.
    For your information, it is not recommended to change the processes during the upgrade, and you do so at your own risk. The purpose of this post is just for sharing.
    In my situation, my colleague had entered the value "2" for MAX UPTIME PROCESSES, and it was very slow because only 2 R3trans processes were running while the clock was ticking for us. After some analysis of how the EHP installer reads the value, I changed the parameter to "6". The good news is that I could see 6 R3trans processes running and 6 Support Packages importing in /EHPI/abap/tmp.
    The time taken to complete shortened tremendously, and the upgrade completed without any error.
    Please provide feedback on how it works if you've tried it or are going to try it in the future.
    Thanks,
    Nicholas.

    > Please provide feedback on how it works if you've tried it or are going to try it in the future.
    Just to throw in my EUR 0.02:
    In most cases you do an upgrade not only on the production system but also beforehand on a test system - or at best - on a copy of the production.
    At the end of the upgrade you get an UPGEVAL.XML file that lists the runtime of the various phases and also the configuration.
    For such huge upgrades ('real' upgrades or e.g. EHP4) we do the upgrade several times with copies of the production (3+ TB database) and try to optimize the runtime for us, plus avoid most of the errors during activation and upgrade phases. For our EHP4 project I did a total of 6 upgrades on a copy of the production to find the optimum number of processes. This implies, however, that resources are available (in the sense of time and hardware), but we found out from past experience that this investment is of much more use than having a trembling administrator sitting there on a Sunday evening hoping that no restore of the database will be necessary.
    With today's database and/or storage technologies (snapshots/clones) it is no longer a big administrative task to copy a system, reset it after the upgrade, and start over if it was too slow.
    Out of my experience I would say that I'd always set the parameters as high as the number of CPUs on the server executing the upgrade process; it's very rare that the machine reaches its CPU limit with today's CPU power, even with production running.
    Markus

  • Increasing the actual processes

    Hi, presently I have a standard manager with 3 actual processes. I want to increase it to 5 - will that create 2 more background processes and consume some additional resources for them? How can I increase it? Do I just modify the processes parameter in the define form and bounce the concurrent manager, or is there anything else to do, or any other impacts?
    Thanks
    shyam kumar

    Handling concurrent requests is not only limited by the number of processes. If the Concurrent Manager does not have enough processes defined to handle the number of jobs that are waiting to run, there will always be jobs on hold. The Concurrent Manager parameters should be modified to handle more concurrent requests simultaneously; this can be done in two steps:
    - Increase the number of target processes for the manager.
    - Change the cache size of the concurrent manager, as this determines how many requests will be evaluated by a manager at a time; it should match the target (process) value as set above.
    The number of processes for the manager can be increased on the Workshift screen. The recommended value for "Cache Size" is at least twice the number of processes defined for the manager. "Cache Size" controls the number of pending requests that the manager picks up when it awakens from its sleep cycle.
    You can also balance process workload by creating multiple work shifts for a concurrent manager, to regulate the number of operating system processes that the manager starts up at different times of the day and on different days of the week.
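    To see where the managers currently stand before changing anything, here is a hedged sketch (assuming an E-Business Suite environment where you can query the standard FND_CONCURRENT_QUEUES table as the APPS user):
    -- compare target processes, actual processes and cache size per manager
    select concurrent_queue_name,
           max_processes,      -- target processes
           running_processes,  -- actual processes
           cache_size
    from   fnd_concurrent_queues
    order  by concurrent_queue_name;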

  • How to increase no. of processes

    Dear friends,
    I need to increase the number of processes.
    Could anybody kindly tell me the procedure for increasing it?
    Thanks in advance.
    Prashant T.

    Do you use a pfile or an spfile? SYSDBA privileges are needed.
    To tell:
    SQL> show parameter spfile
    NAME TYPE VALUE
    spfile string $ORACLE_HOME/dbs/spfilehmdw10.ora
    If you get a value, you have an spfile.
    spfile:
    alter system set processes=151 scope=spfile;
    Stop and start the DB.
    pfile:
    Edit your init.ora file and set the PROCESSES parameter to the new value.
    Stop and start the DB.
    Note: The default values of the SESSIONS and TRANSACTIONS parameters are derived from this parameter. Therefore, if you change the value of PROCESSES, you should evaluate whether to adjust the values of those derived parameters.
    Message was edited by:
    Slater
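    A small addition to the note above (a sketch; in 10g, SESSIONS defaults to roughly (1.1 * PROCESSES) + 5, and TRANSACTIONS is in turn derived from SESSIONS): after the restart, it is worth confirming all three values together.
    SQL> show parameter processes
    SQL> show parameter sessions
    SQL> show parameter transactions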

  • WLI problem when processing a high number of records - SQLException: Data exception

    Hi
    I'm having some trouble with a process in WLI when processing a high number of records from a table. I'm using WLI 8.1.6 and Oracle 9.2.
    The exception I'm getting is:
    javax.ejb.EJBException: nested exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.
    I think the problem is not with the table because it's pretty simple. I'll describe the steps in the JPD below.
    1) A DBControl checks to see if the table has records with a specific value in a column.
    select IND_PROCESSADO from VW_EAI_INET_ESTOQUE where IND_PROCESSADO = 'N'
    2) If there are one or more records, we update the column to another value (in another DBControl)
    update VW_EAI_INET_ESTOQUE  set IND_PROCESSADO = 'E' where IND_PROCESSADO = 'N'
    3) We then start a transaction with following steps:
    3.1) A DBControl queries for records in a specific condition
    select
    COD_DEPOSITO AS codDeposito,
    COD_SKU_INTERNO AS codSkuInterno,
    QTD_ESTOQUE AS qtdEstoque,
    IND_ESTOQUE_VIRTUAL AS indEstoqueVirtual,
    IND_PRE_VENDA AS indPreVenda,
    QTD_DIAS_ENTREGA AS qtdDiasEntrega,
    DAT_EXPEDICAO_PRE_VENDA AS dataExpedicaoPreVenda,
    DAT_INICIO AS dataInicio,
    DAT_FIM AS dataFim,
    IND_PROCESSADO AS indProcessado
    from VW_EAI_INET_ESTOQUE
    where IND_PROCESSADO = 'E'
    3.2) We transform all the records found into an XML message (XQuery)
    3.3) We again update the same column as in #2, to another value.
    update VW_EAI_INET_ESTOQUE set IND_PROCESSADO = 'S' where IND_PROCESSADO = 'E'.
    4) The process ends.
    When the table has few records under the specified condition, the process works fine. But if we test it with 25000 records, the process fails with the exception below - sometimes in step 3.1 and other times in step 3.3.
    Can someone help me please?
    Exception:
    <A message was unable to be delivered from a WLW Message Queue.
    Attempting to deliver the onAsyncFailure event>
    <23/07/2007 14h33min22s BRT> <Error> <EJB> <BEA-010026> <Exception occurred during commit of transaction
    Xid=BEA1-00424A48977240214FD8(12106298),Status=Rolled back. [Reason=javax.ejb.EJBException: nested
    exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.],numRepliesOwedMe=0,numRepliesOwedOthers=0,seconds since begin=118,seconds left=59,
    XAServerResourceInfo[JMS_cgJMSStore]=(ServerResourceInfo[JMS_cgJMSStore]=(state=rolledback,assigned=cgServer),xar=JMS_cgJMSStore,re-Registered=false),
    XAServerResourceInfo[weblogic.jdbc.wrapper.JTSXAResourceImpl]=(ServerResourceInfo[weblogic.jdbc.wrapper.JTSXAResourceImpl]=(state=rolledback,assigned=cgServer),xar=weblogic.jdbc.wrapper.JTSXAResourceImpl@d38a58,re-Registered=false),
    XAServerResourceInfo[CPCasaeVideoWISDesenv]=(ServerResourceInfo[CPCasaeVideoWISDesenv]=(state=rolledback,assigned=cgServer),xar=CPCasaeVideoWISDesenv,re-Registered=false),
    SCInfo[integrationCV+cgServer]=(state=rolledback),
    properties=({weblogic.jdbc=t3://10.15.81.48:7001, START_AND_END_THREAD_EQUAL=false}),
    local properties=({weblogic.jdbc.jta.CPCasaeVideoWISDesenv=weblogic.jdbc.wrapper.TxInfo@9c7831, modifiedListeners=[weblogic.ejb20.internal.TxManager$TxListener@9c2dc7]}),
    OwnerTransactionManager=ServerTM[ServerCoordinatorDescriptor=(CoordinatorURL=cgServer+10.15.81.48:7001+integrationCV+t3+,
    XAResources={JMS_FileStore, weblogic.jdbc.wrapper.JTSXAResourceImpl, JMS_cgJMSStore, CPCasaeVideoWISDesenv},NonXAResources={})],CoordinatorURL=cgServer+10.15.81.48:7001+integrationCV+t3+): javax.ejb.EJBException: nested exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.
            at com.bea.wlw.runtime.core.bean.BMPContainerBean.ejbStore(BMPContainerBean.java:1844)
            at com.bea.wli.bpm.runtime.ProcessContainerBean.ejbStore(ProcessContainerBean.java:227)
            at com.bea.wli.bpm.runtime.ProcessContainerBean.ejbStore(ProcessContainerBean.java:197)
            at com.bea.wlwgen.PersistentContainer_7e2d44_Impl.ejbStore(PersistentContainer_7e2d44_Impl.java:149)
            at weblogic.ejb20.manager.ExclusiveEntityManager.beforeCompletion(ExclusiveEntityManager.java:593)
            at weblogic.ejb20.internal.TxManager$TxListener.beforeCompletion(TxManager.java:744)
            at weblogic.transaction.internal.ServerSCInfo.callBeforeCompletions(ServerSCInfo.java:1069)
            at weblogic.transaction.internal.ServerSCInfo.startPrePrepareAndChain(ServerSCInfo.java:118)
            at weblogic.transaction.internal.ServerTransactionImpl.localPrePrepareAndChain(ServerTransactionImpl.java:1202)
            at weblogic.transaction.internal.ServerTransactionImpl.globalPrePrepare(ServerTransactionImpl.java:2007)
            at weblogic.transaction.internal.ServerTransactionImpl.internalCommit(ServerTransactionImpl.java:257)
            at weblogic.transaction.internal.ServerTransactionImpl.commit(ServerTransactionImpl.java:228)
            at weblogic.ejb20.internal.MDListener.execute(MDListener.java:430)
            at weblogic.ejb20.internal.MDListener.transactionalOnMessage(MDListener.java:333)
            at weblogic.ejb20.internal.MDListener.onMessage(MDListener.java:298)
            at weblogic.jms.client.JMSSession.onMessage(JMSSession.java:2698)
            at weblogic.jms.client.JMSSession.execute(JMSSession.java:2610)
            at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
            at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)
    Caused by: javax.ejb.EJBException: nested exception is: java.sql.SQLException: Data exception -- Data -- Input Data length 1.050.060 is greater from the length 1.048.576 specified in create table.
            at com.bea.wlw.runtime.core.bean.BMPContainerBean.doUpdate(BMPContainerBean.java:2021)
            at com.bea.wlw.runtime.core.bean.BMPContainerBean.ejbStore(BMPContainerBean.java:1828)
            ... 18 more

    Hi Lucas,
    The following is information regarding the issue you are getting, which might help you to resolve it.
    ADAPT00519195- Too many selected values (LOV0001) - Select Query Result operand
    For XIR2 Fixed Details-Rejected as this is by design
    I have found that this is a limitation by design: when the values exceed 18000, we get this error in BO.
    There is no fix for this issue, as it's by design. The product has always behaved in this manner.
    Also, an ER (ADAPT00754295) for this issue has already been raised.
    Unfortunately, we cannot confirm if and when this Enhancement Request will be taken on by the developers.
    A dedicated team reviews all ERs on a regular basis for technical and commercial feasibility and whether or not the functionality is consistent with our product direction. Unfortunately we cannot presently advise on a timeframe for the inclusion of any ER in our product suite.
    The product group will then review the request and determine whether or not the functionality/feature will be included in a future release.
    Currently I can only suggest that you check the release notes in the ReadMe documents of future service packs, as it will be listed there once the ER has been included.
    The only workaround which I can suggest for now is:
    Workaround 1:
    Test the issue by keeping the value of the MAX_Inlist_values parameter at 256 at the designer level.
    Workaround 2:
    The best solution is to combine 'n' queries via a UNION. You should first highlight the first 99 or so entries from the LOV list box and then combine this query with a second one that selects the remaining LOV choices.
    Using UNION between queries is the only possible workaround.
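    As a rough illustration of workaround 2 (the view and column names are borrowed from the question earlier in this thread; the IN-list values are hypothetical), split the selected values into batches and combine the batches with UNION:
    select cod_sku_interno, qtd_estoque
    from   vw_eai_inet_estoque
    where  cod_sku_interno in (1001, 1002, 1003)   -- first batch of LOV choices
    union
    select cod_sku_interno, qtd_estoque
    from   vw_eai_inet_estoque
    where  cod_sku_interno in (1004, 1005, 1006);  -- remaining choices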
    Please do let me know if you have any queries related to the same.
    Regards,
    Sarbhjeet Kaur

  • Error in BPM - Error when processing node '0000000065' ParForEach index

    Hi All,
    I have an issue. I have done a 1:n mapping successfully and would like to place the Send step in a loop instead of a block, the reason being that I have the count of how many times the Send step should be executed for the multiline. I need to receive an acknowledgement for each Send step; if I don't receive the required number of acknowledgements, then I need to revert some creations, which is a business requirement.
    So I have initialized a container operation variable i to '0', and the loop condition is i < count. I send the multi-container and receive the response; if I don't receive the desired response for any one of the multiline items, then I need to do a cancellation process in a loop again. Now I am getting the exception "Error when processing node '0000000065' (ParForEach index 000000)
    Message no. SWP088" in the loop step.
    It is fine if somebody can suggest alternate logic, but my first preference is a loop, which consumes fewer system resources.
    Kindly look into the Issue
    Regards,
    Raj
    Edited by: raj2112 on Sep 21, 2010 2:34 PM

    If you use a loop step, you will send one message at a time; using a block step you get parallel processing via ParForEach. Now, if for some reason the item cannot be created, you do a rollback in the target system; the problem here is that you will do the rollback only once the last message arrives.
    The other possibility is to handle an application ack in the sender step. It will let you know whether the message was processed successfully, and this ack could be the end condition of your block step. But you cannot use this with a loop step.
    Take a look at this:
    http://help.sap.com/saphelp_nw04/helpdata/en/55/65c844539349e9b1450581ab44a5e6/frameset.htm

  • Error when processing Network

    Dear all.
    I create a sales order in t-code VA01 and input WBS element E-000009.1 in the WBS Element field on the Account assignment tab of one item's details. When I save the sales order, SAP pops up a message box with the text "Sales order has no CO object"; the detailed message content is pasted at the bottom. When I confirm the first message box, SAP pops up a second message box with the text "Error when processing Network"; its detailed content is also pasted at the bottom.
    So I need the experts to give me some useful advice and references to resolve the problem.
    Thank you.
    Regards
    Yoda
    Sales order has no CO object
    Message no. CO323
    Diagnosis
    The production order should be settled on the sales order.
    The requirements type of the sales order, however, refers to an account assignment category that does not allow settlement using the sales order (key consumption posting).
    Procedure
    1. Check which requirements type is entered in the relevant sales order item
    Display sales order
    2. Check the settings for this requirements type in Customizing
    (See Control sales order-related production).
    Error when processing Network
    Message no. V1380
    Diagnosis
    A technical error has occurred. On calling up the assembly interface, exception 2 was triggered. The exceptions have the following meanings:
    1 = External block
    2 = General error
    3 = Insufficient data for the interface
    4 = Order was not found
    5 = Update has been rejected
    6 = Final document number for Network is not issued
    Procedure
    Inform your system administrator.

    Dear Shashank,
    Thanks for your reply.
    1. I have checked the sales order, and the requirements type was KMNP. In transaction OVZH the requirements type KMNP uses requirements class 202. In transaction OVZG I checked requirements class 202: the account assignment category was Q and the consumption posting was P.
    2. In transaction CJ20N I checked the WBS element. Its operative indicator was 'billing element'.
    3. In transaction MM03 I tried to find the field "req type / strategy", but I could not find it. Could you tell me which view the "req type / strategy" field is in?
    Thank you so much.
    Regards
    Yoda

  • Error for the fact table while processing the cube - attribute key cannot be found when processing

    Please help, as I am new to SSAS and this is an urgent requirement. This is a MOLAP cube, and below is the error that I am receiving when processing the cube. The cube is set to Process Full. Several similar errors pop up for various dimensions.
    "Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'Fact_Table', Column: 'ID', Value: '1'. The attribute is 'Id'. Errors in the OLAP storage engine: The attribute key was converted to an unknown member because
    the attribute key was not found. Attribute Id of Dimension: 17 - Ves - PoC Cont from Database: DB, Cube: IPNCube, Measure Group: iSrvy, Partition: Partition1, Record: 1."
    Thanks in advance.

    Thanks for the recommendations, David.
    It would be really great if you could clear up some of my doubts:
    To my knowledge, all the dimensions need to be processed first, and then the fact table will be processed.
    So if the IDs are not present in the dimension tables, then they should not be present in the fact table either.
    Here we found null values in the dimension table while the IDs were present in the fact table. What might be the reasons causing such a situation?
    Also, how frequently does the cube need to be processed? Currently the ETL which processes the cube is scheduled as a SQL Server Agent job on an hourly basis every day.
    Is there any possibility that the cube might be in a processing state while the SQL job for the next run gets executed, trying to access and process the cube while it is still processing?
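    A diagnostic sketch for that situation (Fact_Table and the ID column come from the error message above; Dim_Table is a hypothetical stand-in for your dimension table): list the fact keys that have no matching dimension row, since those are exactly the values SSAS converts to unknown members.
    SELECT f.ID, COUNT(*) AS fact_rows
    FROM   Fact_Table AS f
    LEFT JOIN Dim_Table AS d   -- hypothetical dimension table name
           ON d.ID = f.ID
    WHERE  d.ID IS NULL
    GROUP  BY f.ID;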

  • Error FS206 Down pmnts w/ taxes not permitted when processing with jur.code

    Issue: I am encountering a problem entering taxes when applying down payments received from customers, since we are using a jurisdiction code taxation procedure. This process works as desired in Europe for our firm. I found OSS note 97288, which is not valid for version ECC 6.0. Please assist, as our US go-live is in 2 weeks.
    Process: Create a down payment request for cash in advance, which is required to be taxed at this point. The billing document creates a noted item in the G/L (special G/L). I apply the customer's payment via transaction F-29 and receive the error "FS206 Down pmnts with taxes are not permitted when processing with jur.code."
    Settings: 1) The alternative reconciliation account for the special G/L posting has tax category B, "Output tax - down payments managed gross", on the G/L account. 2) Transaction OBXB specifies the tax clearing account for transaction key MVA. This G/L account has the same tax category setting (B). My settings work perfectly for Europe, since there tax is not calculated via jurisdiction codes.
    Thank you in advance!

    Check the G/L master of the alternative reconciliation account. Ideally there should be no setting in the tax fields, as no tax is calculated on the down payment but only on the final amount. Leave the tax field blank and do not check-mark 'posting without tax allowed'.

  • Execute subroutine only when selection parameter changes

    Hi ABAP workers,
    I have a block of selection parameters, and I created the event AT SELECTION-SCREEN ON BLOCK bl1 with a subroutine. I want that subroutine to be executed only when a parameter in the selection block is changed by the user. But the behaviour right now is that it executes every time I press Enter, even if no parameter changes.
    Is there any way to achieve this (like in the module pool case, with the addition "ON REQUEST")?
    Thank you very much
    Ivson
    Code involved:
    SELECTION-SCREEN BEGIN OF BLOCK bl1 WITH FRAME TITLE text-001.
    PARAMETERS: p_bukrs LIKE csks-bukrs MEMORY ID buk OBLIGATORY,
                p_ryear LIKE glpct-ryear OBLIGATORY.
    SELECT-OPTIONS: s_poper  FOR glpct-rpmax,
                    s_racct  FOR glpct-racct,
                    s_kunnr  FOR glpca-kunnr,
                    s_lifnr  FOR glpca-lifnr,
                    s_sprctr FOR glpct-sprctr.
    SELECTION-SCREEN END OF BLOCK bl1.
    AT SELECTION-SCREEN ON BLOCK bl1.
      PERFORM preselect.

    Hi,
    You could try this.
    Use the FM 'DYNP_VALUES_READ' to get the contents of the screen parameter, and then check the parameter value inside the subroutine using an IF statement.
    PNAME is the parameter name here.
    data: dynpfields type table of dynpread with header line,
          repid      like sy-repid.
    dynpfields-fieldname = 'PNAME'.   " screen field to read
    append dynpfields.
    repid = sy-repid.
    call function 'DYNP_VALUES_READ'
      exporting
        dyname     = repid
        dynumb     = sy-dynnr
      tables
        dynpfields = dynpfields
      exceptions
        others     = 1.
    read table dynpfields index 1.
    pname = dynpfields-fieldvalue.    " current value of the parameter
    Process the subroutine if needed based on the check condition.
    Hope this helps you.
    Regards,
    Subbu

  • A duplicate attribute key has been found when processing

    Hi
    When I process one of my dimensions it fails and I get the following error:
    Errors in the OLAP storage engine: A duplicate attribute key has been found when processing: Table: 'Customers', Column: 'DisplayName', Value: 'Stephen Grant'. The attribute is 'Display Name'.
    I don't know if this is significant, but the attribute to which it refers was added through BIDS 2008 (the cube was originally created with BIDS 2005).
    There are no duplicates of 'Stephen Grant' in the DisplayName column. Not that it should matter if there were, as this attribute has a cardinality of Many, with a rigid attribute relationship directly to the dimension's key attribute. The key column for the Display Name attribute simply refers back to the same (DisplayName) column in the table.
    If I delete this record, or even just update the DisplayName field from 'Stephen Grant' to something else, the dimension processes just fine. I can't work out what it is about this record that is stopping the dimension from being able to process.
    Can anyone help me figure out what's going on?
    Julia.
    P.S. I am using SSAS 2008 on Windows Server 2008
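    One thing worth checking before deleting the record (a diagnostic sketch against the Customers table and DisplayName column named in the error): values that differ only in case look identical to a case-insensitive database but can still collide as attribute keys during processing.
    SELECT UPPER(LTRIM(RTRIM(DisplayName))) AS folded_name,
           COUNT(DISTINCT DisplayName COLLATE Latin1_General_BIN) AS variants
    FROM   Customers
    GROUP  BY UPPER(LTRIM(RTRIM(DisplayName)))
    HAVING COUNT(DISTINCT DisplayName COLLATE Latin1_General_BIN) > 1;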

    Hello there,
    Does anybody have the complete resolution for the duplicate entries data? I am showing the data below.
    RegionName                 | BranchName                   | LoanOfficerFullName
    PRIME - TEAM NORTHEASTERN  | CALIFORNIA                   | ALONA HAYES
    PRIME - TEAM WESTERN       | RENO-DAMONTE RANCH           | VINCENT LOTITO
    PRIME - TEAM WARTON        | NEWARK (Chicago)             | CHRISTOPHER CLIFTON
    PRIME - TEAM WARTON        | GRAND HAVEN (West Michigan)  | SEAN FOLEY
    PRIME - TEAM SALMANS       | LAWTON (Amarillo)            | THOMAS STEARNS
    PRIME - TEAM BARTON        | CALIFORNIA                   | ALONA HAYES
    PRIME - TEAM ROBINSON      | LAS VEGAS                    | ANGELA DEATON
    PRIME - TEAM ROBINSON      | LAS VEGAS                    | HERMAN VANDER VELDT
    PRIME - TEAM ROBINSON      | LAS VEGAS                    | DAWN ROBINSON
    PRIME - TEAM ROBINSON      | LAS VEGAS                    | MICHAEL BIRK
    Here I have the hierarchy in the order LoanOfficerFullName --> BranchName --> RegionName, with Loan Officer as the lowest level. I created another column with unique entries so that I could use it as the key, and I linked the hierarchies with the key column value. The issue is that I am getting the expected result in OLAP from the data side, but the data is not getting aggregated: for example, Team Barton comes up twice even though I set the dimension's IsAggregatable property to True. If anybody can help me, I will appreciate it. If any other clarification is needed, I can provide it.

  • Errors in the OLAP storage engine: The attribute key cannot be found when processing

    This is the absolute worst error message in all of computing. I despise it. Here is my situation.
    SSAS 2008 R2.
    I have one dimension. I have not even built my cube yet - only a dimension - and I am trying to process it. I can process it when I only have a single attribute, the key (it is a composite key). When I add a new attribute (integer), I get the error message. There are no null values, and there are no blanks as it's an integer.
    The attribute key cannot be found where? I'm processing the dimension; there is not even a cube yet in which any key could be found or not.

    Hi Baracus,
    According to your description, you get the error "Errors in the OLAP storage engine: The attribute key cannot be found when processing" when processing your dimension, right?
    Generally, the detailed error message should look like:
    Table: 'dbo_FactSales', Column: 'ProductID', Value: '1111'. The attribute is 'Product ID'
    The above error explains that the fact table named "FactSales" contains a ProductID column with value "1111", but the same ProductID is not present in your dimension table. There is a primary key - foreign key relationship between the ProductID column of the dimension table and the fact table named "FactSales", and the cube is unable to find ProductID with value 1111 in the dimension table.
    In that case, what we need to do is check whether the dimension and fact tables contain the value mentioned in the error message (Value: '1111' in the above example). Here are some links about troubleshooting this issue, please see:
    http://www.businessintelligence-solutions.com/ssas-typical-error-attribute-key-processing/
    http://www.youtube.com/watch?v=5O7IAjvtAF4
    If this is not what you want, please provide us more information about your issue, so that we can make further analysis.
    Regards,
    Charlie Liao

  • Errors in the OLAP storage engine: A duplicate attribute key has been found when processing

    Hi dear MSDN Community,
    I am facing a problem while processing a cube with a customer hierarchy as follows:
    Global Account --> Main Customer --> Master Customer --> Customer
    The data comes from a flattened parent-child table; that is, I create an extra column for every level of the hierarchy in the customer view. If a level is empty, then the value is filled with the previous value. Then I can use the property:
    HideMemberIf = OnlyChildWithParentName for the intermediate levels (Main and Master Customer)
    HideMemberIf = ParentName for the leaves (Customer)
    HideMemberIf = Never for the root (Global Account)
    Consider this example:
    [screenshot of the example hierarchy omitted]
    For the root level I am using the highlighted fields as the key in order to avoid duplicates. However, I am getting the error message "Errors in the OLAP storage engine: A duplicate attribute key has been found when processing" while processing.
    I analyzed the query that SSAS issues to the server (select distinct ...) and I think it should work, but it is still failing.
    I had similar problems with the intermediate levels, but I was able to solve them using a similar procedure.
    Any help will be appreciated.
    Kind Regards.

    When are you having this error? While processing the dimension or during cube processing?
    http://blog.oraylis.de/2013/08/a-duplicate-attribute-key-has-been-found-during-processing-revisited/
    Visakh

  • Errors in OLAP storage engine when processing application

    Hi
    After changing logic in the application, we processed the application, but it did not complete successfully due to the following error:
    Error message:: CreateOLAPCubeForApplication:CreateCube:Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'dbo_tblFactQuality', Column: 'QACCOUNT', Value: '%ACCOUNTS%'. The attribute is 'QAccount_ID1'.
    When trying other applications, they showed errors referring to the measure group, e.g.:
    Error message:: Errors in the metadata manager. No dimension relationships exist within the 'Ownership' measure group.
    Thanks for your help.
    Melanie

    Hi,
    This kind of error occurs when you have an invalid member in your fact table. You can use the following SQL query to check it:
    select * from tblfact"yourapplication" where "dimension" not in (select id from mbr"dimension" where calc = 'N')
    The same query needs to be run against the WB and FAC2 tables.
    These selections should return 0 records.
    If a query returns something, then you have to delete those records (replace "select *" with "delete").
    You can run this for all the applications and all the dimensions. However, I believe you can check your Quality application and the QACCOUNT dimension first.
    You can then see for which dimension there is an invalid member. This member might be getting created through one of your script logics.
    Hope this helps.
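    Instantiated for the example in this thread (a sketch; the actual table and member-table names follow the pattern above but depend on your application set), the check for the Quality application and the QACCOUNT dimension would look like:
    select *
    from   tblFactQuality
    where  QACCOUNT not in (select id from mbrQACCOUNT where calc = 'N');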
