BAPI_REQUIREMENTS_CHANGE: Requirement blocked

Hi gurus,
   I ran BAPI_REQUIREMENTS_CHANGE and it returned the error "Requirement blocked". Now when I open MD61, that material shows the same "Requirement blocked" error. What happened? Please tell me how to fix it, and anything else I should know about BAPI_REQUIREMENTS_CHANGE.
  Thanks to all.

Hi,
Use the following code after calling your BAPI:
DATA ls_return TYPE bapiret2.

* Commit the BAPI's changes; WAIT = 'X' waits until the update task finishes
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait   = 'X'
  IMPORTING
    return = ls_return.

IF ls_return-type <> 'E'.
* success
ELSE.
* failed
ENDIF.

* Releases locked objects
CALL FUNCTION 'DEQUEUE_ALL'.

Similar Messages

  • Error- Requirements blocked while demand transfer

    Hello Experts,
    I am transferring demand from DP to ECC using a transfer profile.
    While the transfer happens, I get the error 'Requirements blocked' (error code 6P012) on the SCM queue manager.
    The requirement strategy is correctly maintained in the material master in ECC.
    After analysing the function module REQUIREMENTS_MAINTAIN_RELEVANC, we understood that there is a check on the 'classification' tab in the material master; if its status is not right (green), the data is blocked.
    However, even with the classification tab well maintained, we are facing this issue.
    I have tried searching SAP notes for the error code and the issue, but without much help.
    Any pointer would be of great help.
    Regards
    Manotosh

    Hi All,
    We got the solution. The issue is in the 'classification' details in the ECC material master: everything should be green on the 'classification' tab, and the aggregation level for the DP job should be product + location.
    Regards
    Manotosh

  • Unable to Go to Required block

    Hi,
    I want to pass the block name to the GO_BLOCK built-in via a character variable, but it does not work; if I hard-code the same block name, it works fine.
    For instance:
    GO_BLOCK(:global.block_name); -- does not work, and generates no error
    GO_BLOCK('yearly_exp_table'); -- this works fine
    Is there anything wrong? How can I handle this situation?
    Thanks, Khawar.

    Hi,
    Basically I have three blocks: one is a control block and two are database data blocks. I used a POST-BLOCK trigger to find the cursor position in a block, and then the GO_BLOCK and GO_RECORD built-ins to delete that record, but it did not work as I expected. I wrote the same code in a WHEN-MOUSE-CLICK trigger and now it works fine. If you have any suggestions, please share them.
    Thanks, Khawar

  • Requirement blocked

    Hi,
    I am transferring my forecast from DP to R/3. I scheduled the job successfully and the spool says successful, but I see no requirements in MD63. I checked the outbound queue of APO in SMQ1 and found the following block.
    The block shows "SYSFAIL", and when I checked the details, it says "Requirement blocked".
    Any idea what this could mean?
    Thanks.

    Hi Raj,
    The requirements type determines the planning strategy of the products in R/3. An entry in this field is optional.
    You do not need to make an entry in this field if the product master records in R/3 contain an MRP group and if each of these MRP groups has been assigned a strategy group in R/3 Customizing. The system determines the requirements type from the main planning strategy that is contained in each strategy group.
    You can also enter the requirements type from an alternative planning strategy in the same strategy group.
    Let me know your settings for the products in R/3.
    regards
    umamahesh

  • Block creation problem in member formula

    Hi,
    I have a member formula as below :
    IF(.....)
    x = a/b;
    ENDIF
    where a,b have their own member formulas.
    There is no issue with the calculations of x,a or b, but I am facing an issue with block creation of x.
    Is there a way to handle block creation through member formulas? I know the way out through calc scripts, but I am trying to limit the use of calc scripts.
    Thanks!
    Note: The storage property of all three members is Store. Also, a and b lie in the same intersection (same block), and x lies in another intersection.

    In the following section of the dbag: http://docs.oracle.com/cd/E12825_01/epm.111/esb_dbag/frameset.htm?dcaoptcs.htm...
    Under the sub-section: In Equations in a Dense Dimension...
    It states as follows:
    In Equations in a Dense Dimension
    When you use a cross-dimensional operator in an equation in a dense dimension, Essbase does not automatically create the required blocks if both of these conditions apply:
    Resultant values are from a dense dimension.
    The operand or operands are from a sparse dimension.
    You can use the following techniques to create the blocks and avoid the performance issue.
    Ensure that the results members are from a sparse dimension, not from a dense dimension. In this example, the results member Budget is from a sparse dimension:
    FIX(Sales)
    Budget = Actual * 1.1;
    ENDFIX
    FIX(Expenses)
    Budget = Actual * .95;
    ENDFIX
    Use the DATACOPY calculation command to create and then calculate the required blocks. See Using DATACOPY to Copy Existing Blocks.
    Use a member formula that contains the dense member equations:
    FIX(Sales, Expenses)
    Budget (Sales = Sales -> Actual * 1.1;
    Expenses = Expenses -> Actual * .95;)
    ENDFIX
    The phrasing of that last part would lead me to believe that block creation via a member formula is possible, though I'm not sure of the member to which that formula should be applied. Furthermore, the code looks funky in that we're fixing on Sales and on Expenses, and then they appear on the left side of the equation within the FIX.
    Thoughts anyone?

  • Delivery block in the schedule lines.

    Hi,
    Mine is an MTO scenario. I am using planning strategy 50, so a planned order (then a production order) is created by the MRP run for each sales order.
    When I place a delivery block at the header, the system works fine: it deletes the planned order (converted production order).
    When I place a delivery block at the schedule line, it does not work: it does not delete the underlying planned order or production order.
    What configuration should I do to make this work?

    Hi,
    Look at the help (press F1) on the field Confirmation block at SPRO -> Sales and Distribution -> Basic Functions -> Availability Check and Transfer of Requirements -> Transfer of Requirements -> Block Quantity Confirmation in Delivery Blocks -> Reasons for and Scope of Delivery Blocks: Transfer of Req. Blocks.
    It looks like applying a delivery block at schedule line level will not delete the passed requirements. This is standard SAP behavior; if you want to change it, you may need to use a user exit.
    Following is description given for the field confirmation block;
    Confirmation block
    Indicates whether the system, in addition to blocking delivery, also blocks the confirmation of order quantities after an availability check during sales order processing.
    Example
    You may want to block confirmation of sales orders where the creditworthiness of the customer is in question. In this case, you set the confirmation block for the delivery block that relates to credit problems. During sales order entry, when you enter a delivery block because of credit problems, the system, after you save the order, does not confirm any quantities for delivery. In this way, the goods remain available for other customers.
    Note
    If, before you save a sales order, you look at the schedule line overview, you can see what the system would confirm, if the block were not set. However, as soon as you save the sales order, the confirmed quantities are automatically reset to zero.
    Dependencies
    If you enter the delivery block at header level, the system transfers the desired delivery quantity for all schedule lines in the requirement. The confirmed quantity is deleted. This function is not available when you enter the delivery block at schedule line level.
    Regards

  • Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).

    Hi,
    Our environment is Essbase 11.1.2.2, working with the Essbase, EAS, and Shared Services components. One of our users tried to run the calc script of one application and faced this error:
    Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).
    I did some searching and found that we need to add something to the essbase.cfg file, as below.
    1012704 Dynamic Calc processor cannot lock more than number ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
    Possible Problems
    Analytic Services could not lock enough blocks to perform the calculation.
    Possible Solutions
    Increase the number of blocks that Analytic Services can allocate for a calculation:
    Set the maximum number of blocks that Analytic Services can allocate to at least 500. 
    If you do not have an $ARBORPATH/bin/essbase.cfg file on the server computer, create one using a text editor.
    In the essbase.cfg file on the server computer, set CALCLOCKBLOCKHIGH to 500.
    Stop and restart Analytic Server.
    Add the SET LOCKBLOCK HIGH command to the beginning of the calculation script.
    Set the data cache large enough to hold all the blocks specified in the CALCLOCKBLOCKHIGH setting. 
    Determine the block size.
    Set the data cache size.
    Actually, in our server config file (essbase.cfg) we don't have the data below added:
    CalcLockBlockHigh 2000
    CalcLockBlockDefault 200
    CalcLockBlockLow 50
    So my doubt is: if we edit the essbase.cfg file, add the above settings, and restart the services, will it work? And if so, why should we change the server config file when the problem is with one application's calc script? Please guide me how to proceed.
    Regards,
    Naveen

    Your calculation needs to hold more blocks in memory than your current set up allows.
    From the docs (quoting so I don't have to write it, not to be a smarta***):
    CALCLOCKBLOCK specifies the number of blocks that can be fixed at each level of the SET LOCKBLOCK HIGH | DEFAULT | LOW calculation script command.
    When a block is calculated, Essbase fixes (gets addressability to) the block along with the blocks containing its children. Essbase calculates the block and then releases it along with the blocks containing its children. By default, Essbase allows up to 100 blocks to be fixed concurrently when calculating a block. This is sufficient for most database calculations. However, you may want to set a number higher than 100 if you are consolidating very large numbers of children in a formula calculation. This ensures that Essbase can fix all the required blocks when calculating a data block and that performance will not be impaired.
    Example
    If the essbase.cfg file contains the following settings:
    CALCLOCKBLOCKHIGH 500
    CALCLOCKBLOCKDEFAULT 200
    CALCLOCKBLOCKLOW 50
    then you can use the following SET LOCKBLOCK setting commands in a calculation script:
    SET LOCKBLOCK HIGH; 
    means that Essbase can fix up to 500 data blocks when calculating one block.
    Support doc is saying to change your config file so those settings can be made available for any calc script to use.
    On a side note, if this was working previously and now isn't, it is worth investigating whether this is simply standard growth or a recent change that has had an unexpectedly significant impact.
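    The support note's sizing rule (make the data cache large enough to hold all the blocks allowed by CALCLOCKBLOCKHIGH) can be sketched as a quick back-of-the-envelope calculation. The sketch below is illustrative only: the 8-bytes-per-cell stored block size and the example dense-dimension member counts are assumptions, not values taken from this thread.

```python
# Rough Essbase BSO sizing sketch (illustrative assumptions).
# A stored block holds one 8-byte cell per combination of
# stored members across the dense dimensions.

def block_size_bytes(dense_member_counts):
    """Expanded block size: product of dense member counts times 8 bytes."""
    size = 8
    for count in dense_member_counts:
        size *= count
    return size

def min_data_cache_bytes(calclockblock_high, dense_member_counts):
    """The data cache must hold at least CALCLOCKBLOCKHIGH blocks."""
    return calclockblock_high * block_size_bytes(dense_member_counts)

# Example: two dense dimensions with 12 and 40 stored members.
block = block_size_bytes([12, 40])           # 12 * 40 * 8 = 3840 bytes
cache = min_data_cache_bytes(500, [12, 40])  # 500 blocks -> 1,920,000 bytes
print(block, cache)
```

    If the computed minimum exceeds your current data cache setting, that is the "small data cache" case the error message warns about.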

  • Block MRP for Sale order

    HI friends,
    My client has following requirement.
    We are using an MTO scenario. After a sales order is created, it is checked by the senior sales manager; only after he approves the sales order should it be possible to run MRP. How can I achieve this?
    REgards,
    Chetan.

    Dear Chetan
    Please check SPRO->Sales and Distribution->Basic Functions->Availability Check and Transfer of Requirements->Transfer of Requirements->Block Quantity Confirmation In Delivery Blocks
    There, in the second selection, Reasons for and Scope of Deliv. Blocks: Transfer of Req. Block,
    you can set a confirmation block for a particular delivery block.
    It says
    Block Quantity Confirmation In Delivery Blocks
        When requirements are transferred to MRP, the confirmed quantity is also  reserved for confirmed sales documents. If a transaction is blocked for delivery, the required stock will be blocked so it cannot be used elsewhere. To prevent this, you can block the transfer of requirements for a delivery block in this step.
    In this case, the ordered quantity will still be transferred to MRP as a requirement, but the quantity will not be reserved. This is apparent in the document when no confirmed quantities are available after saving.
        When the block is removed, the system automatically carries out an availability check.
    So you can set the delivery block in the sales order either via VOV8, or enter it in VA02 via Edit -> Fast Change -> Delivery block; together with the Confirmation checkbox, requirements will not be transferred.
    Please check and confirm whether this meets your requirement.
    Regards
    Jitesh

  • Slow calc time with SET CREATEBLOCKONEQ OFF for block creation

    Hello everyone,
    I have a problem with the slow execution of one of my calc scripts:
    A simplified version of my calc script to calculate 6 accounts looks like this:
    SET UPDATECALC OFF;
    SET FRMLBOTTOMUP ON;
    SET CREATEBLOCKONEQ ON;
    SET CREATENONMISSINGBLK ON;
    FIX (
    FY12,
    "Forecast",
    "Final",
    @LEVMBRS("Cost Centre",0),
    @LEVMBRS("Products",0),
    @LEVMBRS("Entities",0))
    SET CREATEBLOCKONEQ OFF;
    "10000";"20000";"30000";"40000";"50000";"60000";
    SET CREATEBLOCKONEQ ON;
    ENDFIX
    The member formula for each of the accounts is relatively complex. One of the changes recently implemented for the FIX was opening up the Cost Centre dimension. Since then the calculation runs much slower (>1h). If I change the setting to SET CREATEBLOCKONEQ ON, the calculation is very fast (1 min); however, no blocks are created. I am looking for a way to create the required blocks and calculate the member formulas while decreasing calc time. Does anybody have any idea what to improve?
    Thanks for your input
    p.s. DataStorage in the member properties for the above accounts is Never Share

    MattRollings wrote:
    If the formula is too complex it tends not to aggregate properly, especially when using ratios in calculations. Using stored members with member formulas I have found is much faster, efficient, and less prone to agg issues - especially in Workforce type apps. We were experiencing that exact problem, hence stored members.
    So why not break it up into steps? Step 1: force the calculation of the lower-level member formulas, whatever they are. Make sure that works. Then take the upper-level members (whatever they are) and make them dynamic. There's nothing that says you must make them all stored. I try, wherever possible, to make as much dynamic as possible. As I wrote, sometimes I can't for calc-order reasons, but as soon as I get past that I let the "free" dense dynamic calcs happen wherever I can. Yes, the number of blocks touched is the same (maybe), but it is still worth a shot.
    Also, you mentioned in your original post that the introduction of the FIX slowed things down. That seems counter-intuitive from a block count perspective. Does your FIX really select all level zero members in all dimensions?
    Last thought on this somewhat overactive thread (you are getting a lot of advice; who knows, maybe some of it is good ;) ) -- have you tried flipping the member calcs on their heads, i.e., taking what is an Accounts calc and making it a Forecast calc with cross-dims to match? You would have different, but maybe more manageable, block-creation issues at that point.
    Regards,
    Cameron Lackpour

  • WebLogic Portal (10.3.0) requires feature "com.m7.nitrox (1.0.20)" (?)

    (I'm re-posting as WebLogic Portal (10.3.0) requires feature "com.m7.nitrox (1.0.20)" (?))
    Greetings. I just installed WebLogic Portal 10.3 and I'm trying to install a couple of plug ins to Workshop/Eclipse, but whenever I select any item to install I get this error:
    WebLogic Portal (10.3.0) requires feature "com.m7.nitrox (1.0.20)", or compatible.
    Everything else seems to be working fine, so I'm not sure why my installation is giving me this issue. I've done a bit of digging already and haven't found what may be causing this issue, and I'm hoping someone here may be able to point me in the right direction.
    Thanks!

    This is a known issue with WLP 10.3; it has been addressed for the next release of WLP. You could contact support to try to get a patch created (reference CR379999).
    I can think of 2 possible workarounds:
    1. Download the new plugin manually, and either a) create an extension location on the file system from it and add that via Help|Software Updates|Manage Configuration, or b) extract it to one of the workshop eclipse folders (i.e. tools/eclipse_pkgs/2.0/eclipse_3.3.2, tools/eclipse_pkgs/2.0/pkgs/eclipse, workshop_10.3/workshop4WP/eclipse, wlportal_10.3/eclipse) and restart Workshop.
    2. Edit <beadir>/wlportal_10.3/eclipse/features/com.bea.wlp_10.3.0/feature.xml, comment out the <import feature="com.bea.*"/> lines in the <requires> block, restart Workshop, and try again.
    Greg

  • Sales Order Blocked for MRP

    Dear All
    I have a make-to-order scenario in which a newly created sales order should be blocked for MRP by default; only an authorized person should release the block.
    Please suggest how to do this.
    Regards

    Hi Sugunan,
    You can use default delivery blocks for your requirement.
    IMG - Sales and Distribution - Basic Functions - Availability Check and Transfer of Requirements - Transfer of Requirements - Block Quantity Confirmation In Delivery Blocks - "Deliveries: Blocking Reasons/Criteria".
    Here you have the option of creating new entries. For a particular delivery block you have multiple controls as under:
    Sales order block - Indicates whether the system automatically blocks sales documents for customers who are blocked for delivery in the customer master.
    Confirmation block - With this you can, in addition to blocking delivery, also block the confirmation of order quantities after an availability check during sales order processing. So MRP won't be affected, because no requirements will be transferred to MRP even when the sales order is saved.
    Printing block - Indicates whether the system automatically blocks output for sales documents that are blocked for delivery.
    Delivery due list block - The system does not take a sales document with this block applied into account when creating a delivery due list. You can only create deliveries manually for sales documents with this type of block.
    Picking block - block for picking goods in delivery.
    Goods issue block - Will not allow goods issue if the block is active.
    1. I would suggest create a new delivery block with a suitable description & only tick on "conf."
    2. Go to VOV8 - select the document type - assign the delivery block.
    Now whenever you create a sales order with that document type, the system will propose the delivery block by default for all customers. If you check the schedule lines at item level, the system will do the availability check. When you save the sales order, the block will take effect and the system will not transfer the requirements to MRP. Even if you run MRP using MD50, the system will not generate a planned order.
    If you assign the reason for rejection to the sales order item, then the system will show the status of that item as complete. If there is only 1 item in the sales order, then the system will change the status of the whole sales order as complete which is not recommended.
    With best regards,
    Allabaqsh G. Patil.

  • Qualification block QB assignment (in SAP HR - Skills Management)

    We have concept of Qualification block QB (in SAP HR - Skills Management) in our project.
    Qualification block is a group of diverse  individual Qualifications/Skills with different scales.
    When we assign at least one of the qualifications in a qualification block to a position, the system automatically assigns the qualification block to the position.
    Any idea of how to link our created positions with these Qualification block.
    I assigned Qualifications(Q)  to Qualification block(QB) through IT 1001.
    Also assigned Qualification block(QB) to Position(S) through IT 1055
    In PPPM tcode, when we select position, the list of Qualifications in the qualification block should get displayed.
    But even after the above assignment, the list is not getting displayed
    Can you suggest how this will work.

    As you are aware of transactions PP01 and PPPM, here is what you have to do:
    Call transaction PP01.
    Create the QB object along with the following infotypes: 1002, 1025, 1033, 1055.
    Through infotype 1055, you can add as many Q objects as you want.
    Then call transaction PPPM (to assign the QB object).
    You can then assign qualification blocks to persons as qualifications, and to positions, jobs, and tasks as requirements.
    For example, I call object P:
    Choose the Requirement Block tab page.
    Choose Create and select the qualification block you require.
    Enter the required proficiency and the validity period.
    Save your entries.
    When you assign a person at least one of the qualifications in a qualification block, the system automatically assigns the qualification block to the person. (In the interest of data consistency, the user is not provided with a function for direct assignment.)
    If you want to see the assigned QB object, which will get assigned automatically, you can check infotype 0024 (qualification block tab).
    Or, if you want to see all the Q objects linked with a QB, call transaction PPCP:
    select your object (e.g. P),
    enter the personnel number, select 'include qualification', and execute the transaction.
    There you will find all your QB objects; select your object and click Profile at the top, where you can check all your Q objects.

  • Certificate required during receipts

    Hi Experts,
    I have configured the following in SPRO---> Control key in QM Proc.
    1. QM Control key created with Tick in Certificate required & Block Invoice
    2. Certificate type defined with tick in all the following for
                Certi for each PO item,
                GR items,
                assigned control with certificate error: without lot: Error message; with lot: Status, No skip to Stock
                Enhanced certificate processing
                No certificate check at GR
    3. Material master
        Tick in QM Proc
        QM Control key selected
       Certificate type selected
    But I am not getting an error during GR & UD.
    Pls help me to solve this issue
    Regards,
    VRMP

    Have you tried using the indicator "Certificate check required" in the certificate type?
    FF

  • Is it possible to requery only a single row of a multi record block?

    Hi,
    I have a data block say "Employees". This is a multi record block.
    Requirement: Two users are working on the same form (front end). User1 has made a change for EMP1, but the change is not reflected in User2's session. Hence I need to requery the values updated for EMP1.
    Issue: Since there are lots of employees listed in this block, I do not want all the employee details to be requeried (a block requery). Instead, is it possible to requery only a particular record?
    Thanks,
    Vidya

    I think it is not possible.
    The usual approach is that the data in the grid table displays as read-only, and when the user presses an EDIT button they can edit a single record in a separate window.
    Oh, I've got one idea: maybe create a savepoint and roll back to that particular savepoint. I have never tried it, but give it a go; it might solve your problem.
    CLEAR_FORM has a feature that rolls back to a savepoint:
    PROCEDURE CLEAR_FORM
      (commit_mode    NUMBER,
       rollback_mode  NUMBER);
    Parameters
    If the end user has made changes to records in the current form or any called form, and those records have not been posted or committed, Form Builder processes the records, following the directions indicated by the argument supplied for the following parameter:
    commit_mode:
    ASK_COMMIT - Form Builder prompts the end user to commit the changes during CLEAR_FORM processing.
    DO_COMMIT - Form Builder validates the changes, performs a commit, and flushes the current form without prompting the end user.
    NO_COMMIT - Form Builder validates the changes and flushes the current form without performing a commit or prompting the end user.
    NO_VALIDATE - Form Builder flushes the current form without validating the changes, committing the changes, or prompting the end user.
    rollback_mode:
    TO_SAVEPOINT - Form Builder rolls back all uncommitted changes (including posted changes) to the current form's savepoint.
    FULL_ROLLBACK - Form Builder rolls back all uncommitted changes (including posted changes) which were made during the current Runform session. You cannot specify a FULL_ROLLBACK from a form that is running in post-only mode. (Post-only mode can occur when your form issues a call to another form while unposted records exist in the calling form. To prevent losing the locks issued by the calling form, Form Builder prevents any commit processing in the called form.)

  • Are Aperture's stringent hardware requirements: a poor management decision?

    I'm very surprised at the management decision to put a hard software block on even relatively new systems. I understand that optimal performance is a combination of processor, speed, RAM & graphics; however, it appears that Apple has chosen a much narrower set of requirements than need be. There are many claims on the web of folks getting Aperture to work acceptably well on unsupported hardware.
    Politically and business-wise, it would have made much more sense to issue a "warning" to the user during the install that their system did not meet the minimum hardware requirements and that they would not be eligible for technical support. This would increase the initial sales base, but not put an undue burden on technical support.
    Perhaps, Apple is trying to push users to upgrade to newer hardware. However, I'm afraid they'll just push folks to find alternative software / hardware. This is something I would expect from Microsoft, not Apple.
    The stringent hardware requirements block was a bad project management decision and one that I hope Apple reconsiders.
    Am I the only one feeling this way?
    dual G4 FW800 & dual G5 2.0   Mac OS X (10.4.3)  

    Look at FCP: it doesn't use any Core Image or Core Video functionality; it uses basic AGP to do the full-screen video function. And it works.
    Motion, yes: Motion uses mainly Core Video and limited Core Image, and if you have seen it run on a 5200 or a 9600, it is dog slow. My DP 1.25 G4 with a 9800 smokes a DP 2 G5 with a 5200 or 9600 with functions that the GPU uses.
    I titled a dvd project with it a year ago using my 1Ghz, 1.25Gbyte, powerbook 12". It worked. In fact, I produced the entire dvd using FCP, motion, livetype, and dvd studio pro on that powerbook. It wasn't blazing fast, but it was possible.
    look at any of the analogue synth emulators or real
    time sound processors.
    what does Core Audio have to do with the GPU?
    That's rather the point. I want my computer to be a general purpose machine. I don't want to have to pay $5k for an aperture horse and another $5k for an FCP horse and another $5k for another computer to run motion and another $5k to run iWork. I want my software to run on one, general purpose computer. If apple's going to require me to buy specialized, potentially incompatible hardware for each application, I'll find another vendor.
    A software architecture that exploits an available gpu is fine, great, even. But a software architecture that requires a specific piece of hardware, (which may easily be incompatible with another software's hardware requirements), is a gross mistake, imo.
    The frustrating part is that the horsepower we need already exists. But it's not being exploited.
