Frequency of job BBP_GET_STATUS_2?
CRM Gurus
Can someone tell me the recommended frequency to run program BBP_GET_STATUS_2 via a background job?
Does it all depend on business requirements?
thanks.
Hi
<b>Which SRM version are you using ?</b>
<u>Please reschedule the jobs for reports CLEAN_REQREQ_UP and BBP_GET_STATUS_2 with a new frequency, and set the time interval to less than 5 minutes.</u>
The GR and invoice data reach SRM at the frequency of the standard job BBP_GET_STATUS_2; you can see the details for the job in the master guide. You also have to schedule a job to update the SC data (program CLEAN_REQREQ_UP). <u>If you wish, send me an e-mail and I will send you the SRM Master Guide</u>.
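The setup described above, with both reports repeating at an interval of under 5 minutes, can be sketched outside ABAP. The Python functions below are hypothetical stand-ins for the two reports, which in SRM actually run as sequential steps of one background job defined in SM36:

```python
import sched
import time

# Hypothetical stand-ins for the two ABAP reports.
def clean_reqreq_up():
    # Step 1: process queued shopping-cart updates.
    return "CLEAN_REQREQ_UP done"

def bbp_get_status_2():
    # Step 2: pull follow-on document status from the backend.
    return "BBP_GET_STATUS_2 done"

def run_cycle(scheduler, interval, log):
    log.append(clean_reqreq_up())   # cleanup step first,
    log.append(bbp_get_status_2())  # then the status pull
    # re-enqueue itself, mimicking a periodic background job
    scheduler.enter(interval, 1, run_cycle, (scheduler, interval, log))

log = []
s = sched.scheduler(time.time, time.sleep)
s.enter(0, 1, run_cycle, (s, 300, log))  # 300 s = the 5-minute interval
# s.run() would loop forever; in SRM, SM36/SM37 handles the repetition.
```

The step order here (CLEAN_REQREQ_UP before BBP_GET_STATUS_2) matches the usual setup; whether a tighter interval is worthwhile depends on how quickly users need follow-on document status reflected in their carts.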
<u>Related links -></u>
Re: Back Ground Jobs in SRM
Re: Follow on doc(PREQ) not created in teh backend
Hope this will help. Do let me know.
Regards
- Atul
Similar Messages
-
Does somebody know why the frequency of the job is smaller than 1 minute?
thanks
Hi Paola,
please be a little more elaborate when asking questions, otherwise it's difficult to help you. I <b>guess</b> you are referring to the ABAP job scheduler or ABAP jobs in general, so please ask the question in the <a href="https://www.sdn.sap.com/sdn/collaboration.sdn?node=linkFnode2-3&contenttype=url&content=https%3A%2F%2Fforums.sdn.sap.com%2Fforum.jspa%3FforumID%3D50">ABAP forum</a>
Regards, Stefan -
Hello,
We are on SRM 5.0, classic scenario, with R/3 SAP ECC 6.0. We recently had a stack upgrade for all our different components.
As we are in a classic scenario, we get a purchase requisition after a shopping cart is approved. The purchaser then creates a purchase order, and the programs BBP_GET_STATUS_2 and CLEAN_REQREQ_UP update the shopping cart with the PO information.
For these 2 programs we have created one job. The problem we now encounter is that the job doesn't update the shopping cart with the PO information. Only the purchase requisition is visible. If I go to transaction BBP_PD for that shopping cart, I see "Item in transfer process" marked red. From BBP_PD I can run BBP_GET_STATUS_2 and CLEAN_REQREQ_UP, and then the shopping cart is updated.
So to summarize: the jobs (BBP_GET_STATUS_2 and CLEAN_REQREQ_UP) do not update the shopping cart, but when BBP_GET_STATUS_2 and CLEAN_REQREQ_UP are run manually, the shopping cart is updated.
The users who run the jobs seem to be OK.
Any idea?
Thanks
Hi,
Check this.
In the backend system, you activate the application SRMNTY in table TBE11 using transaction FIBF so that the purchase order follow-on data is transferred from the SAP backend to the SRM system. In this way, the SRM system is notified when follow-on documents are created in the backend system.
Check the variant. Check if the backend system has been changed; run BDLS if it was.
regards,
MRao
Edited by: Mahesh Rao T.C on Oct 14, 2010 12:31 PM -
BBP_GET_STATUS_2 background job cancelled
Hi SRM gurus
we have BBP_GET_STATUS_2 scheduled in the background in SRM as follows:
1. Once every hour (with a variant of 10 days)
2. Once daily at 2am (with a variant of 50 days)
3. Once every week at 4am (with a variant of 365 days)
But for the past 2 weeks the daily and weekly jobs have been cancelled. Now the hourly job is also getting cancelled.
We are on SRM5.0 ECS.
with regards
Manjunath
Hi,
See the foll related threads:
<b>"Spool internal Error" while schduling job BBP_GET_STATUS_2</b>
Re: SRM program running too long BBP_GET_STATUS_2 and is locking carts
related notes:
1078147 BBP_GET_STATUS_2: Error message E 002
1055375 BBP_GET_STATUS_2: Error message during conversion
1025350 BBP_GET_STATUS_2: Error message E 012
1051103 BBP_GET_STATUS_2: Error message E 007 in back end
1084987 Synchronizing documents and status between ERP and SRM
1062223 BBP_GET_STATUS_2: Performance and runtime error
1075526 BBP_GET_STATUS_2: No error message when back end is missing
1078692 BBP_GET_STATUS_2: Error message E 010, no follow-on docs
691569 Error messages in BBP_GET_STATUS_2
878654 BBP_GET_STATUS_2 performance: Enhancement around period
BR,
Disha.
<b>Pls reward points for useful answers.</b> -
BBP_GET_STATUS_2 only works when entering SC number
Hi all.
SRM 4.0.
I am seeing some strange behaviours in my system these days. The job BBP_GET_STATUS_2 is set up in batch to run every 5 minutes, but now I am seeing issues where the documents are not properly updated in SRM.
The case is that in the user's Goods Receipt screen they have the list of purchase orders that are relevant for GR; once the GR has been posted, the PO disappears from the screen. However, nowadays many POs do not automatically disappear from the users' GR screens. When I run the job BBP_GET_STATUS_2 manually, entering the SC number(s), everything works. So it seems the job does not work properly when no SC numbers are maintained in it.
Has anyone else seen this type of strange behaviour? It seems strange to me that this should not be working in batch runs where it would make no sense to type in the SC numbers.
DM
Hi there.
I have a batch running every 5 minutes with the steps
CLEAN_REQREQ_UP
BBP_GET_STATUS_2
and the spool list tells me:
Update Successful
... 6.466 Data Records Updated (SC Rows)
and 0 Data Records Created (References)
So this should be fine, right? But it is still not removing the purchase orders from the GR screen after this. What could be causing this? -
Content Deployment Job - Schedule
I have a moderately large publishing site collection with 600+ sub-sites and 12000+ pages. I have received requests to increase the frequency of the content deployment incremental jobs from once a day to once every hour. I have 4 web servers and 4 app servers. My question is: will increasing the frequency of the job to every hour degrade the performance of the live site? I am asking because, as people are accessing my live site, the import job will be importing content too, correct?
I didn't see any best practice around how frequently the job should be run. MS only says not to have a very long gap between jobs, but nothing about how frequent is good and what is dangerous.
Regards, Mahesh
Hi Mahesh,
The performance of the content deployment job will be affected by many elements, so it depends on the situation what frequency the job should run.
As the content deployment incremental jobs deploys only the modified content and items directly linked to the modified content, this increases the chance for problems which can cause incremental content deployment to fail.
If you have a large content deployment running with several GB of content, it can happen that the decompression phase takes longer than 600 seconds, and then there will be a timeout. To avoid such timeout messages, you need to increase the RemoteTimeout value.
Please check the link below for more details:
http://blogs.technet.com/b/stefan_gossner/archive/2009/01/16/content-deployment-best-practices.aspx
Best regards.
Thanks
Victoria Xia
TechNet Community Support -
Job scheduling in DB13 for DB2 systems
Hi,
I have installed ERP 6.0 SR3 with DB2 v9.1 FP5.
Please let me know what are the jobs I have to schedule in DB13 to improve the DB performance and the frequency of jobs.
Regards,
Nallasivam.D
Read
https://www.sdn.sap.com/irj/sdn/db6
--> Database Administration Guide "SAP on IBM DB2 for Linux, UNIX, and Windows"
Markus -
BBP_GET_STATUS_2
Hi,
We are on SRM 5.0 and configuring the extended classic scenario.
The local PO is created successfully, but when it tries to replicate the PO to the backend, it says "Error in Process".
I have scheduled the job BBP_GET_STATUS_2 and it is working fine.
When I try to execute this report again with the shopping cart number, it says error E031, "Shopping cart item excluded (due to ext. PO status)". Please let me know how to solve this.
Regards,
Chary
Hi Chary,
Check if the number range definition is correct. The internal number range of the SRM PO should fall within the external number range defined for the R/3 PO.
Also check whether the material master and material group used in the PO exist in R/3.
Regards,
Prashant
Do reward points for helpful answers -
Impact of not running BBP_GET_STATUS_2.
Good morning,
I'd like an answer to the following question please: "What is the real impact when the job BBP_GET_STATUS_2 does not run in an SRM production system, or runs only rarely?"
Many thanks for your contribution.
Best regards,
Stéphan
Hi,
Let's take confirmation creation as an example.
I have created a GR confirmation in SRM. The GR document is posted to the backend, and after the material document is created there, the backend notifies SRM to update the final indicators of the SRM PO, based on table BBP_DOCUMENT_TAB; this is taken care of by the CLEAN_REQREQ_UP job.
After the CLEAN_REQREQ_UP job, the scheduled BBP_GET_STATUS_2 job updates the GR final indicators at SC level by checking the backend PO details: if the GR is fully completed, it sets the final indicators in the SC.
If you don't update these final indicators at SC level, SRM will still allow another GR to be created even after the GR is fully confirmed, which is wrong.
This is one example of the impact of the GET_STATUS job and why it is required.
Regards
Devi -
Generate Report BBP_GET_STATUS_2 in ECC 6
Do I need to generate report BBP_GET_STATUS_2 if the SAP version is ECC 6.0? Based on the information I gathered, it seems it only needs to be generated when the system is SRM.
hi,
See the foll related threads:
<b>"Spool internal Error" while schduling job BBP_GET_STATUS_2</b>
Re: SRM program running too long BBP_GET_STATUS_2 and is locking carts
related notes:
1078147 BBP_GET_STATUS_2: Error message E 002
1055375 BBP_GET_STATUS_2: Error message during conversion
1025350 BBP_GET_STATUS_2: Error message E 012
1051103 BBP_GET_STATUS_2: Error message E 007 in back end
1084987 Synchronizing documents and status between ERP and SRM
1062223 BBP_GET_STATUS_2: Performance and runtime error
1075526 BBP_GET_STATUS_2: No error message when back end is missing
1078692 BBP_GET_STATUS_2: Error message E 010, no follow-on docs
691569 Error messages in BBP_GET_STATUS_2
878654 BBP_GET_STATUS_2 performance: Enhancement around period
BR,
Disha.
<b>Pls reward points for useful answers.</b> -
Scheduling the program BBP_GET_STATUS_2
Hi all,
What are the steps to schedule the program BBP_GET_STATUS_2? What should the variant parameters be?
Thanks.
You can schedule job BBP_GET_STATUS_2 using transaction SA38 by creating a variant. You can assign values to the following fields of the variant:
Shopping Carts
Logical System
SCs in Last ... Days
SC Timeframe
Deletion indicator not set
Deletion Indic. not set in BE
Delvry Completed Ind. not set
Delivered Quantity
Invoiced Quantity
Delivered Value
Invoiced Value
Read Item Data First
Memory ID for Export Memory
as per your requirement. -
Hello Expert,
We observe that when deleting an SC in SRM 7.0, we sometimes get the error message: 'User XYZ is already processing this document, try again later'.
Here user XYZ is the background job user.
With this user we are running below jobs:
1) BBP_GET_STATUS_2 at 15 mins frequency (Takes on average 350 seconds to finish)
2) CLEAN_REQREQ_UP at 15 mins frequency (Takes on average 25-30 seconds to finish)
Could the job frequency be an issue here?
Pls advise.
Thanks,
Dhananjay
Hi,
When you get the error "User XYZ is already processing this document, try again later", check transaction SM12; you will find lock entries created by user XYZ against the shopping cart you are trying to delete. This is because the job BBP_GET_STATUS_2 or CLEAN_REQREQ_UP created a lock entry while updating the shopping cart, at the very moment you tried to delete it. Since there was a lock against that shopping cart, you got the error.
This issue is not caused by the job frequency; it's just a timing issue (a coincidence).
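Since the lock is held only briefly, a retry after a short wait is usually enough. A minimal sketch of that idea (the lock set and cart numbers are stand-ins for illustration, not the real SM12/enqueue API):

```python
import time

class DocumentLocked(RuntimeError):
    """Stands in for 'User XYZ is already processing this document'."""

def delete_cart(cart, locks, max_attempts=5, wait=0.01):
    # Retry a few times: the background jobs hold their lock on a cart
    # only for the moment they update it, so a short wait usually suffices.
    for _ in range(max_attempts):
        if cart in locks:        # lock held, e.g. by BBP_GET_STATUS_2
            time.sleep(wait)
            locks.discard(cart)  # simulate the job releasing the lock
            continue
        return f"deleted {cart}"
    raise DocumentLocked(cart)

held = {"7000001234"}            # hypothetical locked shopping cart
print(delete_cart("7000001234", held))  # -> deleted 7000001234
```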
Regards,
Suresh -
Trigger an Event in PI when IDOC is created
Hi Expert ,
I am working on an inbound scenario where multiple IDocs are created from one XML file. When the IDocs are posted in ECC, it takes time for them to get processed. The IDocs are processed by the batch job RBDAPP01, which is scheduled every 30 minutes, so IDoc processing is delayed by up to 30 minutes if an IDoc is posted just after the job completes. We can't increase the frequency of the job as it takes a lot of resources.
So, is it possible to raise an event when an IDoc arrives in ECC which runs the job program? I.e., instead of running the background job at a scheduled time, is it possible to raise an event while posting the IDoc so that RBDAPP01 runs just after the IDoc is received in ECC?
There is an option "Process IDoc immediately" in the partner profile, but we don't want to use that, as lots of IDocs are posted at one time.
Regards,
Saurabh
Hi,
interesting topic !
Let me add two points:
1. When using a dummy IDoc or a proxy call (an interesting idea) to trigger a job, be careful of one point: when PI sends the IDocs to ECC, it will use different queues (SMQ2), so your dummy IDoc may take a "faster" queue and arrive before all the other IDocs, and the result will not be the one you want!
Solution: EOIO with a single specified queue. But in that case, if there is a problem between PI and ECC, ALL subsequent exchanges of this interface will wait inside the queue until you solve the first error.
2. With your dummy IDoc, triggering a job to process your IDocs is "easy": in the function module linked to your dummy IDoc, either you run RBDAPP01 directly with your specific message type, IDoc type, partner, date, etc. (*), or your program creates a job which does the same (personally I prefer this solution).
(*) As suggested by another poster, make the dummy IDoc generic.
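The event-driven idea can be illustrated outside ABAP: a worker that sleeps until an "IDoc arrived" signal fires (the analogue of the inbound function module raising an event) instead of waking on a fixed 30-minute schedule. The queue contents and names below are purely illustrative:

```python
import threading

processed = []
queue = ["IDOC_1", "IDOC_2"]   # IDocs waiting to be processed
arrived = threading.Event()    # analogue of the event raised on receipt

def worker():
    # Instead of waking on a fixed 30-minute schedule (polling), the
    # worker sleeps until the "IDoc arrived" signal fires, then drains
    # the whole queue at once - the analogue of starting RBDAPP01.
    arrived.wait(timeout=5)
    while queue:
        processed.append(queue.pop(0))

t = threading.Thread(target=worker)
t.start()
arrived.set()   # the inbound function module "raises the event"
t.join()
print(processed)  # -> ['IDOC_1', 'IDOC_2']
```

The benefit over "Process IDoc immediately" is the same as described above: one wake-up handles a whole burst of IDocs instead of spawning work per IDoc.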
Regards.
Mickael -
Hello -
I have a scenario like this, appreciate if any BW gurus help me.
Existing Cube : A.
Cubes : X(daily snapshot data) and Y(monthly snapshot data).
MultiProvider : Z
Load the latest 3 months of data (92 requests) daily into cube X from the existing cube A.
Once a new month's load has started, we need to automate loading the previous 3rd month's first-day data requests from X into cube Y. After a successful load into Y, delete all the daily load requests in X for the previous 3rd month only, so that X holds only the daily loads for the current 3 months.
Cube Y should only contain data for the 1st day of every month, excluding the current 3 months.
This process needs to be automated at the start of each new month.
I wanted to know where exactly I need to write the code for the automation.
Hope this is clear.
Thanks in advance.
Praveen
Hi Praveen,
You can do this using process chain.
1) You can make the process chain be triggered by an event. You do so by having the job that starts the PC start after an event. Then you define your set of jobs (through SM37) and make sure these jobs fire the event that starts the PC. There is a function module you can use to raise events: BP_EVENT_RAISE.
2) In the start process of your process chain, select Direct Scheduling and then click the Change Selections icon. This brings you to the Start Time window, where you can put in the date/time you want to start your process chain. At the bottom of this window, tick the Periodic job box and you will notice another button at the very bottom called Period Values. Click this to determine the frequency with which the job should be rescheduled (i.e. daily, weekly, etc.).
3) Alternatively, right-click on the start process, select "Maintain Variant", choose "Direct Scheduling", then select "Change Selections", pick your date and time, set your period values and you're done.
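Alongside the process-chain mechanics, the date arithmetic for the "previous 3rd month" window from the question can be sketched as follows (pure illustration, not BW code):

```python
from datetime import date

def month_to_roll(today):
    # The window described in the question: at the start of a new month,
    # the previous 3rd month's first-day snapshot moves from cube X to Y
    # and that month's daily requests are dropped from X.
    y, m = today.year, today.month - 3
    if m <= 0:              # window crosses a year boundary
        y, m = y - 1, m + 12
    return y, m

print(month_to_roll(date(2024, 1, 15)))  # -> (2023, 10)
```

This is the check the event-triggered job (or an ABAP routine in the chain) would perform to decide which month's requests to archive and delete.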
Hope this helps.
****Assign Points If Helpful****
Thanks,
Amith -
Export and Import data or Matadata only
Aloha,
I need to update an instance with data only, from a full dump. What parameter should I use on import? How can I replace the schema's data without touching the metadata and database objects?
Thanks in advance.
Hades
TheHades0210 wrote:
Hi,
Yes, I will import from a full dump export, but only the data needs to be imported; the existing db objects and metadata on the instance should not be touched.
Regards,
Hades
Easy as pie:
[oracle@localhost ~]$ impdp help=yes
Import: Release 11.2.0.2.0 - Production on Mon Feb 4 19:57:10 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:
Format: impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
USERID must be the first parameter on the command line.
The available keywords and their descriptions follow. Default values are listed within square brackets.
ATTACH
Attach to an existing job.
For example, ATTACH=job_name.
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC.
Valid keyword values are: [Y] and N.
CONTENT
Specifies data to load.
Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
DATA_OPTIONS
Data layer option flags.
Valid keywords are: SKIP_CONSTRAINT_ERRORS.
DIRECTORY
Directory object to be used for dump, log and SQL files.
DUMPFILE
List of dump files to import from [expdat.dmp].
For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD
Password key for accessing encrypted data within a dump file.
Not valid for network import jobs.
ESTIMATE
Calculate job estimates.
Valid keywords are: [BLOCKS] and STATISTICS.
EXCLUDE
Exclude specific object types.
For example, EXCLUDE=SCHEMA:"='HR'".
FLASHBACK_SCN
SCN used to reset session snapshot.
FLASHBACK_TIME
Time used to find the closest corresponding SCN value.
FULL
Import everything from source [Y].
HELP
Display help messages [N].
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
JOB_NAME
Name of import job to create.
LOGFILE
Log file name [import.log].
NETWORK_LINK
Name of remote database link to the source system.
NOLOGFILE
Do not write log file [N].
PARALLEL
Change the number of active workers for current job.
PARFILE
Specify parameter file.
PARTITION_OPTIONS
Specify how partitions should be transformed.
Valid keywords are: DEPARTITION, MERGE and [NONE].
QUERY
Predicate clause used to import a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE
Redefine data file references in all DDL statements.
REMAP_SCHEMA
Objects from one schema are loaded into another schema.
REMAP_TABLE
Table names are remapped to another table.
For example, REMAP_TABLE=HR.EMPLOYEES:EMPS.
REMAP_TABLESPACE
Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES
Tablespace will be initialized if it already exists [N].
SCHEMAS
List of schemas to import.
SERVICE_NAME
Name of an active Service and associated resource group to constrain Oracle RAC resources.
SKIP_UNUSABLE_INDEXES
Skip indexes that were set to the Index Unusable state.
SOURCE_EDITION
Edition to be used for extracting metadata.
SQLFILE
Write all the SQL DDL to a specified file.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STREAMS_CONFIGURATION
Enable the loading of Streams metadata
TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
TABLES
Identifies a list of tables to import.
For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
TABLESPACES
Identifies a list of tablespaces to import.
TARGET_EDITION
Edition to be used for loading metadata.
TRANSFORM
Metadata transform to apply to applicable objects.
Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
TRANSPORTABLE
Options for choosing transportable data movement.
Valid keywords are: ALWAYS and [NEVER].
Only valid in NETWORK_LINK mode import operations.
TRANSPORT_DATAFILES
List of data files to be imported by transportable mode.
TRANSPORT_FULL_CHECK
Verify storage segments of all tables [N].
TRANSPORT_TABLESPACES
List of tablespaces from which metadata will be loaded.
Only valid in NETWORK_LINK mode import operations.
VERSION
Version of objects to import.
Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
Only valid for NETWORK_LINK and SQLFILE.
The following commands are valid while in interactive mode.
Note: abbreviations are allowed.
CONTINUE_CLIENT
Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT
Quit client session and leave job running.
HELP
Summarize interactive commands.
KILL_JOB
Detach and delete job.
PARALLEL
Change the number of active workers for current job.
START_JOB
Start or resume current job.
Valid keywords are: SKIP_CURRENT.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STOP_JOB
Orderly shutdown of job execution and exits the client.
Valid keywords are: IMMEDIATE.
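From the keyword list above, CONTENT=DATA_ONLY is the parameter that answers the question: it loads rows without touching DDL or metadata, and TABLE_EXISTS_ACTION controls what happens to rows already in the tables. A sketch of assembling such an invocation (the connect string, directory and dump file names are placeholders):

```python
# Hypothetical data-only import; credentials, directory and dump file
# names are placeholders. CONTENT=DATA_ONLY skips all DDL/metadata;
# TABLE_EXISTS_ACTION decides what happens to existing rows.
params = {
    "DIRECTORY": "dmpdir",
    "DUMPFILE": "full_export.dmp",
    "FULL": "Y",
    "CONTENT": "DATA_ONLY",
    "TABLE_EXISTS_ACTION": "TRUNCATE",  # or APPEND to keep existing rows
}
cmd = "impdp system@orcl " + " ".join(f"{k}={v}" for k, v in params.items())
print(cmd)
```

TRUNCATE clears each table before loading so the import replaces the data in place, which matches "update data only" without dropping any objects.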