Getting Process ID in BPM Process
Hello all,
Is there a way to get the BPM process ID into a context variable in the BPM process itself? Or something else that uniquely identifies the process?
Thank you.
Hi,
Unfortunately, it is not possible to get the actual process instance ID from the context.
However, you could create a CAF business object and use this as a unique identifier for the process.
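One way to sketch that workaround (plain Java for illustration; the class and method names here are hypothetical, not a BPM or CAF API): generate your own correlation ID once at process start and keep it in a context variable or the CAF business object as the unique key.

```java
import java.util.UUID;

public class ProcessCorrelation {
    // Hypothetical helper: since the engine's internal process instance
    // id is not exposed to the context, generate a random UUID once at
    // process start and store it as the process's unique identifier.
    public static String newCorrelationId() {
        return UUID.randomUUID().toString();
    }

    public static void main(String[] args) {
        // A version-4 UUID string is always 36 characters long.
        System.out.println(newCorrelationId().length());
    }
}
```

The same value can then be written to the CAF business object so external systems can correlate on it.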
Similar Messages
-
Article IDoc not getting processed in PI for a particular article type
Hello All,
I am seeing some strange behaviour in my system:
1. We create an article IDoc using BD10 for a particular article type.
2. The IDoc is created successfully.
3. We process the IDoc in BD87 to push it to the PI system.
4. But I cannot see the message in PI, and only for this one article type.
I am wondering why this is not getting processed for just this one article type and not for the others.
We also checked the basic filter for the message type in the distribution model, and I could not see any condition on the PI side either.
Is there anything else I should check? Can you please help us with this?
Regards, Sethu.
Hi,
Try reimporting the IDoc metadata into PI using IDX2. Also check the port definition in ECC and whether the segment is released. The discussions below may help you:
EDISDEF: Port XXX segment defn E2EDK35 in IDoc type ORDERS05 CIM type
IDoc 0000000000181226 was saved but cannot or should not be sent
Regards,
Priyanka -
Automatic update of number ranges in FBN1 is not getting processed
The automatic update of the number range current status in FBN1 is not happening.
I tried posting a customer payment document. Although the number ranges are maintained for the year 2015, do not overlap with last year's, and all of last year's documents are closed, the current number in FBN1 is still not picked up automatically. Kindly help with this issue.
Hello Rajesh
Please verify that the number ranges are maintained correctly using transaction OBA7. Also ensure that external assignment is unchecked in FBN1. Share screenshots if possible.
Regards
Aniket -
Savings Plan Contribution is not getting processed in Payroll
Hello Friends,
We are in the process of upgrading our system from 4.0 to 4.7. After correcting the schema as per the release/OSS notes, the Savings Plan Contribution (Retirement Savings) comes out blank.
The cumulation values come through correctly, but the current period contribution is missing.
Earlier we had P0169 with processing class 61; now we do not see processing class (PCLS) 61 in the PCLS list at all. Our earlier wage types referred to PCLS 61, and I am not sure whether this is causing the issue.
Please let me know if you have faced a similar situation and have an answer.
Thanks & Best Regards
Ravi
Ravi,
I would appreciate it if you could share the solution to your problem with the savings plan contribution not being processed in payroll, as I am facing the same kind of problem with my newly configured ROTH 401k benefit plan.
Please let me know the solution.
Thanks & Regards,
Sunita -
Bulky messages are not getting processed
Hi All,
Recently we upgraded our XI system to EHP4.
After the upgrade, bulky files are no longer processed in the XI system, while small files are processed successfully.
When we faced the same issue earlier, we maintained some parameters in transactions RZ10 and RZ11, and the HTTP timeout parameter under SXMB_ADM -> Integration Configuration.
Those parameters are still in place, but now the message is not processed in XI when we check SXI_MONITOR. Message mapping also does not seem to be the problem, as we have not put any bulk-handling logic there.
The message does not get processed beyond Receiver Grouping. Please find the attached log from the PerformanceHeader:
<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!-- Receiver Grouping -->
<SAP:PerformanceHeader xmlns:SAP="http://sap.com/xi/XI/Message/30">
  <SAP:RunTimeItem>
    <SAP:Name type="ADAPTER_IN">INTEGRATION_ENGINE_HTTP_ENTRY</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134229.930472</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="ADAPTER_IN">INTEGRATION_ENGINE_HTTP_ENTRY</SAP:Name>
    <SAP:Timestamp type="end" host="evoxicqab">20091210134232.052577</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="CORE">INTEGRATION_ENGINE</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134232.059576</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="CORE">INTEGRATION_ENGINE</SAP:Name>
    <SAP:Timestamp type="end" host="evoxicqab">20091210134232.07151</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="DBQUEUE">DB_ENTRY_QUEUING</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134232.071518</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="DBQUEUE">DB_ENTRY_QUEUING</SAP:Name>
    <SAP:Timestamp type="end" host="evoxicqab">20091210134237.239947</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="PLSRV">PLSRV_RECEIVER_DETERMINATION</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134237.241179</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="PLSRV">PLSRV_RECEIVER_DETERMINATION</SAP:Name>
    <SAP:Timestamp type="end" host="evoxicqab">20091210134237.250385</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="PLSRV">PLSRV_INTERFACE_DETERMINATION</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134239.999045</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="PLSRV">PLSRV_INTERFACE_DETERMINATION</SAP:Name>
    <SAP:Timestamp type="end" host="evoxicqab">20091210134240.001292</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="PLSRV">PLSRV_RECEIVER_MESSAGE_SPLIT</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134240.001413</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="PLSRV">PLSRV_RECEIVER_MESSAGE_SPLIT</SAP:Name>
    <SAP:Timestamp type="end" host="evoxicqab">20091210134240.026156</SAP:Timestamp>
  </SAP:RunTimeItem>
  <SAP:RunTimeItem>
    <SAP:Name type="DBQUEUE">DB_SPLITTER_QUEUING</SAP:Name>
    <SAP:Timestamp type="begin" host="evoxicqab">20091210134240.026164</SAP:Timestamp>
  </SAP:RunTimeItem>
</SAP:PerformanceHeader>
Hi
Have you checked this thread? The same issue is discussed here:
Performance of XI Interfaces
Also check this blog
/people/michal.krawczyk2/blog/2006/06/08/xi-timeouts-timeouts-timeouts
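As a side note, the stage durations in a PerformanceHeader trace like the one above can be computed by diffing the begin/end timestamps (the format is assumed here to be yyyyMMddHHmmss followed by a fractional-second part). A minimal sketch in plain Java:

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class StageDuration {
    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyyMMddHHmmss");

    // Elapsed seconds between two PerformanceHeader timestamps of the
    // form yyyyMMddHHmmss.fraction (the fraction length varies in the trace).
    static double seconds(String begin, String end) {
        return toSec(end) - toSec(begin);
    }

    private static double toSec(String ts) {
        String[] p = ts.split("\\.");
        LocalDateTime t = LocalDateTime.parse(p[0], FMT);
        return t.toEpochSecond(ZoneOffset.UTC) + Double.parseDouble("0." + p[1]);
    }

    public static void main(String[] args) {
        // DB_ENTRY_QUEUING begin/end from the trace above
        double d = seconds("20091210134232.071518", "20091210134237.239947");
        System.out.printf(java.util.Locale.ROOT, "%.2f%n", d);
    }
}
```

For the DB_ENTRY_QUEUING pair in the trace this gives about 5.17 seconds, and the trace breaks off right after DB_SPLITTER_QUEUING begins, which points at the queue rather than the mapping.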
Regards
Ramesh -
ALE Change pointer BDCPS table gets processed again and again
Hello,
In our Test system, when we run BD21 for DEBMAS, the same change pointers get processed again and again. This did not happen in Dev.
We built the Test system via a client copy from Dev (the gold client).
Is it possible that a few table entries were carried over from Dev (even though it is a gold client) and that these entries are causing the above problem when running BD21?
Is there any way to get rid of it? Will BD22 help?
Thanks.
Rupali.
Hi Rupali.
BD22 is used to delete change pointers. Why not use WPMU or WPMA? Transaction WPMU takes the change pointers and marks them as processed. I am using WPMU and it is working.
Regards. -
MDB messages don't get processed from queues when a remote Topic is involved in the transaction
Using WLS 6.1 SP4 on winXP Pro boxes, I have come across a peculiar problem:
I have several MDBs that process ObjectMessages from queues and forward their payload (wrapped in another new ObjectMessage) to other queues, all of which are located within the same WLS server.
Right now I'm adding a new MDB that gets messages from a remote Topic with a durable subscription, and forwards the payload to local queues after some processing.
When the Topic is local as well, there is no problem. But when the Topic is set up on a remote machine, only the MDB that has the remote durable subscription works the way it should: it receives the remote message and forwards it to the corresponding local queue. But then the messages in those local queues don't get processed. The 'Messages Received' count rises and the 'Messages' count stays at 0, as if the messages had been correctly processed and acknowledged, but no onMessage() method is called besides the one from the MDB that has the durable subscription to the remote Topic (I can tell because there is no further processing from the queue those messages get put in). It is as if those messages were simply received and acknowledged without being passed to the other MDBs by WLS.
* All queue MDBs use Required container-managed transaction management and auto-acknowledge
* All queue MDBs have default durability for their queue subscriptions
* The topic MDB has a durable subscription stored in a filestore
* Lookup of the remote Topic is done via JNDI
Since the processing and forwarding of messages occurs the way it should when everything is local, I am inclined to believe one of two things:
a) There's some issue with the way WLS treats messages (or even just payloads) when they come from a remote server
b) WLS is doing something I'm not aware of when propagating a transaction that begins with the delivery of a message from a remote JMS Topic when it involves further forwarding of messages in local JMS Queues.
Any help will be appreciated.
regards,
.munir estevane
Is the durable subscriber forwarder rolling back its transactions?
That would cause the behavior you describe (e.g. the message gets
placed in the queue, but is never made visible). What do
the pending counts on the destination queue look like?
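The symptom can be reproduced with a toy model of transactional queue visibility (plain Java, not the WLS or JMS API; all names here are invented for illustration): a message forwarded inside a transaction only becomes visible to consumers on commit, so a rollback leaves the receive count raised while the visible-message count stays at zero.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Toy simulation of transactional queue visibility: sends made inside a
// transaction are buffered, and only a commit makes them consumable.
public class TxQueueSim {
    private final Queue<String> visible = new ArrayDeque<>();
    private final Queue<String> pendingTx = new ArrayDeque<>();
    private int received = 0;

    public void sendInTx(String msg) { received++; pendingTx.add(msg); }
    public void commit()   { visible.addAll(pendingTx); pendingTx.clear(); }
    public void rollback() { pendingTx.clear(); }

    public int messagesReceived() { return received; }
    public int messagesVisible()  { return visible.size(); }

    public static void main(String[] args) {
        TxQueueSim q = new TxQueueSim();
        q.sendInTx("payload");
        q.rollback(); // transaction rolled back: message never becomes visible
        // Matches the symptom: 'Messages Received' = 1, 'Messages' = 0.
        System.out.println(q.messagesReceived() + " " + q.messagesVisible());
    }
}
```

Checking the destination's pending count, as suggested above, distinguishes this case from a consumer that simply never fires.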
-
How to get process instance id
Hi,
We are invoking a long lived process from a client using Java API (EJB end point).
We are getting the JobId in return. Is there a way to get the process instance Id through API.
Searched a lot in the documentation. We are finally inclined to query the table TB_PROCESS_INSTANCE directly on LONG_LIVED_INVOCATION_ID.
We would appreciate it a lot if an expert could point us to an API way of doing the same.
We have tried the following approach to get the process instance id right after invoking a process:
- Define an OUT process variable to hold the process id
- A Set Value step right after invocation sets this OUT variable from the process id variable, which is available by default
- Use InvocationResponse.getOutputParameters() to read the output variable
The problem is that the map returned by getOutputParameters() does not contain our output variable, only the JobId.
Has anyone tried this before? Are we missing something?
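A defensive lookup mirrors the observation above (plain Java simulation with java.util.Map; the map contents and the "processId" variable name are assumptions based on this thread, not documented behavior):

```java
import java.util.HashMap;
import java.util.Map;

public class OutputParamCheck {
    // Simulates inspecting the map returned by getOutputParameters():
    // look up a named OUT variable, returning null when it is absent
    // rather than assuming it was populated.
    static String processId(Map<String, Object> outParams, String varName) {
        Object v = outParams.get(varName);
        return v != null ? v.toString() : null;
    }

    public static void main(String[] args) {
        Map<String, Object> out = new HashMap<>();
        out.put("jobId", "123");
        // The hypothetical OUT variable "processId" is absent, as observed,
        // so this prints null.
        System.out.println(processId(out, "processId"));
    }
}
```

A null here is the signal to fall back to another correlation mechanism instead of trusting the OUT variable.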
Any expert help is highly appreciated. -
Excluding one or more columns from the Get-Process output doesn't work
hello
In PowerShell 4.0 I need a command whose output includes all columns except one or two.
For example, Get-Process shows seven columns; I need it to show all columns except the Handles and CPU columns
(instead of running Get-Process | Select-Object NPM,PM,WS,VM,CPU,ID).
Note: I tested the following command, but it still shows the Handles and CPU columns in the output:
Get-Process | Select-Object -ExcludeProperty handles,cpu
Thanks in advance.
Bill should have also pointed out that you can change the "Types" files to adjust the default output permanently. I am sure that this would save you much typing.
PowerShell is very flexible. We can select properties with wildcards:
Get-Process | Select-Object s*,c* -First 3 | Format-Table -AutoSize
You can adjust this as needed.
(In PowerShell 4.0, -ExcludeProperty only takes effect when -Property is also given, which is why the command you tested changed nothing.)
¯\_(ツ)_/¯
Hi jrv,
I didn't understand your first post at all:
"have a better and more permanent solution.
Print the output to a very large wide E or F size plotter in landscape mode....
Take scissors and cut out the columns not needed. Tape the bits together. Now you have the results you like."
But I understood your 2nd post; that's nice.
Thank you very much.
I also found this workaround:
Get-Process | Select-Object -Property * -ExcludeProperty cpu,handles | format-table -Autosize
regards -
I have a non-recurring, multiple-entries-allowed element on an assignment which is not getting processed when I run payroll.
Please help
Thanks
This is to ensure that the results generated by future payroll runs remain correct. Let's take a simple example.
You have a pension element which is 10% of Gross Pay_YTD.
Run payroll for Jan (Gross Pay = 1000) -> Pension = 100 (10% of 1000).
Run payroll for Mar (Gross Pay = 1000) -> Pension = 200 (10% of 2000).
Run payroll for Feb (Gross Pay = 1000) -> Pension = 200 (10% of 2000, as the Feb _YTD does not include the March figures).
In the above case the Payroll Run results generated by March would be incorrect.
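The rule in the example can be written out as a small sketch (plain Java; the 10%-of-YTD formula is taken from the example above, not from any actual payroll configuration):

```java
public class PensionYtd {
    // Each run's pension = 10% of the YTD gross visible at run time,
    // where YTD only includes periods already processed before that run.
    static double pension(double ytdGross) {
        return 0.10 * ytdGross;
    }

    public static void main(String[] args) {
        double jan = pension(1000); // Jan YTD 1000 -> 100
        double mar = pension(2000); // Mar YTD 2000 -> 200
        // Running Feb afterwards also sees a YTD of 2000 (it cannot see
        // March), so Feb -> 200 as well; but March's stored 200 was
        // computed against a YTD that excluded February, so the stored
        // results are now mutually inconsistent -- hence the block.
        double feb = pension(2000);
        System.out.println(jan + " " + mar + " " + feb);
    }
}
```

This is why the engine refuses the out-of-order run and RetroPay exists instead.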
So once you have run payroll for March, you cannot run payroll for Feb. If you require any additional payments in Feb (after payroll for Feb and March has been processed), you can run RetroPay to bring the entries forward into the next open pay period. -
EDI Message output gets processed for held PO
Hi All,
When I save my purchase order as a held PO (EKKO-MEMORY = 'X'), the EDI message output still gets processed.
Can you please provide any input on how to disable this?
Regards
Deepak
Hi All,
Can anyone provide any input on this?
Regards
Deepak -
IDOCS Error - Not getting processed automatically
Hi All,
We are loading hierarchy for a Product from R/3 system using the standard datasource.
When we execute the InfoPackage, the IDocs are not getting processed automatically.
We are seeing the error message below:
Error when updating Idocs in Business Information Warehouse
Diagnosis
Errors have been reported in Business Information Warehouse during IDoc update:
No status record was passed to ALE by the applicat
System Response
Some IDocs have error status.
Procedure
Check the IDocs in Business Information Warehouse . You do this using the extraction monitor.
Error handling:
How you resolve the errors depends on the error message you get.
However, when we go to BD87 and process the IDocs manually, they get posted and the hierarchy loads.
Can someone please guide me on what the issue with the IDocs is and how to make them post automatically?
Thanks in Advance
Regards,
Sachin
Hi,
This happens due to non-updated IDocs in the source system, i.e., it occurs whenever LUWs are not transferred from the source system to the destination system. If you look at the status tab in RSMO, the error message will read something like "tRFC Error in Source System", "tRFC Error in Data Warehouse", or simply "tRFC Error", depending on the system from which data is being extracted. Sometimes IDocs also get stuck on the R/3 side because no processors were available to process them. The solution is to execute the LUWs manually: from RSMO go to the menu Environment -> Transact. RFC -> In the Source System, which will ask you to log in to the source system. The "Status Text" for stuck LUWs may be "Transaction recorded" or "Transaction waiting". Once you see this type of status, execute the LUWs manually using F6 or Edit -> Execute LUWs (F6). Keep refreshing until you get the status "Transaction is executing" in the production system. You can also see the stuck IDocs in transaction BD87. The data will then be pulled into BW.
Hope it helps a lot.
Thanks and Regards,
Kamesham -
Data packet not getting processed
Hi SDNers,
I am loading data from one ODS to four regions. The source ODS is loaded successfully, but from there to the data targets the load either fails or takes a long time.
Up to the transfer rules the data is fine; in the update rules the data packets are not getting processed.
Kindly suggest a solution; points will be assigned.
Thanks in advance.
Hi Katam,
In the target ODSs, go to the monitor screen for the particular request -> in the menu bar go to Environment -> Transact. RFC -> In the Data Warehouse -> enter the ID and date -> execute.
Check whether there are entries in that queue. Usually this queue gets stuck, and you need to execute the LUWs in the queue manually.
If it says "Transaction recorded", you need to execute it manually.
Please revert if any issues
Edited by: Pramod Manjunath on Dec 19, 2007 4:48 PM -
IDOC status 64 not getting processed.
Hi Gurus,
I have a problem where IDocs with status 64 are not getting processed by the background job. The IDocs were being processed by the background job until this evening; then, although no changes were made, processing of IDocs with status 64 suddenly stopped and they keep piling up. When I checked the log of the background job it says "finished" and contains the following information:
17.09.2009 19:05:23 Job started 00 516 S
17.09.2009 19:05:23 Step 001 started (program RBDAPP01, variant POS_IDOCS, user ID SAPAUDIT) 00 550 S
17.09.2009 19:05:25 No data could be selected. B1 083 I
17.09.2009 19:05:25 Job finished 00 517 S
Kindly advise on this.
Regards,
Riaz.
When I process the IDocs using BD87, they get processed without any issues and no dump is displayed; each IDoc gets status 53 after processing via BD87.
There is another problem, however: other background jobs besides this one are getting cancelled, with two different error logs for different jobs, as follows.
1) 18.09.2009 02:00:06 Job started 00 516 S
18.09.2009 02:00:06 Logon not possible (error in license check) 00 179 E
18.09.2009 02:00:06 Job cancelled after system exception ERROR_MESSAGE 00 564 A
2) 18.09.2009 00:55:27 Job started 00 516 S
18.09.2009 00:55:27 Step 001 started (program RSCOLL00, variant , user ID JGERARDO) 00 550 S
18.09.2009 00:55:29 Clean_Plan:Gestartet von RSDBPREV DB6PM 000 S
18.09.2009 00:55:29 Clean_Plan:beendet DB6PM 000 S
18.09.2009 01:00:52 ABAP/4 processor: DBIF_RSQL_SQL_ERROR 00 671 A
18.09.2009 01:00:52 Job cancelled 00 518 A
This is a production server; the licence is valid until 31.12.9999, and there are no issues with the license.
SM35 session is not getting processed in background
Hi folks,
I am using BDC (open group, insert, close group) to create an SM35 session and calling the RSBDCSUB program to process the created session.
This program is executed in a background job.
The issue is: when the program is scheduled to run under my (dialog) user ID, it runs fine, i.e., the session gets processed completely.
However, when the job is scheduled under a background user, the session is created with status NEW but is not processed; we have to go to SM35 and process the session manually.
Has anybody come across such an issue? What could be the reason? I suspect authorization might be the problem.
Please advise.
Thanks,
Brahma,
Hi Brahmanandam,
You need to use RSBDCSUB and submit it with the session name you generated, for example:
SUBMIT rsbdcsub USING SELECTION-SET 'VAR' AND RETURN.
where 'VAR' is your variant name (the variant supplies the session name; note that plain SUBMIT has no USER addition, so to run under a different user, schedule it via a job step for that user).