Expense reports are not processing for a future date?
hi folks,
In our company, expense reports are not processing for a future date; even though the end date is one month from now, the expense reports are not getting paid.
Can anybody assist me in resolving this issue?
thanks in advance,
regards
bhanu
HI,
Could you please share how to get into debug mode for a dynamic action program? I have a scenario in SAP HR: when an employee is hired, during the hiring action infotype 0032 is updated based on the following conditions:
******** INSERT IT0032 FOR CANDIDATE ID ********
P0001-PERSG <> '5'                     (true)
P0000-MASSN = '01' / X
P0000-MASSN = 'H1' / X
P0000-MASSN <> '84' / X
P0000-MASSN <> '86' / X
P0000-MASSN <> '99' / X
GET_PNALT(ZHR_CREATEP0032)
RP50D-FIELDS <> ''
INS,0032,,,(P0000-BEGDA),(P0000-ENDDA)/D
P0032-PNALT = RP50D-FIELD1
P0032-ZZ_COMPID = '05'                 (true)
The infotype record is being created, but there is no data in RP50D-FIELD1, so I tried to debug the subroutine GET_PNALT(ZHR_CREATEP0032) mentioned in the dynamic action; it does not go into debug mode.
Do you have any idea how to debug a program used in a dynamic action?
Regards,
Madhu
Similar Messages
-
Reports are not posting with report repository webserver configured for Sin
Hi Everyone,
We have configured Single Signon on our Test environment (UADB1) using Sun Authentication Manager. Everything went well; we can log in using our LDAP accounts, except for one thing: the reports are not posting to the report repository.
Our setup goes like this: we used only one webserver, both for login and for the report repository. An SSL certificate was configured in the webserver and we are using https in the report node. Both URLs https://dv001.test.com:8450 and http://dv001.test.com:8400 were configured for Single Signon.
Report Node Definition
Node Name: uadb1
URL: https://dv001.test.com:8450/psreports/uadb1
Connection Information
https
URI Port: 8450
URI Host: dv001.test.com
URI Resource: SchedulerTransfer/uadb1
Below is the error I am getting. If I use another webserver, one not configured for Single Signon, as the report repository, the reports post fine. So I am thinking this has something to do with the Single Signon setup and SSL. Any idea? Thanks.
PSDSTSRV.2093190 (10) [06/13/10 01:05:43 PostReport](3) 1. Process Instance: 9499/Report Id: 8465/Descr: Process Scheduler System Purge
PSDSTSRV.2093190 (10) [06/13/10 01:05:43 PostReport](3) from directory: /psft/pt849/appserv/prcs/UADB1/log_output/AE_PRCSYSPURGE_9499
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](1) (JNIUTIL): Java exception thrown: java.net.SocketException: Unexpected end of file from server
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](3) HTTP transfer error.
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](3) Post Report Elapsed Time: 0.2300
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](1) =================================Error===============================
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](1) Unable to post report/log file for Process Instance: 9499, Report Id: 8465
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](2) Process Name: PRCSYSPURGE, Type: Application Engine
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](2) Description: Process Scheduler System Purge
PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](2) Directory: /psft/pt849/appserv/prcs/UADB1/log_output/AE_PRCSYSPURGE_94
Duplicated thread: Reports not posting if using Single Signon webserver as report repo
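As an aside, the PSDSTSRV lines above follow a fixed layout (process id, a number in parentheses, timestamp and routine in brackets, severity in parentheses, then the message), so they are easy to sift through programmatically. A rough sketch, assuming that layout holds for your scheduler version:

```python
import re

# Pattern for lines like:
# PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](1) HTTP transfer error.
LOG_RE = re.compile(
    r"PSDSTSRV\.(?P<pid>\d+) \(\d+\) "
    r"\[(?P<ts>[^\]]+) (?P<routine>\w+)\]\((?P<sev>\d)\) (?P<msg>.*)"
)

def parse_line(line):
    """Return (severity, routine, message) for a PSDSTSRV log line, or None."""
    m = LOG_RE.match(line.strip())
    if not m:
        return None
    return int(m.group("sev")), m.group("routine"), m.group("msg")

sample = ("PSDSTSRV.2093190 (10) [06/13/10 01:05:44 PostReport](1) "
          "(JNIUTIL): Java exception thrown: java.net.SocketException: "
          "Unexpected end of file from server")
print(parse_line(sample))
```

Filtering on severity (1) quickly isolates the SocketException entries from the informational transfer lines.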
Nicolas. -
New DSO data fields are not available for transformations in the upward flow
Hello,
There are data flow from
many source DSOs -> Target DSO -> Final DSO.
Different transformations are defined and activated with DTPs. This works fine and provides the desired results.
Now new fields have been added to the target DSO, the final DSO, and one of the source DSOs.
When I now try to edit the transformations, the new fields are not part of the sources at the different stages.
The fields are available in a DataSource generated from a DSO, but not directly when using a DSO as a source.
Has anyone faced such a problem?
Any solutions to this?
- I cannot delete all the other transformations, as that is too much work and there is no time for it.
Regards
Kannan
We created ZMD_BS_MAT_OVP_LAYOUT. But the situation has not changed - the fields are not available for editing.
-
Job getting cancelled for 'Chain not processed for date-2007031920070318' ?
Hi All,
We have two process chains that run only on weekdays - one for sales and the other for billing. A cancelled job is created for one of the process chains, with the job log message 'Chain not processed for date-2007031920070318' for the days it does not run, whereas the other chain has no issues even though the same monitor job program runs for both. The only difference is in the variant, which is maintained for the two different chains.
We also debugged the monitor job program. It does nothing other than display messages depending on the status of the chain log.
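Incidentally, the date token in that job log message looks like two concatenated YYYYMMDD dates (here 20070319 and 20070318), presumably the boundaries of the not-run window. A quick sketch splitting it, assuming that format:

```python
from datetime import datetime

def split_chain_dates(token):
    """Split a 16-digit token like '2007031920070318' into two YYYYMMDD dates."""
    if len(token) != 16 or not token.isdigit():
        raise ValueError("expected two concatenated YYYYMMDD dates")
    first, second = token[:8], token[8:]
    # Validate that both halves really are calendar dates.
    datetime.strptime(first, "%Y%m%d")
    datetime.strptime(second, "%Y%m%d")
    return first, second

print(split_chain_dates("2007031920070318"))  # ('20070319', '20070318')
```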
Please let me know your suggestions as to why it happens.
Thanks,
krishna
It is also giving the following error message:
Caution: Program has changed
Caution: At time of termination, active source code no longer available
After the above two statements, it shows a '?' sign before every statement, in the following manner:
024300 ? lit_ekpo_dummy1[] = it_ekpo[].
024310 ? DELETE lit_ekpo_dummy1 WHERE ebeln NE tab1-ebeln.
024320 ? DESCRIBE TABLE lit_ekpo_dummy1 LINES count01.
024330 ?
024340 ? REFRESH lit_ematn_dummy[].
024350 ? lit_ematn_dummy[] = lit_ematn[]. -
Report Painter values are not populating for a particular cost center
Report Painter (GR55) values are not populating for a particular cost center for April 2011, whereas from May onwards values display correctly. We have checked transaction KSB1 for this cost center, and there are values posted for the April period.
Can anyone help me find out where exactly the issue is and how to resolve it?
Please check the variables in your report.
-
You are not licensed for TA Adapter. Service disabled.,
We had error messages like the ones below. The TIDAL log shows “You are licensed to connect the client”, yet when running the TIDAL batch job it is failing.
4375678,09/03/2014 03:15 AM,Audit,You are not licensed for WebService Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375677,09/03/2014 03:15 AM,Audit,You are not licensed for TA Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375676,09/03/2014 03:15 AM,Audit,You are not licensed for Horizon Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375675,09/03/2014 03:15 AM,Audit,You are not licensed for MSSql Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375674,09/03/2014 03:15 AM,Audit,You are not licensed for OracleApps Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375673,09/03/2014 03:15 AM,Audit,You are not licensed for PeopleSoft Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375672,09/03/2014 03:15 AM,Audit,You are not licensed for Oracle Database Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P..
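For what it's worth, those audit lines are comma-separated, so you can pull out just the adapter names being disabled. A rough sketch - the field positions are inferred from the lines above, not from TIDAL documentation:

```python
import csv
import io
import re

# Two of the audit lines quoted above, verbatim.
log = """\
4375678,09/03/2014 03:15 AM,Audit,You are not licensed for WebService Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
4375677,09/03/2014 03:15 AM,Audit,You are not licensed for TA Adapter. Service disabled.,Client,,,,,1000,ENAW-VXD21P,,
"""

disabled = []
for row in csv.reader(io.StringIO(log)):
    # Field 3 holds the message text in the lines above.
    m = re.match(r"You are not licensed for (.+) Adapter", row[3])
    if m:
        disabled.append(m.group(1))

print(disabled)  # ['WebService', 'TA']
```

Running this after a master restart would give a quick diff of which adapter licenses came back.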
After rebooting our master server the issue got fixed. Could you please help me understand what the issue was, and how to avoid this situation in the future?
Regards
Karthikeyan J
The fact that using 1280 allows successful calls suggests your line is set as CPS (carrier pre-select). If this has just happened, it suggests a CPS provider has taken your line for calls, if not for line rental, but it is not properly set up; otherwise calls would complete and would simply be billed by the mystery CPS provider.
Have you been approached by companies that sell this product? Sometimes ISPs whose product includes calls use CPS. If not, and you haven't been 'slammed', you need BT to set your line as non-CPS, so BT handles your calls automatically without using 1280.
This should be easy to get done, but the agents who answer the calls probably won't have a clue, hence the 'it must be your handset' nonsense they come up with. Report the line as faulty, don't accept an appointment, and insist they send the problem to what was once called BT Operate; they look after the telephone exchange equipment and its data. -
SharePoint 2013: A web analytics report is not available for this site.
Hello,
I have SharePoint Server 2013 and usage data collection is enabled. Usage logs and data are being generated, but SharePoint says there is no data available:
A web analytics report is not available for this site. Usage processing may be disabled on this server or the usage data for this site has not been processed yet.
What can I do?
Thanks,
Paul
Paul
Hi,
Web Analytics is now part of Search in SharePoint 2013.
You can refer to the links below:
http://blogs.msdn.com/b/chandru/archive/2013/08/31/sharepoint-2013-web-analytics-report-where-is-it.aspx
http://www.collabshow.com/2013/05/23/sharepoint-2013-analytics-a-big-step-backward/
http://usamawahabkhan.blogspot.com/2013/06/sharepoint-2013-analytics-features-how.html
If you want to use the API, there is a blog that talks about retrieving Search Analytics reports programmatically using the SharePoint 2013 API:
http://radutut.wordpress.com/2013/01/27/how-to-get-search-analytics-reports-programmatically-in-sharepoint-2013/ -
Payment method(s) are not allowed for this program / Program RFFOAVIS_FPAYM
Dear Friends,
In the SE38 ABAP editor, I entered the program RFFOM100, entered the details (program run date, identification, paying company code, payment method T, house bank, account ID) and executed the program. I am getting the error 'Payment method(s) are not allowed for this program'.
Payment method S is not used in F110; it is not mentioned in the vendor master or vendor invoices, and it is not mentioned in FBZP bank determination. When I tried to delete payment method 'S', the system showed the message that payment method S is being used in the company code.
In the program RFFOAVIS_FPAYM, I entered the same details (program run date, identification, paying company code, payment method T, house bank, account ID) and executed the program. I am getting the error 'Program RFFOAVIS_FPAYM: No records selected', but there are open items and invoices.
Please suggest how to resolve the errors.
Regards
Sridhar
Hello,
Why are you executing these programs directly?
These programs are not meant to be executed directly from SE38.
I would configure the relevant payment method for the country with a DME structure or the classic RFFO* programs, which are then picked up automatically at the time of running F110.
For example, if you need to generate BACS file for country GB, either you copy the GB_BACS to ZGB_BACS or use GB_BACS directly in FBZP settings.
Make sure that you have maintained OBPM1 / OBPM2 / OBPM3 and OBPM4 settings properly.
The system automatically gives access to RFFOAVIS_FPAYM. Prepare a variant for this program in F110. The file is generated to the folder path maintained in OBPM4 and can be seen in AL11.
Hope this is informative and solves your issue.
Regards,
Ravi -
Time series functions are not working for fragmented logical table sources?
If I remove the fragmented logical table sources, it works fine.
If anybody knows the reason, please let me know.
thanks and regards,
krishna
Hi,
It is because time series functions are not supported for fragmented content; see this note from Oracle Support:
The error occurs because fragmented data sources are used with some time series measures. Time series measures (e.g. AGO) are not supported on fragmented data sources.
Confirmation is documented in the following guide - Creating and Administering the Business Model and Mapping Layer in an Oracle BI Repository > Process of Creating and Administering Dimensions
Ago or ToDate functionality is not supported on fragmented logical table sources. For more information, refer to “About Time Series Conversion Functions” on page 197.
Regards,
Gianluca -
Reports are not updated on the dashboard
Hello guys
I have created some reports and dashboards in OBIEE. Everything looks good. The only problem is that they are not updated with new data as the days go by.
In other words, on the next day and the day after, when users log in to view the same report, they still see the old data, when it should already be updated because new data has been loaded into the database. Clicking the refresh button does not update it either.
I believe there must be a caching issue causing this, but the cache is needed as part of performance tuning. So how can I get the reports updated without slowing down performance for reports that update daily?
Please help
Hi,
This certainly sounds like a caching issue; you need to decide on a method for purging stale entries from the cache. My personal favourite is a custom Informatica workflow added as a following task to the DAC execution plan. This workflow calls an ODBC function against the Analytics ODBC DSN which purges the cache, ensuring that there are no stale entries after the ETL has finished.
For more info on caching please see my blog entry here:
http://obiee-tips.blogspot.com/2009/09/obiee-query-caching.html
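If you'd rather script the purge outside Informatica, the same ODBC procedure can be issued through nqcmd, the BI Server command-line client, by feeding it a script containing Call SAPurgeAllCache(). A sketch that just builds the invocation - the DSN, user, and password here are placeholders, not values from your system:

```python
import tempfile

PURGE_SQL = "Call SAPurgeAllCache();"

def build_purge_command(dsn, user, password):
    """Write the purge statement to a script file and build the nqcmd call."""
    sql_file = tempfile.NamedTemporaryFile(
        mode="w", suffix=".sql", delete=False)
    sql_file.write(PURGE_SQL + "\n")
    sql_file.close()
    # nqcmd -d <DSN> -u <user> -p <password> -s <script>
    return ["nqcmd", "-d", dsn, "-u", user, "-p", password,
            "-s", sql_file.name]

cmd = build_purge_command("AnalyticsWeb", "weblogic", "secret")
print(cmd[0], cmd[2], cmd[-1])
```

You would then run the returned command as the ETL's final step, so the cache is purged exactly once per load.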
Regards,
Matt -
Reports are not able to run because of block corruption
Hi all,
It has been reported to me that some Oracle Financials reports are not running.
Once I got the call, I looked into the user trace file, where I found the following:
Corrupt block dba: 0x2100f345 file=33. blocknum=62277. found during buffer read
on disk type:0. ver:0. dba: 0x00000000 inc:0x00000000 seq:0x00000000 incseq:0x00000000
Entire contents of block is zero - block never written
Reread of block=2100f345 file=33. blocknum=62277. found same corupted data
table scan: segment: file# 15 block# 84
skipping corrupt block file# 33 block# 62277
Corrupt block dba: 0x2100f346 file=33. blocknum=62278. found during buffer read
on disk type:0. ver:0. dba: 0x00000000 inc:0x00000000 seq:0x00000000 incseq:0x00000000
Entire contents of block is zero - block never written
Reread of block=2100f346 file=33. blocknum=62278. found same corupted data
table scan: segment: file# 15 block# 84
skipping corrupt block file# 33 block# 62278
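As a cross-check, the dba values in the trace can be decoded by hand: in this 32-bit format the top byte is the file number and the low 24 bits are the block number, which matches the file=33 / blocknum values printed above. A quick sketch:

```python
def decode_dba(dba):
    """Split a 32-bit data block address into (file#, block#).

    Top 8 bits = file number, low 24 bits = block number, consistent
    with the file=33 / blocknum values in the trace excerpt above.
    """
    return dba >> 24, dba & 0xFFFFFF

print(decode_dba(0x2100F345))  # (33, 62277)
print(decode_dba(0x2100F346))  # (33, 62278)
```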
Then I tried to find out which table is affected with the query below:
SQL> select segment_type, owner, segment_name from dba_extents where file_id = 33 and 62278 between block_id and block_id + blocks - 1;
SEGMENT_TYPE OWNER
SEGMENT_NAME
TABLE GL
GL_JE_LINES
So the above table, GL_JE_LINES, is the affected table.
Now what action do I have to take? Archive logging is not enabled in this database, and the database version is 7.
Can anyone suggest what action I should take now?
Regards
Hamid
DB version:
SQL> select * from v$version;
BANNER
Oracle7 Server Release 7.3.4.3.0 - Production
PL/SQL Release 2.3.4.3.0 - Production
CORE Version 3.5.4.0.0 - Production
TNS for IBM/AIX RISC System/6000: Version 2.3.4.0.0 - Production
NLSRTL Version 3.2.4.0.0 - Production
Alert log :
SID>tail -f alert_SID.log
Tue Sep 25 14:35:03 2012
Errors in file /u20/app/oracle/admin/APRD/udump/ora_84426_aprd.trc:
Tue Sep 25 14:35:03 2012
Errors in file /u20/app/oracle/admin/APRD/udump/ora_84426_aprd.trc:
Tue Sep 25 14:35:03 2012
Errors in file /u20/app/oracle/admin/APRD/udump/ora_84426_aprd.trc:
Tue Sep 25 15:09:18 2012
create tablespace tbs datafile '/stage/software/APRD/oradata/tbsnew.dbf' size 100m
Tue Sep 25 15:09:27 2012
Completed: create tablespace tbs datafile '/stage/software/AP...
Trace file :
SID>tail -f ora_84426_aprd.trc
on disk type:0. ver:0. dba: 0x00000000 inc:0x00000000 seq:0x00000000 incseq:0x00000000
Entire contents of block is zero - block never written
Reread of block=2101d366 file=33. blocknum=119654. found same corupted data
table scan: segment: file# 15 block# 84
skipping corrupt block file# 33 block# 119654
Corrupt block dba: 0x2101d367 file=33. blocknum=119655. found during buffer read
on disk type:0. ver:0. dba: 0x00000000 inc:0x00000000 seq:0x00000000 incseq:0x00000000
Entire contents of block is zero - block never written
Reread of block=2101d367 file=33. blocknum=119655. found same corupted
For the above:
SQL> select segment_type,owner,segment_name from dba_extents where file_id=33 and 119654 between block_id and block_id+blocks -1;
SEGMENT_TYPE OWNER
SEGMENT_NAME
TABLE GL
GL_JE_LINES
Regards
Hamid -
GL assignment for future date?
Hi everybody,
We have an issue is as follows:
- We created a sales order for 08-09-2008.
- We created a PO for 08-09-2008; here the system is able to post everything fine.
- Again, we created one more PO for 08-09-2009; here the system throws an error message that the GL account needs a cost assignment.
My question is: why is the system asking for a cost assignment only for the future date, and not for the current date?
Looking forward your best thoughts
regards
vk
Hi Raj,
Thanks for your reply.
Yes, we have assigned only the cost element in transaction OKB9. But for that cost element we have created two analysis periods:
- GL 100115: 01.01.2008 to 31.12.2008
- GL 100115: 01.01.2009 to 31.12.9999
I think the validity periods are fine, yet the system still throws the error message that the GL account needs a cost object assignment.
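For reference, a quick sketch that checks whether a posting date falls inside the two analysis periods listed in your post. If a future date like 08-09-2009 tests as covered here, the validity periods themselves are not the cause and the missing piece is more likely the OKB9 account assignment:

```python
from datetime import date

# Analysis periods for cost element / GL 100115, as listed above.
PERIODS = [
    (date(2008, 1, 1), date(2008, 12, 31)),
    (date(2009, 1, 1), date(9999, 12, 31)),
]

def is_covered(posting_date):
    """True if the posting date falls inside any analysis period."""
    return any(start <= posting_date <= end for start, end in PERIODS)

print(is_covered(date(2008, 9, 8)))  # True (current-date PO)
print(is_covered(date(2009, 9, 8)))  # True (future-date PO)
```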
regards
vk -
Expense Reporting having incorrect value for org_id
Hello team,
Some of our expense reports have an issue with populating org_id on the ap_expense_report_headers_all table. An expense report was created in the UK but has the org_id value for the US. This happens randomly, and not for all expense reports in the system.
Please feel free to let us know if any of you have come across the same issue. In parallel, I have opened an SR with the Oracle team.
Application release Version 12.1.1
Thanks
Abhi
Hi Abhishek,
Are you creating expense reports from a single responsibility all the time, or do you have different responsibilities?
If so, check the profile option values attached to MO: Operating Unit, since the system populates the ORG_ID column in the ap_expense_report_headers_all table based on the value assigned to that profile option.
References:
In Oracle Internet Expenses patchset H, why do we receive a warning saying the employee is inactive when auditing a report? [ID 312203.1]
Regards,
Ivruksha -
RWDPOSUP for future-dated article postings.
Hi All,
Currently we are running the program RWDPOSUP to send articles from SAP to a third party.
I have a query. The system picks articles which have a validity date from today or within a specified number of days, for example 30 days.
All the validity dates are stored in the field DATAB of tables such as A073 (pricing condition tables).
So if the pricing has to change on March 1st, the system takes the data on February 1st, runs the IDoc, and the data is sent to the third party.
So if I post a document today, i.e. 21st March 2010, with a validity date of 1st March 2011, then ideally it should not be picked up and an IDoc created now.
It should be picked up on February 1st, or January 29th (30 days before), and the IDoc created then. Will the program RWDPOSUP run as desired?
For me, if DATAB is more than 30 days away it should not create an IDoc; only articles within 30 days should be sent to the third party.
Can you please suggest on this? I am searching for where it does the date validation, but have not been successful.
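The selection you are describing can be sketched as a simple date-window check: send an article only when its condition validity start (A073-DATAB) is no more than 30 days in the future. This is only an illustration of the rule you want, not what RWDPOSUP actually does internally:

```python
from datetime import date, timedelta

WINDOW_DAYS = 30

def should_send(datab, today):
    """Send the article only if its validity start is within 30 days."""
    return datab <= today + timedelta(days=WINDOW_DAYS)

run_date = date(2010, 3, 21)
print(should_send(date(2010, 4, 1), run_date))   # True: starts in 11 days
print(should_send(date(2011, 3, 1), run_date))   # False: almost a year away
```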
Any help would be appreciated.
Thanks,
RK.
Hi,
I have changed it from W to E and saved; still the system is not prompting any message at all, and the document is getting posted for a future date as well.
In the MB1B screen the posting date is a changeable field; if I put tomorrow's date and try posting, it gets posted.
Please help.
Please try at your end once.
regards,
madhu kiran.
Edited by: madhu kiran on Apr 24, 2009 11:19 AM -
Rule XVAL not processing basic pay data.
Hello experts,
While checking the payroll run log, I find that the basic pay data (IT0008 wage types) is not processed: the IT0008 wage types are not in the output table IT of the schema function PIT XVAL P10 (Valuate pay elements).
I am using partial period factor 2 in processing class 10. Please advise.
Thanks in advance.
Regards
Peeyoosh
Display Log
Factoring and storage
4 GEN/8 6 Create wage types /801 to /810 in IT
4 PIT INP1 NOAB Determine monthly factors
4 PIT XCM0 P31 Monthly lump sums for cost accounting
5 PIT XVAL P10 Valuate pay elements
5 Input
0 IT
5 Processing
0 /806 Partial period factor 6
0 1000 Basic
0 1010 Canteen Subsidy
0 1020 City Comp Allowance
0 1040 Production Incentive
0 1070 House Rent Allowance
5 Output
0 IT
4 PIT INNP Notice Pay Addition / Subtraction
4 PIT X023 P20 NOAB Gross input and storage
Input - IT
Table IT
A  Wage type                      Amount
2  /806 Partial period factor    100,000.00
2  1000 Basic                      4,500.00
2  1010 Canteen Subsidy              250.00
2  1020 City Comp Allowance        3,230.00
2  1040 Production Incentive       1,500.00
2  1070 House Rent Allowance       2,500.00
Output - IT
Table IT
A  Wage type                      Amount
2  /806 Partial period factor    100,000.00
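For context, in valuation with processing class 10 the /80n partial period factor is stored scaled by 100,000, so a factor value of 100000.00 means a full period and the wage type amount should carry over unchanged. A toy sketch of that arithmetic, using amounts from the log above:

```python
SCALE = 100_000  # /80n partial period factors are stored scaled by 100,000

def valuate(amount, factor):
    """Apply a /80n partial period factor to a wage type amount."""
    return round(amount * factor / SCALE, 2)

factor_806 = 100000.00  # full period, as shown in the log above
for wtype, amount in [("1000 Basic", 4500.00),
                      ("1070 House Rent Allowance", 2500.00)]:
    print(wtype, valuate(amount, factor_806))
```

Since the factor here is a full 100000.00, factoring cannot explain the wage types vanishing from the output table; the cause is more likely in the processing class 10 specification of those wage types.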