Impact of tables in Time & Payroll
Dear Experts,
Can anyone please explain which tables in Time & Payroll are affected by changes made in PA & OM?
Thanks & Regards,
Viswanath
The changes in OM will impact PA0001, so they affect both Time and Payroll.
PA changes may or may not have an impact; for example, a change to Communication type will not affect time or payroll.
Changes made to infotypes 0000, 0001, 0006, 0007, 0009, 0027, 0008, 0014, 0015, 0580-0586, and so on (depending on the country version) will have an impact on payroll & time.
Regards
...SAdhu
Similar Messages
-
Hello Gurus
I went through many threads regarding Time & Payroll integration, but I didn't get a clear picture of it. Can anyone tell me where exactly this integration is done? I know it is done in IT 0007 (if I am not wrong). Can someone explain it in detail?
Points assured..
Regards
Raj
Dear Raj,
Infotype 0007 does have some influence on 0008, but it is limited to the number of hours defined in the work schedule.
In the wider picture, the time wage types are validated through the ZL table in time evaluation and are then linked into the payroll schema.
Configuring infotype 0007 alone does not ensure that your Time and Payroll are integrated.
Regards
Naaga. -
Join same table 3 times, count from two other tables
Hi all!
I have 3 tables
RECORDS
Id, Record_Id
ITEMS
Id, Record_Id
ARTICLES
Id, Record_Id
I need to join the RECORDS table three times (R1, R2, R3) and get the count of items that R2 and R3 have, plus the count of articles that R3 has.
R2 must have items and R3 must have items; R3 may have articles. R1 may have multiple children, and R2 may have multiple children.
The solution I'm using is the following, but the DISTINCT makes it slow...
select r1 as ParentRecordId,count(distinct i1) as Volumes,count(distinct i2) as Numbers, count(distinct a1) as Articles
from (
select r1.id as r1,i1.id as i1,i2.id as i2,a.id as a1
from records r1 inner join records r2 on r1.id=r2.record_id
inner join records r3 on r2.id=r3.record_id
inner join items i1 on r2.id=i1.record_id
inner join items i2 on r3.id=i2.record_id
left join articles a on a.record_id=r3.id
) as sel
group by r1
order by 1
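A common way to speed this up is to pre-aggregate each child table first and then join the pre-computed counts, instead of running COUNT(DISTINCT ...) over the fanned-out five-way join. Below is a minimal sketch (in Python with SQLite, and with made-up sample data, since no DDL was posted); the exact join semantics may need adjusting to match your real filtering rules:

```python
import sqlite3

# Hypothetical sample data (the original post did not include DDL),
# modelling a three-level RECORDS hierarchy via the record_id column.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE records  (id INTEGER PRIMARY KEY, record_id INTEGER);
CREATE TABLE items    (id INTEGER PRIMARY KEY, record_id INTEGER);
CREATE TABLE articles (id INTEGER PRIMARY KEY, record_id INTEGER);

INSERT INTO records  VALUES (1, NULL);         -- R1 (top level)
INSERT INTO records  VALUES (2, 1);            -- R2, child of R1
INSERT INTO records  VALUES (3, 2);            -- R3, child of R2
INSERT INTO items    VALUES (10, 2), (11, 2);  -- two items on R2
INSERT INTO items    VALUES (12, 3);           -- one item on R3
INSERT INTO articles VALUES (20, 3);           -- one article on R3
""")

# Aggregate each child level separately, then join the pre-computed counts.
# Each ITEMS/ARTICLES row is counted exactly once, so DISTINCT is not needed.
query = """
WITH r2_items AS (      -- items attached to R2, keyed by the parent R1
    SELECT r2.record_id AS r1_id, COUNT(i.id) AS n
    FROM records r2 JOIN items i ON i.record_id = r2.id
    GROUP BY r2.record_id
),
r3_items AS (           -- items attached to R3, keyed by the grandparent R1
    SELECT r2.record_id AS r1_id, COUNT(i.id) AS n
    FROM records r3
    JOIN records r2 ON r3.record_id = r2.id
    JOIN items i   ON i.record_id = r3.id
    GROUP BY r2.record_id
),
r3_articles AS (        -- articles attached to R3, keyed by the grandparent R1
    SELECT r2.record_id AS r1_id, COUNT(a.id) AS n
    FROM records r3
    JOIN records r2 ON r3.record_id = r2.id
    JOIN articles a ON a.record_id = r3.id
    GROUP BY r2.record_id
)
SELECT ri2.r1_id AS parent_record_id,
       ri2.n AS volumes,
       ri3.n AS numbers,
       COALESCE(ra.n, 0) AS articles
FROM r2_items ri2
JOIN r3_items ri3 ON ri3.r1_id = ri2.r1_id
LEFT JOIN r3_articles ra ON ra.r1_id = ri2.r1_id
ORDER BY 1
"""
rows = conn.execute(query).fetchall()
print(rows)  # -> [(1, 2, 1, 1)]
```

Whether this is actually faster than COUNT(DISTINCT) depends on the real row counts and the indexes on record_id, so benchmark both forms against your data.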
Regards
Meelis
Please post DDL, so that people do not have to guess what the keys, constraints, Declarative Referential Integrity, data types, etc. in your schema are. Learn how to follow ISO-11179 data element naming conventions and formatting rules. Temporal data should use ISO-8601 formats. Code should be in Standard SQL as much as possible, not a local dialect.
This is minimal polite behavior on SQL forums. And thanks for no sample, too!
>> I have 3 tables <<
No, you have three identical decks of 1950's punch cards written in bad SQL.
There is no such thing as a generic, universal “id” in RDBMS. It has to be the identifier of something particular.
Magical columns appear in your query.
There is no such concept as "child" and "parent" in an RDBMS. That was network and hierarchical databases. We have referenced and referencing tables.
We do not use column positions in the ORDER BY clause; any change in the query used in the cursor will screw up everything.
Would you like to try again?
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
in Sets / Trees and Hierarchies in SQL -
Table Names for "Payroll Interface Tables"
How can I find out the exact table used in "Payroll Interface Tables" section?
Some of them are Configuration, Field Definition, Definition, Group, etc.
We had a consultant who configured the tables, but now we want to move the entire "Payroll Interface Tables" configuration to a UAT environment, and the DBAs want the exact table names so they can move/copy/export the tables.
We are on version 9.1
Thanks.
The Payroll Interface PeopleBook has a section called "Cloning a Payroll Interface Definition" and it lists most of the records that contain the interface definitions. Unfortunately, the cloning process is for cloning an existing definition to a new system ID within the same environment, not for migrating to a different environment. You will need to include some additional tables. In HCM 8.9, we used the following in Data Mover:
EXPORT PI_SYSTEM_TBL WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_SYSTEM_STAT WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_SYSTEM_LANG WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_PS_RECORD WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_PSREC_FLD WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_PS_REC_LANG WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_INSTANCE_TBL WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_INST_VALUE WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_INSTANC_LANG WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_PROCESS_TBL WHERE PI_PROCESS_ID >= '800000';
EXPORT PI_PROCESS_VAL WHERE PI_PROCESS_ID >= '800000';
EXPORT PI_PROC_TB_LANG WHERE PI_PROCESS_ID >= '800000';
EXPORT PI_PROC_VA_LANG WHERE PI_PROCESS_ID >= '800000';
EXPORT PI_FIELD_TBL WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_FIELD_XLAT WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_FIELD_LANG WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_GROUP_TBL WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_GROUP_LANG WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_DEFN_FILE WHERE PI_FILE_ID='XXX';
EXPORT PI_DEFN_RECORD WHERE PI_FILE_ID='XXX';
EXPORT PI_DEFN_R_LANG WHERE PI_FILE_ID='XXX';
EXPORT PI_DEFN_FIELD WHERE PI_FILE_ID='XXX';
EXPORT PI_DEFN_F_LANG WHERE PI_FILE_ID='XXX';
EXPORT PI_CONFIG_TBL WHERE PI_SYSTEM_ID='XXX';
EXPORT PI_CONFIG_FILE WHERE PI_CONFIG_ID IN ('YYY','ZZZ');
EXPORT PI_CONFIG_LANG WHERE PI_CONFIG_ID IN ('YYY','ZZZ');
EXPORT FILE_HANDLE_LNG WHERE FILE_HANDLE IN ('AA','BB');
EXPORT FILE_HANDLE_TBL WHERE FILE_HANDLE IN ('AA','BB');
If you're on a different release, you'll have to review the list to see if anything has changed, and set the literals according to your definitions.
Regards,
Bob -
Exclude a table from time-based reduction
Hi,
I'd like to exclude a table from time-based reduction. How can I do this? Is there a manual on how to do customizing in TDMS?
Regards
p121848
Thank you Markus for your annotation.
AUFK is technically declared as a Master Data table, but it stores orders. Standard TDMS provides a reduction of this file, and in the client copies we did via TDMS a lot of records disappeared when we selected time-based reduction.
Now we found out that some transactions such as OKB9 or KA03 refer to old internal orders. So we would like to maintain the customizing to exclude AUFK from reduction. But this is not possible in activity TD02P_TABLEINFO, because no changes can be made to tables that have transfer status 1 = Reduce.
You can manipulate the transfer status in table CNVTDMS_02_STEMP before getting to activity TD02P_TABLEINFO, but I wonder whether this is the way one should do it.
Any idea ?
Regards p121848 -
Need help- need to read a customer table from the payroll schema
Hi gurus,
I need to read an amount from a customer table in the payroll schema. Is it possible to input TABLE XXXX in the schema, or is it impossible to read from a customer table? Do I need to create a report to upload the amount into an infotype?
Thanks in advance for your support!
You should create your own custom operation (based on operation TABLE) to read your custom table from a payroll PCR.
If your custom table's name is longer than 5 characters, you may be required to use operations VAOFF and/or VALEN prior to your custom operation in the PCR. -
Hi We're using Oracle Applications 11.5.10.2
Is partitioning the RA_CUSTOMER_TRX_LINES_ALL table (as well as its indexes) by time safe?
It's getting huge, and we wanted to cut down on query time. Someone here told me that partitioning it could be bad because Applications doesn't expect that table or its indexes to be partitioned, and has queries tuned against it as a non-partitioned table with non-partitioned indexes.
Is this true?
Hi,
Please also see this related thread.
Partitioning GL_BALANCES
Please note that using partitioning requires an extra license.
Thanks,
Hussein -
Hi,
If some user dropped some object from the database, how can we know the time of the drop for recovery purposes?
Can someone shed some light on this?
Rohit
Rohit,
This will not be shown to you in any view. You need to mine the redo log/archive files to get this information. For this purpose, LogMiner is the tool that you should study. It can extract the DDL stored in both kinds of files, from which you can find the table drop time.
http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/logminer.htm#SUTIL019
HTH
Aman.... -
Table ZL (time processing) in payroll run
Dear Expert,
I have a question regarding processing time data (negative time recording) without time evaluation, i.e. time data processing is included in payroll (the time data processing sub-schema). Even though table ZL is processed, when the payroll run finishes I cannot find the ZL result in time cluster C2. I want to make a report based on this table. The question is: how can I get the ZL result in the situation stated above?
Super Thanks,
Dave
Hi Federico,
I totally missed that part. Super thanks for your help.
Best Regards,
David -
Does any one know table names for Payroll Elements that can be used in WF
How can one get the table names for Oracle Payroll Elements, and how can one use those table names to create a workflow that will automatically update the payroll balances?
I have created 3 elements for LOAN:-
One being Direct Payment as its Primary Classification---------------->Loan Amount
Two being Voluntary Deductions as its Primary Classification------->Recovery Amount
Three being Information as its Primary Classification--------------------->Reducing Balance.
I would like to create a workflow that will automatically update the DB items in the balances. So I want to know which tables get updated once a transaction is made in Payroll. I am not sure if it really can happen.
Please, people, I need someone to help me.
Thanking you all in advance,
Chetan
Theoretical and actual speeds for all types of data connections can differ due to several variables. I cannot tell you if my eSATA connection has ever achieved the theoretical maximum speed, because I have never measured it. However, I have compared transfer times against FireWire with large files, and it is noticeably faster (by at least 50%). Therefore I highly recommend having it if you need to transfer large files from one HDD to another.
If you have a recent MBP, you already have a Thunderbolt connection. I do not believe that there will be a Thunderbolt Express Card for older MBPs because they are not configured to accept that technology.
Ciao. -
How to view if my new job will impact existing tables/jobs/interfaces?
Hi SAP team,
I am planning to implement a job SAPF124 Automatic Clearing to run at 12pm every Thursday.
My question is:
How can I view in the system what is running at this time, to see whether my run will conflict with other transactions/jobs/interfaces/tables/performance, etc.?
Any ideas to pull such report?
Regards,
Roger
It can have an impact if someone is manually posting documents. Set this job up to run when there are no other jobs and no one is using the system.
-
Is there a way to create a temp table each time someone hits a procedure?
I am creating a procedure in Oracle 10g to take data from Oracle to be put in a Cognos report. The problem with creating a permanent table is that several people will be running this report at the same time, which means the procedure will be run each time someone runs the report. The permanent table will either drop the old data, or the data will get mixed up, which means different people's reports will not be correct. I tried putting a lock on the tables that would be released once one person was done running their report, but that didn't work. It was suggested we put the data directly into a cursor, but what happens when more than one person runs this procedure at the same time?
Any help on figuring out how to resolve this would be greatly appreciated.
Hi,
If there is one table and 100 people are inserting into the same table, I don't see a problem.
A few things you will have to take care of:
When you are retrieving the data from the table to show on the report, how will you identify which user it is for? Obviously, you store the user name too.
Now, this table will grow in size, so make sure that when the report is exited you delete only the rows pertaining to that user.
Over a period of time you will face more problems of slow response due to continuous deletes, because of something called the HWM (high-water mark). So make sure you also gather statistics for the table at regular intervals.
So:
1) Store the user while inserting, and retrieve data only for that user in the report.
2) Delete data on report exit, only for that user.
3) Gather statistics for the table at regular intervals.
4) Of course, build indexes for "good" response time, but benchmark whatever you are doing against your data.
Cheers!!!
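The numbered steps above can be sketched as follows (a minimal sketch using SQLite in place of Oracle; the report_stage table and its columns are made up for illustration). Note that in Oracle itself, a global temporary table (CREATE GLOBAL TEMPORARY TABLE ... ON COMMIT PRESERVE ROWS) gives each session its own private rows, which avoids both the manual per-user delete and the HWM problem.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical staging table for report rows, keyed by the reporting user.
conn.execute("""
CREATE TABLE report_stage (
    username TEXT NOT NULL,
    item     TEXT NOT NULL,
    amount   REAL NOT NULL
)""")

def fill_report(conn, user, rows):
    """Step 1: store the user name with every inserted row."""
    conn.executemany(
        "INSERT INTO report_stage (username, item, amount) VALUES (?, ?, ?)",
        [(user, item, amount) for item, amount in rows])

def read_report(conn, user):
    """Retrieve only this user's rows, so concurrent users don't mix."""
    return conn.execute(
        "SELECT item, amount FROM report_stage WHERE username = ? ORDER BY item",
        (user,)).fetchall()

def exit_report(conn, user):
    """Step 2: on report exit, delete only this user's rows."""
    conn.execute("DELETE FROM report_stage WHERE username = ?", (user,))

# Two users running the "report" at the same time do not clash.
fill_report(conn, "alice", [("a", 1.0)])
fill_report(conn, "bob",   [("b", 2.0)])
alice_rows = read_report(conn, "alice")   # only Alice's data
exit_report(conn, "alice")                # Bob's rows are untouched
bob_rows = read_report(conn, "bob")
print(alice_rows, bob_rows)
```

This keeps the single shared table, while the username predicate isolates each concurrent report run.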
Bhushan -
How to clear UNB table in the payroll result?
Hi,
We are getting the payroll error "The gross wages do not cover the negative offset that has been forwarded. Therefore, no gross-up is permitted" while running the gross-up.
I see the table "UNB - Unbalance table used for tax retrocalculation" in the last payroll result.
I think we are getting the above error because of this UNB table. Can anyone help me with how to clear this table?
When we run the regular payroll, no tax is being deducted.
Please help
Saurabh
Hi Arti,
Thanks for your reply!
Seeing your reply gives me some confidence that we can crack the existing problem; however, I'm still not clear on your answer... let me explain the problem:
1. The employee was given 350,000 through a taxable bonus wage type with the regular pay check.
2. Later it came to light that, out of the above amount, 75,000 was a gross-up amount.
3. So in the next payroll run, they entered -75,000 in the regular taxable bonus wage type and +75,000 in the gross-up wage type; they also deleted one IT0210 record of the GA tax authority which was wrongly created, and ran the payroll.
4. Since then, in the next payroll periods, no tax is being deducted, so we are creating IT0221 records with the tax wage type.
5. If we run the gross-up wage, we get the payroll error "The gross wages do not cover the negative offset that has been forwarded; therefore, no gross-up is permitted."
While running the regular payroll, an overpayment wage type is generated.
Now, I saw this UNB table is created and the below wagetype are there in the UNB table-
WT /5UT amount 0.00
WT 5430 amount -75000
WT 0200 amount 12500 ( Monthly salary)
WT 4530 amount 350000
Now I am thinking: since we are deducting tax through IT0221, if we clear this UNB table our problem might get resolved.
It will be a great help if you can tell me:
1. How can I check whether the claims process is implemented or not?
2. I have good HCM experience, but this claims process is very new to me; please guide me on how to do that.
Waiting for your reply, thanks in advance
Saurabh Garg -
ADRNR number in BSEC table ( One time customer document )
Hi
We are printing the customer master address details on AR forms using the ADRNR field, but this logic is not working for one-time customer documents in BSEC when the ADRNR field is not available (blank).
The ADRNR field for one-time customer documents in BSEC is not always updated, but the same table does get updated with ADRNR for some one-time customer documents.
I think the ADRNR field gets updated if the one-time customer document is generated from SD or IDoc posting, and does not get updated if the one-time customer document is generated directly on the FI side... but I am not 100% clear on the ADRNR update in BSEC for one-time customer documents.
Please help me with this if anybody has faced this problem before.
Thanks
Rishad
done
-
GLPCA table fetch - time out issue
Hi,
I have a situation in which one of my business users is facing a TIME_OUT in report RCOPCA02, in the piece of code below.
SELECT (tab_fields) FROM glpca
CLIENT SPECIFIED
PACKAGE SIZE packsize
APPENDING CORRESPONDING FIELDS OF TABLE i_glpca
WHERE rldnr IN rldnr
AND rrcty IN rrcty
AND rvers IN rvers
AND kokrs IN kokrs
AND rbukrs IN bukrs
AND ryear IN ryear
AND rassc IN rassc
AND hrkft IN hrkft "note 550972
AND sbukrs IN sbukrs "note 550972
AND shoart IN shoart "note 550972
AND sfarea IN sfarea "note 550972
AND racct IN racct
AND rprctr IN prctr
AND sprctr IN pprctr
AND poper IN poper
AND drcrk IN drcrk
AND activ IN activ
AND rhoart IN rhoart
AND rfarea IN rfarea
AND versa IN versa
AND eprctr IN eprctr
AND afabe IN afabe
AND rmvct IN rmvct
AND docct IN docct
AND docnr IN docnr
AND stagr IN stagr
AND rtcur IN rtcur
AND runit IN runit
AND refdocct IN refdocct
AND refdocnr IN refdocnr
AND werks IN werks
AND rep_matnr IN repmatnr "RDIP40K020663
AND rscope IN rscope "RDIP40K020663
AND rclnt IN rclnt. "dirty trick to mislead DB-optimizer
packsize is set to 1000
But when I try to execute the same report with the same variant as the user, I get the output in less than 2 minutes. The maximum runtime of a program is set to 900 seconds (which, I understand, is the same across all users). The TIME_OUT issue is faced by that particular user only.
My SAP system is SAP_APPL46C. Latest support package implemented is SAPK-91404INDMIS.
What can be the issue? Please suggest.
Regards,
Chinmay
You should run an ST05 trace in parallel for both you and the user with the problem, and see what the differences are in the execution plan.
You cannot get a good answer with the data you have provided.
Please see "Please Read before Posting in the Performance and Tuning Forum" before posting.
Rob
Edited by: Rob Burbank on Sep 30, 2011 9:15 AM
Maybe you are looking for
-
Very Slow VDI Client Login Time
Hi, My environment contains two Hyper-V Servers for DCs, Connection Broker, RD-Web, and two Hyper-V servers as virtualization host to thin clients. All Hyper-V servers are only 35% utilized and all client VMs don't have a performance issue. After set
-
Is there an app where I can monitor my childs texting/instagram?
Is there an iphone app where I can monitor my childs texting/instagram messages?
-
Can anyone help me with sending a form via email to an email recipient? Much appreciated.
-
Re : Read the text of vf03 header
hi i want to read text in vf03 header. To read that text using READ_TEXT i want to consider sales organization distribution channel. From which table i have to check this condition. Thanks mani
-
Hello! I have successfully deployed a jsp application to oas 4.0.8.2 but when I'm trying to insert a new record in greeks(I'm from Greece), when I do commit I get ?????. In the database(8.1.6) it is saved this way too! If I do it from web-to-go web s