Work Schedule in HR Master Data Feeding Work Center!

Hi,
I have several questions about how the work schedule in HR master data relates to the work center for capacity planning.
1. How do we assign an employee to a work center? Is that done in HR master data, in the work center, or in some other transaction?
2. If we can assign an employee to a work center, how does the work schedule in HR master data feed the work center capacity? Is there a setting that controls whether the work center capacity is fed from the HR master data work schedule or from manual input directly on the work center?
3. If I can feed the HR master data work schedule to the work center, what do I need to do so that vacation or sick time, which is not in the work schedule in HR master data, is reflected in the work center?
Thanks,
Ting

Hi,
For starters, please take a look at the following discussion:
Logistics Work Center and HR Work center.
Credits: SDN members
Hope this helps.
Donnie

Similar Messages

  • BEGDA LOGIC IN MASTER DATA?

    We are going live on 14th Aug 2007. My payroll control record is from 01042007.
    1) What should be the BEGDA of the Work Schedule Rule in master data for IT0007 (Planned Working Time)?
    2) Similarly, in IT2006 (Absence Quota), the leave quota balance is credited on 1st Jan for some plants and on 1st July for others. So shall they take the absence quota as per their credit on 1st Jan 2007 and 1st July 2007 respectively?
    3) In IT2001 (Absences) we have four fields:

    Field     PERNR      BEGDA      ENDDA      AWART
    Type      N          N          N          C
    Length    8          8          8          4
    Example   000001     01012007   02012007   1020 (Casual Leave)

    My doubt here: say they have to maintain casual leave for employee 000001 for 2 days, from 01012007 to 02012007 - they will maintain it in the above fashion.
    OR
    Can we skip Infotype 2001, as we are maintaining the absence quota available to employees until July 1st?
    Thanks & Regards,
    Kaushal Rana

    1. It is better to make your control record begin date the go-live date; the work schedule rule date may be any date less than or equal to the go-live date.
    2. You can group those two plants differently and use different dates accordingly.

  • BPC Error while loading master data from SAP BW

    Hello Guru,
    I have BPC NW 7.5 installed on a BW server. I have started work on an application where I must load master data first; I am loading master data for cost center.
    I looked into the process chain "BPC: Import BW InfoObject master data" and found that a few processes display in GREEN and a few in RED. Is that correct or not? Then, as per some threads, I went to "BPC: Convert BW InfoObject master data" --> SOURCE_TYPE = IP and added one parameter, "FORMULA_FILE_NO". I then saved and activated the process chain.
    I created a transformation file in BPC and it was created successfully, but when I try to run the load package it throws an error.
    Q. Do I need the entire process chain, with all processes in GREEN, to get the master data into BPC?
    Q. Do I need to activate any BAdIs for this or not?
    I appreciate your help in advance.

    Hello Guru,
    Currently I am getting the error below while loading cost center master data from BW to BPC.
    Task name MASTER DATA SOURCE:
    Record count: 189
    Task name TEXT SOURCE:
    Record count: 189
    Task name CONVERT:
    No 1 Round:
    Info provider  is not available
    Application: ZRB_SALES_CMB Package status: ERROR
    Can anybody tell me if I have missed anything?
    Regards,
    BI NEW

  • Remove 1000 prefix before hier nodenames while loading master data

    Hi experts,
    I am loading the master data for the profit center dimension in BPC from 0PROFIT_CTR in BW. The attributes load fine, but the hierarchy node names start with 1000, which I want to remove. Unless they are removed, there will also be a problem when I next attempt to load the hierarchy. I used SUPPRESSCHARACTER=1000 before the *MAPPING section, but that does not work: the package detail shows that the records have been loaded, but when I check the dimension I find that the hierarchy node names are skipped entirely.
    Kindly note that I used the approach from the following thread:
    Conversion: Remove first 4 characters and then ParseINT.
    N.B.: the attributes don't have 1000 in front of them, so they should be loaded as-is; in fact, they are loading properly.
    I only need to remove the 1000 prefix from the hierarchy node names. I have also tried an IF statement in the mapping section:
    ID(1:4)=*STR(1000) THEN ID(5:14)
    but this approach is not working either: the hierarchy node names do not have a fixed length, so validation fails to reach the end of the record, and the 5:14 slice is probably ineffective because the lengths vary.
    Please help.

    Hi Vishal,
    As I mentioned, SUPPRESSCHARACTER=1000 in the *OPTIONS section did not work; it skips the records beginning with 1000 entirely. Note that I followed the approach in: Conversion: Remove first 4 characters and then ParseINT.
    Thanks,
    Debayan
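The variable-length node names are exactly why a fixed slice like ID(5:14) fails; the logic needed is "strip the prefix only if present." A minimal Python sketch of that idea (the helper name is invented for illustration):

```python
def strip_prefix(node_name: str, prefix: str = "1000") -> str:
    """Remove a fixed prefix from a hierarchy node name; leave other values untouched."""
    if node_name.startswith(prefix):
        return node_name[len(prefix):]
    return node_name

print(strip_prefix("1000NODE_A"))  # NODE_A
print(strip_prefix("ATTR1"))       # ATTR1 (attributes without the prefix pass through)
```

Because the check is on the prefix rather than on fixed positions, it works for node names of any length.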

  • Navigation attribute behaviour during time dependent master data read

    Hi All,
    My report is based on an InfoCube. There is one time-dependent master data characteristic, 0EMPLOYEE, in the cube.
    The report is run for a specific time period, for example at monthly or quarterly level.
    Many InfoObjects form part of the dimensions in the cube, and a few are part of the navigation attributes folder.
    In the transformation, some fields need to be read from master data depending on a time characteristic, e.g. 0CALMONTH or 0CALDAY. This rule is mapped correctly, and the data is stored correctly in the cube for the specific time period.
    Now, some navigation attributes read their data from the 0EMPLOYEE master data itself.
    My doubt is: will a navigation attribute read the latest record for the employee, or is it intelligent enough to read the right record for the specific time characteristic?
    With navigation attributes we don't have the option to specify a rule, as we do for normal objects in the transformation.
    What will the navigation attribute read: the latest record, or the specific record as per the time period?
    Thanks & Regards,
    Anup

    Hi Anup,
    Let me give you one small example of how a time-dependent attribute works.
    Say we have 0COSTCENTER as a time-dependent attribute of 0CUSTOMER. In your transaction data you have loaded values of 0CUSTOMER, and in the query you have used the 0CUSTOMER_COSTCENTER attribute.
    Transaction data:

    Tran. no.   Customer Number   Amount
    1           123               500
    2           125               450
    3           126               900

    Master data:

    Customer   Cost Center   Valid From    Valid To
    123        COST1         1st Jan       15th Jan
    123        COST2         16th Jan      30th March
    123        COST3         31st March    30th June

    In the example above, the Valid From and Valid To values came from the source system, and for this data you will have direct mapping in the transformation. The data will then reside in your system as shown.
    When you use 20th Jan as the key date, the cost center shown for customer 123 in the query will be COST2. So the assignment of time-dependent data happens at runtime only and has no impact on the underlying data.
    Regards,
    Durgesh.
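The runtime key-date lookup described above can be sketched in Python. The intervals and values are taken from the example; the year is an assumption added only to make the dates concrete:

```python
from datetime import date

# Time-dependent master data: (valid_from, valid_to, cost_center) for customer 123.
intervals = [
    (date(2011, 1, 1),  date(2011, 1, 15), "COST1"),
    (date(2011, 1, 16), date(2011, 3, 30), "COST2"),
    (date(2011, 3, 31), date(2011, 6, 30), "COST3"),
]

def attribute_on(key_date, intervals):
    """Return the attribute value whose validity interval contains key_date."""
    for valid_from, valid_to, value in intervals:
        if valid_from <= key_date <= valid_to:
            return value
    return None

# A key date of 20th Jan falls in the second interval.
print(attribute_on(date(2011, 1, 20), intervals))  # COST2
```

The point of the sketch: the stored transaction data never changes; only the interval containing the key date decides which attribute value the query shows.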

  • Master data for finance

    Can anyone please tell me what the master data tables for finance are?
    Cheers
    Troy

    Hi Troy,
                 The following tables are important from the FI/CO point of view:
    Master data
    SKA1               Accounts
    BNKA                   Bank master record
    Accounting documents // indices
    BKPF               Accounting documents
    BSEG               item level
    BSID                   Accounting: Secondary index for customers          
    BSIK              Accounting: Secondary index for vendors            
    BSIM              Secondary Index, Documents for Material            
    BSIP                  Index for vendor validation of double documents    
    BSIS              Accounting: Secondary index for G/L accounts
    BSAD              Accounting: Index for customers (cleared items)  
    BSAK              Accounting: Index for vendors (cleared items)    
    BSAS             Accounting: Index for G/L accounts (cleared items)
    Payment run
    REGUH          Settlement data from payment program
    REGUP          Processed items from payment program
    CO :
    TKA01               Controlling areas
    TKA02               Controlling area assignment
    KEKO               Product-costing header
    KEPH               Cost components for cost of goods manuf.
    KALO               Costing objects
    KANZ               Sales order items - costing objects
    Cost center master data
    CSKS               Cost Center Master Data
    CSKT               Cost center texts
    CRCO               Assignment of Work Center to Cost Center
    Cost center accounting
    COSP               CO Object: Cost Totals for External Postings
    COEP               CO Object: Line Items (by Period)
    COBK               CO Object: Document header
    COST               CO Object: Price Totals
    Company code
    T004               Chart of accounts
    BNKA               Master bank data
    T077S               Account group (g/l accounts)
    T009               Fiscal year variants
    T880               Global company data
    T014               Credit control area
    Fi document
    T010O               Posting period variant
    T010P               Posting Period Variant Names
    T001B               Permitted Posting Periods
    T003               Document types
    T012               House banks
    T007a               Tax keys
    T134               Material types
    T179               Materials: Product Hierarchies
    T179T               Materials: Product hierarchies: Texts
    TJ02T               Status text
    TINC               Customer incoterms
    TVFK               Billing doc types
    T390               PM: Shop papers for print control
    Thanks
    Kaushik

  • BPC NW 10.0 - Master data load

    Hey guys, have you tried the automatic master data load in BPC 10.0?
    Would it be possible to perform a delta load for master data?
    When you do a delta load, will the existing dimension members and properties be overwritten, or will the new members and properties just be added as another column/row?
    Please share your thoughts.
    Thanks a million

    By automatic, do you mean scheduled? If yes, you can schedule the 'Import Master Data Attrib and Text InfoObj' data manager package. The second prompt asks whether the process should 'Overwrite' or 'Update' (you can also modify the advanced script if you want to always update). Depending on which properties your transformation file populates, those will be updated with what is in the source InfoObject. Assume, for example, that the Mapping section of the transformation file is blank, you have ID '123' with description 'Test' in your dimension, and ID '123' with description 'Test - changed' in BW: when the update runs, the description will change from 'Test' to 'Test - changed'. All other IDs will be added to the dimension. Hope this makes sense.
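One way to picture the 'Update' behaviour described above is as a dictionary merge. This is only a sketch of the member/description semantics, not of the actual BPC implementation, and the IDs are the hypothetical ones from the example:

```python
# Hypothetical dimension content (ID -> description) before the load.
dimension = {"123": "Test", "456": "Old"}
# Hypothetical source InfoObject content in BW.
source = {"123": "Test - changed", "789": "New"}

# 'Update': matching members get the source description, new IDs are added,
# members absent from the source are left alone.
updated = {**dimension, **source}

print(updated)  # {'123': 'Test - changed', '456': 'Old', '789': 'New'}
```

Here the description of '123' changes to 'Test - changed', '789' is added, and '456' survives untouched, mirroring the example in the reply.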

  • SAP BPC 7.5 SP 7 - Master Data Load Detected duplicate member ID

    Hi Gurus, I have a requirement.
    I am loading master data for Cost Center. Initially I did not load the hierarchy; now I have started to load the master data with the hierarchy, but whenever I try to validate the transformation file it throws "Detected duplicate member".
    Let me show you what I wrote in the transformation file:
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    *MAPPING
    ID=ID
    *CONVERSION
    ID=Master_Data_Conversion.xls
    The conversion file Master_Data_Conversion.xls has the following formula (EXTERNAL -> INTERNAL) to remove the spaces:
    *js:%external%.toString().replace(/\s+/g,"")
    This is how I selected the data type: Master Data/Text from NW BW InfoObject.
    InfoObject selection: 0COSTCENTER
    Format: External Format
    Set Selection, 1st tab (Attribute): I only wanted Controlling Area 1000, so Controlling Area = 1000
    2nd tab (Hierarchy): Import Text Node - Yes; Hierarchy node - xxxxxxx; Version - empty; Member ID - first member; Level - blank
    3rd tab (Language): English
    4th tab (Attribute list): only Controlling Area is selected
    Note: when I load the master data without the hierarchy in the set selection, the load is successful, but when I include the hierarchy as in the 2nd tab, I get the following error:
    Master data (dealt by table level) has errors
    Detected duplicate member ID '201100'
    Also, the Cost Center master data in BW is time dependent, so it has Valid To and Valid From fields, which BPC cannot deal with.
    Please help

    @Vinay, let me tell you: when the BW master data is time dependent, you will get duplicate members, because the cost center IDs are compounded with the validity dates. BW does not raise an error, since it supports compounding, but BPC does not have that feature.
    This was raised with SAP, and they resolved the issue:
    SAP Note 1641529 - Loading Master Data gets duplicate error in special case
    When running the Data Manager package 'Loading Master Data from BW InfoObjects', a 'Duplicate Members are found' error may be reported in the following case:
    o In the source BW InfoObjects, there is master data with IDs of different lengths consisting entirely of numeric characters. And
    o sorting the members by length first and then by ID gives a different order than sorting them directly by ID. Take members '122' and '1102' for instance: in BW they are sorted as [122, 1102]; sorted directly by ID, the order is [1102, 122]. And
    o when running the package, the members are in both the 'Attribute' and 'Hierarchy' selection, and the option 'Filter members by Attributes or Hierarchies' is used. And
    o 'External Format' is selected when running the package.
    Other terms
    DM, Loading Master Data from BW, Duplicate Members
    Reason and Prerequisites
    It is a program error.
    Solution
    Please apply this note or upgrade to SP11.
    I hope this helps; otherwise let me know the full requirement so that I can provide some assistance.
    Also check how the master data looks in your BW, and how the hierarchy nodes and cost center nodes are arranged.
    Good Luck
    Vijay Sumith
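The ordering mismatch the note describes is easy to reproduce with a couple of made-up member IDs:

```python
members = ["122", "1102"]

# Per the note: BW orders purely numeric member IDs by length first, then by value.
bw_order = sorted(members, key=lambda m: (len(m), m))

# A plain string sort compares character by character instead.
string_order = sorted(members)

print(bw_order)      # ['122', '1102']
print(string_order)  # ['1102', '122']
```

Because the two orderings disagree, code that assumes one ordering while reading data produced under the other can report the same member twice, which is the "duplicate member" symptom the note fixes.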

  • Profit center field in Fund Center master data in Budget Control System (BCS)

    Hi to all,
    It is a client requirement that the Profit Center be put in the Fund Center master data. I have checked the standard SAP field status for Fund Center master data, but the Profit Center option is not available there.
    Could someone advise me how to add the Profit Center to the Fund Center, or whether there is a BAdI or customer enhancement available in SAP?
    Thanks & Regards,
    YK.

    Hi,
    I don't know the reason why the customer wants this, but in case the Profit Center of the funds center must be the same as the one in the Cost Center master data, they can create a derivation rule in FMDERIVE for this.
    The same applies if the Profit Center comes from the WBS element: in order to read the WBS element master data, you will have to create a new derivation step calling function module FMDT_READ_MD_WBS_ELEMENT.
    In case the Profit Center should come from another place, let me know.
    Just to inform you of the exit names: in FMMD0009 (Customer-Specific Screen Fields for Funds Centers), you have the following exits:
    EXIT_SAPLFMF2_001
    EXIT_SAPLFMF2_002
    I hope this helps.
    Best Regards,
    Vanessa Barth.

  • Delete Cost center master data

    Hi Gurus,
    I am facing issues when deleting cost center master data. Some cost centers cannot be deleted, and I get the following errors:
    Deletion not possible (usage is in table AUFK)
    Deletion not possible (usage is in table ANLZ)
    Deletion not possible (usage is in table COBRB)
    Please help me work out this issue.
    Warm Regards,
    Rahane D.

    Hello Dhananjay,
    Try to find the entries maintained with the same cost center you want to delete, using the find option.
    Sachin

  • Cost Center Master Data Issue for Hierarchy

    Hi,
    We are building a hierarchy on Cost Center and realized that, in the way Cost Center master data is stored in SPM, it gets concatenated with the Controlling Area. So CC 1234 becomes 1234_010, where 010 is the controlling area.
    While we understand the need to make the CC unique across several controlling areas, it impacts the hierarchy: in the hierarchy we only upload the CC, not the CA.
    Do you recommend concatenating CC & CA for the hierarchy as well, so that it matches the master data? Or is there another recommended way to go?
    Regards

    Hi Rohit,
    That was helpful. We will certainly take the concatenation route.
    However, in the invoice extract file we get CC & CA separately, in the XARCOCTR & XSACOARR fields respectively.
    When you say this is concatenated even in the transaction data, are you talking about the data AFTER it is loaded into SPM? Also, when data is sent out to DSE, will it have CC & CA concatenated?
    Regards,

  • Master record for cost center/activity type

    Hi all;
    How do I create a master record linking a cost center and an activity type?
    Thank you

    Hi
    You can create master data for a cost center / activity type combination.
    You link the two by planning the activity quantity for the combination of cost center and activity type in KP26. Then you do detailed planning of the costs for the cost center / activity type / cost element in KP06.
    I hope that is clear.
    Regards,
    Suraj

  • PowerPivot in SharePoint 2010 - Refresh Excel with Data Feed does not work

    Dear all,
    I created a PowerPivot chart from a SharePoint list exported to a data feed.
    I then published it to a trusted document library.
    The chart is working well but is not updating.
    If I go to the document library, choose Manage PowerPivot Data Refresh from the drop-down menu of the published Excel file, and force a refresh, it fails with the following error message:
    Errors in the high-level relational engine. The following exception occurred while the managed IDbConnection interface was being used: The remote server returned an error: (401) Unauthorized.. A connection could not be made to the data source with the DataSourceID
    of '3b4d4c28-909c-47d3-b4d6-07684f5e2ee9', Name of 'DataFeed mywebapp.domain TestPowerPivotDataFeed'. An error occurred while processing the 'testPowerPivot' table. The operation has been cancelled.
    On the SQL Server I ran a profiler and got the following:
    exec [DataRefresh].[AddRunDetails] @RunID=54,@DataSourceID=N'3b4d4c28-909c-47d3-b4d6-07684f5e2ee9',@FriendlyName=N'DataFeed mywebapp.domain.local TestPowerPivotDataFeed',@Source=N'http://portal.gonzofish.local',@Provider=N'Microsoft.Data.DataFeedClient',@Catalog=N'dev/scrum/Data
    Feed Library/TestPowerPivotDataFeed.atomsvc',@ConnectionString=N'Data Source=http://mywebapp.domain.local/dev/sc/Data%20Feed%20Library/TestPowerPivotDataFeed.atomsvc;Integrated Security=SSPI;Persist Security Info=false;Namespaces to Include=*;Service Document
    Url=http://mywebapp.domain.local/dev/sc/Data%20Feed%20Library/TestPowerPivotDataFeed.atomsvc',@Result=N'F',@RunStartTime='2014-06-04 15:33:04.590',@RunEndTime='2014-06-04 15:33:04.727',@Comments=N'Errors in the high-level relational engine. The following exception
    occurred while the managed IDbConnection interface was being used: The remote server returned an error: (401) Unauthorized..
    A connection could not be made to the data source with the DataSourceID of ''3b4d4c28-909c-47d3-b4d6-07684f5e2ee9'', Name of ''DataFeed mywebapp.domain.local TestPowerPivotDataFeed''.
    An error occurred while processing the ''testPowerPivot'' table.
    The operation has been cancelled.
    I followed the link http://technet.microsoft.com/en-us/library/hh487291%28v=office.14%29.aspx to set up the Data Refresh for the PowerPivot.
    Thanks in advance.
    Regards,
    Gonçalo

    Hi Gonçalo,
    The error message shows that the account did not have sufficient permission to access the data source (the SharePoint list data feed). PowerPivot data refresh is performed by Analysis Services server instances in the SharePoint farm, so we may need to grant the account running SQL Server Analysis Services (POWERPIVOT) Read permission on the SharePoint list we are attempting to refresh.
    In addition, the external data sources accessed during data refresh must be available, and the credentials you specify in the schedule must have permission to access those data sources.
    For more information, please take a look at the following article:
    Schedule a Data Refresh:
    http://technet.microsoft.com/en-us/library/ee210651(v=sql.110).aspx
    Hope this helps.
    Elvis Long
    TechNet Community Support

  • Abap code not working  - deleting based on master data table information

    Hi,
    I wrote a piece of code earlier which works, but during testing we found it would be hard for the support team to maintain, because it is hard-coded and users may add more code numbers in the future.
    Sample code:
    DELETE it_source WHERE /M/SOURCE  EQ 'USA'  AND
                           /M/CODENUM NE '0999' AND
                           /M/CODENUM NE '0888'.
    Now I have created a new InfoObject with master data so that the support people can maintain the source and code number manually.
    Master data table (XCODENUM is the key):
    XCODENUM    XSOURCE
    0999        IND01
    0888        IND01
    Now when I run this routine, all the data gets deleted:
    TABLES /M/PGICTABLE.
    DATA tab LIKE /M/PGICTABLE OCCURS 0 WITH HEADER LINE.
    SELECT * FROM /M/PGICTABLE INTO TABLE tab WHERE objvers = 'A'.
    IF sy-subrc = 0.
      LOOP AT tab.
        DELETE it_source WHERE /M/SOURCE EQ tab-xsource AND /M/CODENUM NE tab-xcodenum.
      ENDLOOP.
    ENDIF.
    But when I change the sign to EQ, I get the opposite of what I require:
    DELETE it_source WHERE /M/SOURCE EQ tab-xsource AND /M/CODENUM EQ tab-xcodenum.
    The cube table I want to extract from:
    /M/SOURCE    /M/CODENUM
    IND01        0999
    IND01        0888
    IND01        0555   <- should be deleted
    IND01        0444   <- should be deleted
    FRF01        0111
    I want to keep only the rows where /M/CODENUM is 0999 or 0888, and I also need the FRF01 row; the two rows marked above should be deleted.
    thanks
    thanks

    It is obvious why it deletes all the records: inside the loop, each master data row deletes every record with the same source but a different code number, so after all iterations every record for that source is gone.
    Anyway, to achieve your requirement, try this code:
    DATA:
          r_srce TYPE RANGE OF char5,  "Range table for Source
          s_srce LIKE LINE OF r_srce,
          r_code TYPE RANGE OF numc04, "Range table for Code
          s_code LIKE LINE OF r_code.
    s_srce-sign   = s_code-sign   = 'I'.
    s_srce-option = s_code-option = 'EQ'.
    * Populate the range tables from /M/PGICTABLE
    LOOP AT tab.
      s_code-low = tab-xcodenum.
      s_srce-low = tab-xsource.
      APPEND: s_code TO r_code,
              s_srce TO r_srce.
    ENDLOOP.
    SORT: r_code, r_srce.
    DELETE ADJACENT DUPLICATES FROM:
      r_code COMPARING ALL FIELDS,
      r_srce COMPARING ALL FIELDS.
    * Delete only records whose source is maintained but whose code is not allowed
    DELETE it_source WHERE /M/SOURCE IN r_srce AND /M/CODENUM NOT IN r_code.
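The intended filtering can be sketched in Python (sample data mirroring the cube rows in the question): keep a row when its source has no maintained rule at all, or when its (source, code) pair is explicitly allowed.

```python
# Allowed (source, code) pairs, as maintained in the master data table.
allowed = {("IND01", "0999"), ("IND01", "0888")}
sources_with_rules = {source for source, _ in allowed}

rows = [
    ("IND01", "0999"),
    ("IND01", "0888"),
    ("IND01", "0555"),
    ("IND01", "0444"),
    ("FRF01", "0111"),
]

# Keep a row if its source is unmanaged, or the exact pair is allowed.
kept = [row for row in rows
        if row[0] not in sources_with_rules or row in allowed]
print(kept)  # [('IND01', '0999'), ('IND01', '0888'), ('FRF01', '0111')]
```

This reproduces the requirement from the question: the 0555 and 0444 rows are dropped, while the FRF01 row survives because no rule is maintained for that source.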

  • Extraction of certain master data not working

    Hi, we are on BI 7 and some of the Business Content master data extractions are not working correctly. Some master data loads fine, so communication between the systems works, but some master data fields, e.g. 0CO_AREA, 0CHRT_ACCTS and more, do not. We get this with both 3.x and new DataSources.
    The data does get pulled through, but the load never goes green and keeps running. If one looks at the batch jobs (SM37) in the source system, the job keeps going until it is manually stopped.
    I am only experiencing this problem with data pulled from the production system - I am pulling it into both my test and production boxes, and it fails in both, whereas all other ECC systems are fine.
    In the monitor there is a warning; the monitor details are as follows:
    Overall status: Missing Messages or warnings
       Requests (messages):  Everything OK  (green)       Data request arranged  (green)
           Confirmed with:  OK (green)
       Extraction (messages): Missing messages (yellow)
           Data request received (green)
           Data selection scheduled (green)
           Missing message:  Number of sent records (yellow)
           Missing message: Selection completed (yellow)
       Transfer (IDocs and TRFC): Missing messages or warnings (yellow)
           Request IDoc: Application document posted (green)
           Info IDoc 1 : Application document posted (green)
           Data Package 1 : arrived in BW ; Processing : Selected number does not agree with transferred n (yellow)
           Info IDoc 2 : Application document posted (green)
    Processing (data packet) : Everything OK (green)
        Inbound Processing (32 records) : No errors (green)
           Update PSA (32 records posted) : No errors (green)
           Processing end : No errors (green)

    Thanks for the responses. The relevant IDocs work fine in transaction BD87. I looked in SM58 earlier during the day and there was no error; I just checked again (8 PM) and found some entries with error texts for the BW background user - 6 errors within 10 seconds, something to do with 'ERROR REQU... PG# IN 1'. I do not have authorization at the moment to check the specific requests, and I am not sure whether the master data extractions caused them.
    In the job log for the problematic jobs, only 1 IDoc gets created, unlike the successful ones that have 3 IDocs (not sure if this means anything).
