Is PI recommended for initial data import / mass upload to MDM?

We have a requirement as below:
There are around 500,000 records in the customer master in SAP R/3.
This data needs to be imported into SAP MDM (after this step, SAP R/3 will be removed/scrapped from the landscape).
After importing to MDM, the records need to be updated in SAP ECC.
So the requirement is something like:
SAP R/3 -> MDM -> ECC.
The R/3-to-MDM import is a one-time, initial activity, to be done for around 5 lakh (500,000) records.
To migrate this much data, is it recommended to use PI for both the inbound and outbound (w.r.t. MDM) scenarios?
thanks in advance.
Ganesh

Hi,
In general it's not recommended to use PI for mass uploads,
but I guess in this case the only issue will be the upload to SAP ECC,
not the export from the old R/3.
So I guess you could use PI to export the data from R/3 to a file (or files)
and use those files with LSMW in ECC.
Regards,
Michal Krawczyk
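
If you take the file route, a 500,000-record extract usually has to be split into smaller batches before LSMW processes it. A minimal sketch of that splitting step (plain Python, run outside SAP; the file names and the 10,000-record batch size are illustrative assumptions):

# Split a large flat-file extract (e.g. 500,000 customer records)
# into smaller files that LSMW can process batch by batch.
BATCH_SIZE = 10_000  # records per batch file (assumed; tune as needed)

def split_extract(source_path, prefix):
    """Write BATCH_SIZE-line chunks of source_path to prefix_NNN.txt."""
    batch_no = 0
    out = None
    with open(source_path, encoding="utf-8") as src:
        for line_no, line in enumerate(src):
            if line_no % BATCH_SIZE == 0:
                if out:
                    out.close()
                batch_no += 1
                out = open(f"{prefix}_{batch_no:03d}.txt", "w", encoding="utf-8")
            out.write(line)
    if out:
        out.close()
    return batch_no

if __name__ == "__main__":
    n = split_extract("customer_master_extract.txt", "lsmw_batch")
    print(f"wrote {n} batch files")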

Similar Messages

  • DOCUMENT DATE FOR BACK DATA TO BE UPLOADED

    Hello experts,
    I have a purchasing cube with transaction data from last year.
    I have PO NUMBER as a characteristic and 2LIS_02_SCL as the datasource for purchasing.
    Now I want the date on which each PO was created, and there is a DOCUMENT DATE field available in the datasource.
    I have added 0DOC_DATE as a characteristic to the cube, but instead of the date, # appears in the Doc Date column of the report.
    When I check the data in the PSA, the document date value is available and correct.
    Please help me load the document date for the back data.
    How do I do this so that the transaction data is not affected and I also get the document date for the previous PO numbers?
    Please explain in detail.
    Will be very thankful.

    Hi,
    Delete one request of that datasource from the cube
    and then reconstruct it.
    Check whether the data for the document date is now coming through;
    if it is, then delete all the requests and reconstruct them.
    The change will be reflected after that.
    Edited by: obaid shaikh on Mar 17, 2011 1:47 PM

  • Create Attachments for Job in PP01 (Mass Upload)

    Dear Experts,
    I have a requirement to attach documents to jobs in PP01.
    I need to do it through a program, since it is a mass upload.
    Could you please suggest any FMs?
    What will be the object type of a Job?
    Regards,
    Srilekha

    Hi Srilekha,
    Maybe a bit too late, but can you check this:
    Massive GOS Upload
    Hope this helps.
    Kumarpal

  • Use of ODI in Data Import scenario

    Hi,
    We are contemplating using ODI for a data import scenario. The flow goes something like this:
    1. A user-specified flat file or XML, with a mapping between the file columns and the base table columns, serves as the input.
    (Note that the file column names could be anything, which is why we use a mapping.)
    2. The file data is stored in a stager table and certain validations are run on it.
    3. The data is then imported into the base tables.
    I assume we cannot use ODI for step 1, as the file columns are not fixed and can vary; we would need to interpret the data from the mappings programmatically. (Is there a way to do this in ODI?)
    If we use ODI for step 3, to import data from the stager to the base tables:
    - If we have a million records to be imported, how performant is ODI? Do we need to invoke ODI in batches (of a few thousand) to improve performance?
    Thanks in advance,
    Raghu

    Hi Jont,
    Thanks for your reply.
    Here is an example of the mapping that we use:
    Flat file columns:
    AccName
    AccLoc
    Mapping (specified by the user at run time):
    AccName --> (maps to) Account.Name
    AccLoc --> (maps to) Account.Location
    The user maps the file columns to the final target entity fields (like Account.Name) as above.
    Since we have to store this data in an intermediate staging table, we also have a fixed internal mapping, like:
    Account.Name --> (maps to) AccStager.Name
    Account.Location --> (maps to) AccStager.Location
    where AccStager.Name is the staging table field name.
    Thus, by composing these two sets of mappings, we store the file data in the staging table.
    Hope this is clear.
    Thanks,
    Raghuveer
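
    The two-level mapping described above composes mechanically: file column -> target field, then target field -> stager column. A minimal sketch of that composition step (plain Python, outside ODI; the dictionary names and the sample row are illustrative assumptions, not ODI APIs):

    # Compose the user-supplied mapping (file column -> target field)
    # with the fixed internal mapping (target field -> stager column)
    # to decide where each file value lands in the staging table.
    user_mapping = {            # supplied by the user at run time
        "AccName": "Account.Name",
        "AccLoc": "Account.Location",
    }
    stager_mapping = {          # fixed, known to the import program
        "Account.Name": "AccStager.Name",
        "Account.Location": "AccStager.Location",
    }

    def to_stager_row(file_row):
        """Map one parsed file row onto staging-table column names."""
        return {
            stager_mapping[user_mapping[col]]: value
            for col, value in file_row.items()
            if col in user_mapping
        }

    print(to_stager_row({"AccName": "Acme Corp", "AccLoc": "Boston"}))
    # {'AccStager.Name': 'Acme Corp', 'AccStager.Location': 'Boston'}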

  • Solution for mass data import to VB02

    Hello,
    Do you have any procedure for mass data import into the VB02 exclusion list for the ZSD1 3rd list key
    (sales department / distribution channel / material / customer)?
    We don't want to do this manually - more than 600 records.
    Do you have any idea how to import it, or any easier way?

    Hi,
    You can import the records with either LSMW or a BAPI, whichever you are comfortable with.
    In LSMW we have 14 steps; I hope you know them.
    Thanks,
    Surya

  • FDMEE Import error "No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'

    Hi,
    We are having trouble importing one ledger, 'GERMANY EUR GGAAP'. It works for Dec 2014, but when trying to import data for 2015 it gives an error.
    The import error shows: "RuntimeError: No periods were identified for loading data into table 'AIF_EBS_GL_BALANCES_STG'."
    I tried all the knowledge documents from Oracle Support, but no luck. Please help us resolve this issue, as it is occurring in our production system.
    I also checked all the period settings under Data Management > Setup > Integration Setup > Global Mapping and Source Mapping, and they all look correct.
    Also, it is only happening for one ledger; all the other ledgers are working fine without any issues.
    Thanks

    Hi,
    There are some support documents related to this issue.
    I would suggest you have a look at them.
    Regards

  • Exit for Foreign trade data(import tab) on ME22N

    In purchase order transaction ME22N, the requirement is to copy the contents of the office of entry field (EIKP-ZOLLA) to the EKKO-INCO2 field whenever the office of entry changes. The office of entry field is on the import tab of the purchase order screen. When it changes, the standard PO customer exits such as EXIT_SAPMM06E_007 and EXIT_SAPMM06E_006 do not get triggered, nor does the BAdI ME_PROCESS_PO_CUST. The foreign trade exit EXIT_SAPLV50E_005 does get triggered, but in this exit only the completeness of the data is checked, and EKKO values cannot be modified from it.
    Is there any alternative exit or BAdI that is triggered when data on the import (foreign trade data) tab of the PO changes? Is there any mechanism to change EKKO-INCO2 (the screen value) from any of the other exits (i.e., the exits triggered on data entry on the foreign trade tab)?

    Transaction Code - ME22N                    Change Purchase Order
    Exit Name           Description
    LMEDR001            Enhancements to print program
    LMELA002            Adopt batch no. from shipping notification when posting a GR
    LMELA010            Inbound shipping notification: Transfer item data from IDOC
    LMEQR001            User exit for source determination
    LMEXF001            Conditions in Purchasing Documents Without Invoice Receipt
    LWSUS001            Customer-Specific Source Determination in Retail
    M06B0001            Role determination for purchase requisition release
    M06B0002            Changes to comm. structure for purchase requisition release
    M06B0003            Number range and document number
    M06B0004            Number range and document number
    M06B0005            Changes to comm. structure for overall release of requisn.
    M06E0004            Changes to communication structure for release purch. doc.
    M06E0005            Role determination for release of purchasing documents
    ME590001            Grouping of requisitions for PO split in ME59
    MEETA001            Define schedule line type (backlog, immed. req., preview)
    MEFLD004            Determine earliest delivery date f. check w. GR (only PO)
    MELAB001            Gen. forecast delivery schedules: Transfer schedule implem.
    MEQUERY1            Enhancement to Document Overview ME21N/ME51N
    MEVME001            WE default quantity calc. and over/underdelivery tolerance
    MM06E001            User exits for EDI inbound and outbound purchasing documents
    MM06E003            Number range and document number
    MM06E004            Control import data screens in purchase order
    MM06E005            Customer fields in purchasing document
    MM06E007            Change document for requisitions upon conversion into PO
    MM06E008            Monitoring of contr. target value in case of release orders
    MM06E009            Relevant texts for "Texts exist" indicator
    MM06E010            Field selection for vendor address
    MMAL0001            ALE source list distribution: Outbound processing
    MMAL0002            ALE source list distribution: Inbound processing
    MMAL0003            ALE purchasing info record distribution: Outbound processing
    MMAL0004            ALE purchasing info record distribution: Inbound processing
    MMDA0001            Default delivery addresses
    MMFAB001            User exit for generation of release order
    MRFLB001            Control Items for Contract Release Order
    AMPL0001            User subscreen for additional data on AMPL
    No of Exits:         35
    USER EXITS
    http://www.sap-img.com/abap/a-short-tutorial-on-user-exits.htm
    http://www.sap-img.com/abap/what-is-user-exits.htm
    http://www.sapgenie.com/abap/code/abap26.htm
    http://wiki.ittoolbox.com/index.php/HOWTO:Implement_a_screen_exit_to_a_standard_SAP_transaction
    http://www.easymarketplace.de/userexit.php
    http://www.sappoint.com/abap/userexit.pdf
    http://www.sap-img.com/ab038.htm
    http://help.sap.com/saphelp_46c/helpdata/en/64/72369adc56d11195100060b03c6b76/frameset.htm
    http://expertanswercenter.techtarget.com/eac/knowledgebaseAnswer/0,295199,sid63_gci982756,00.html
    BAPI
    http://help.sap.com/saphelp_erp2005/helpdata/en/73/7e7941601b1d09e10000000a155106/frameset.htm
    http://support.sas.com/rnd/papers/sugi30/SAP.ppt
    http://www.sts.tu-harburg.de/teaching/sap_r3/ABAP4/abapindx.htm
    http://members.aol.com/_ht_a/skarkada/sap/
    http://www.ct-software.com/reportpool_frame.htm
    http://www.saphelp.com/SAP_Technical.htm
    http://www.kabai.com/abaps/q.htm
    http://www.guidancetech.com/people/holland/sap/abap/
    http://www.planetsap.com/download_abap_programs.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/c8/1975cc43b111d1896f0000e8322d00/content.htm
    BADI
    /people/thomas.weiss/blog/2006/04/03/how-to-define-a-new-badi-within-the-enhancement-framework--part-3-of-the-series
    /people/thomas.weiss/blog/2006/04/18/how-to-implement-a-badi-and-how-to-use-a-filter--part-4-of-the-series-on-the-new-enhancement-framework
    http://esnips.com/doc/e06e4171-29df-462f-b857-54fac19a9d8e/ppt-on-badis.ppt
    http://esnips.com/doc/43a58f51-5d92-4213-913a-de05e9faac0d/Business-Addin.doc
    http://esnips.com/doc/10016c34-55a7-4b13-8f5f-bf720422d265/BADIs.pdf
    http://esnips.com/doc/1e10392e-64d8-4181-b2a5-5f04d8f87839/badi.doc
    http://esnips.com/doc/365d4c4d-9fcb-4189-85fd-866b7bf25257/customer-exits--badi.zip
    http://esnips.com/doc/3b7bbc09-c095-45a0-9e89-91f2f86ee8e9/BADI-Introduction.ppt
    http://help.sap.com/saphelp_470/helpdata/EN/eb/3e7cee940e11d295df0000e82de14a/frameset.htm
    Reward if useful.
    Minal

  • Data import for users of forms created with LiveCycle Designer

    Hello,
    I have seen several posts regarding data import for forms created with LiveCycle Designer, but nothing that helps with what I am trying to accomplish. I can create a data connection and import information into a form, but what I would like to do is import data and then send the PDF to a user for completion. There are a few data elements that I have available, and the rest of the information comes from the user. The problem I run into is that once I create a data connection, the PDF is ALWAYS looking for the source file for that data. I simply want to prepopulate some fields and send the form to the various users for completion. Any help would be greatly appreciated.
    Thanks!

    Which type of data connection are you trying to create:
    XML Schema, Sample Data File, or WSDL?
    Creating either of the first two types will only create the schema and will never import any data into the PDF.
    If you create a WSDL connection, you can certainly import data (i.e., prepopulate data) into your PDF and forward it for users to review and fill.
    If I misunderstood your question, please clarify.
    Nith

  • Data Import from Excel for Items

    I have used the data import for items in the past to import prices for existing items. I am using this for the first time in SAP Business One version 9.0, where there are new fields due to the additional currencies. In running a test with two items and two price list codes, code 1 and code 10, the import gets stuck in a loop and keeps giving me errors on line 1 with an invalid price code.
    I want to import two price lists, Price List 1 and Price List 10. For example, I want price list 1 to be $10.00 and price list 10 to be $5.00.
    In the import from excel I have the following fields
    A  Item Number
    B  Price List Code
    C  Unit Price - Primary Currency
    D  Primary Currency
    E  Unit Price - Additional Currency 1
    F  Additional Currency 1
    G  Unit Price - Additional Currency 2
    H  Additional Currency 2
    I  UoM Code
    J  Price List Code
    K  Unit Price - Primary Currency
    L  Primary Currency
    M  Unit Price - Additional Currency 1
    N  Additional Currency 1
    O  Unit Price - Additional Currency 2
    P  Additional Currency 2
    Q  UoM Code
    I am using the following columns
    A item number
    B 1
    C 10.00
    J 10
    K 5.00
    This is the way I used this function in SAP versions 2007 and 8.8.
    Any suggestions?
    Dennis

    Hi Gordon,
    I tried what you suggested, but it does not work. I tried with two items; when I run the import, SAP gets stuck in a loop displaying error lines. The only way to stop it is to restart SAP.
    I attached a screenshot of the error.
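
    For reference, a file with the column layout described in the question can be generated programmatically rather than typed into Excel. A minimal sketch (plain Python writing a CSV; the file name, the sample items, and the assumption that the unused additional-currency and UoM columns D-I and L-Q stay empty are illustrative, not a verified Business One template):

    import csv

    # One row per item: columns A-I carry the first price list entry,
    # columns J-Q the second. Only A, B, C, J and K are filled here,
    # matching the 2007/8.8-style usage described above.
    items = [
        ("A1000", "10.00", "5.00"),
        ("A1001", "12.50", "6.25"),
    ]

    with open("item_prices.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for item, price_list_1, price_list_10 in items:
            row = [""] * 17           # columns A through Q
            row[0] = item             # A: item number
            row[1] = "1"              # B: price list code 1
            row[2] = price_list_1     # C: unit price, primary currency
            row[9] = "10"             # J: price list code 10
            row[10] = price_list_10   # K: unit price, primary currency
            writer.writerow(row)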

  • Automatic import of mass data into the regional structure - program RSADRLSM01

    Hello,
    Regarding the automatic import of mass data into the regional structure via program RSADRLSM02: we are working on replacing our third-party provider.
    That is why we need to delete all the data imported from the city file, and the references, before importing the new provider's data.
    We have checked the SAP procedure defined in SAP Note 132948 and the mentioned program RSADRLSM01, but we need to confirm whether the regional structure recorded in the old documents saved in the system could be impacted if program RSADRLSM01 is executed.
    Any experience with this kind of process?
    Thanks in advance.
    Juan Carlos

    Since no one has replied - why not just try this in your test system and see what happens?

  • BPC10 - Data manager package for dimension data export and import

    Dear BPC Experts,
    Need your help.
    I am trying to set up a data manager package for the first time, to export dimension master data from one application and import it into another application (both have the same properties).
    I created a test data manager package from Organize > Add Package with process chain /CPMB/EXPORT_MD_TO_FILE, and clicked Add.
    In the Advanced tab of each task there is some script logic already populated. Please find attached the details of the script logic written under each of the tasks: MD_SOURCE, CONVERT, and TARGET.
    I have not made any changes to the script inside the tasks.
    But when I run the package, I select the dimension 'Entity'; at the second prompt it asks for a transformation file, and the system automatically adds the file ...\ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls.
    I have not changed anything there.
    At the next prompt it asks for an output file, and it won't allow me to enter the file name.
    Not sure how to proceed further.
    I shall be grateful if someone could guide me, from your experience, on how to set up a simple data manager package for master data export from a dimension. Should I update the transformation file for the import file, and the output file, in the script in the Advanced tab? How and what transformation file should be created and linked to the data manager package for export/import?
    What are the steps to be executed to run the package for exporting master data from a dimension and importing it into another application?
    Thanks in advance for your guidance.
    Thanks and Regards,
    Ramanuj
    =====================================================================================================
    Details of the tasks
    Task : APPL_MD_SOURCE
    PROMPT(DIMENSIONMEMBER,%DIMENSIONMEMBERS%,"Please select dimension","Please select members",%DIMS%)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
    PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
    TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
    TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
    TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
    TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
    Task : EXPORT_MD_CONVERT
    (the same script is populated under this task)
    Task : FILE_TARGET
    (the same script is populated under this task)
    ================================================================================

    1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
    2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
    Another option is to generate the profiles in the target system, but for that your configuration has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
    Cheers,
    Julius

  • Recommended throughput for Oracle data warehouse

    Hi, I know up front this is going to be a vague question... but I'm trying to determine the approximate I/O bandwidth for a data mart server. Right now we're hosting 3 or 4 different marts on it, but that number is going to increase.
    Oracle's DW "2 day" class recommends starting either from the maximum throughput of user queries, or from batch windows. Right now the server is barely used for end-user queries, as we haven't yet implemented a BI tool to give users easy access (that's underway right now), so I find it hard to base anything on that. However, it's on the way, and I'm in charge of the BI tool (OBIEE). I'm having nightmares that we get OBIEE deployed and our queries end up taking 5 minutes each to get answers. Right now, on the system basically by myself, if I do a simple "select sum(amount) from fact_ledger", where fact_ledger is a 1 GB table (with 40 million rows), it takes almost a full minute to run. It feels like I could add this up by hand and get an answer faster... and this certainly doesn't compare with other Oracle marts/DWs I've worked on in the past.
    From a batch window standpoint, all I can say is that it feels really, REALLY too slow to me. Right now, jobs that start with a 40 million row table, join it to 6 or 7 other small tables (looking up surrogate keys), and write to a non-logged, non-indexed output table take over 2.5 hours to complete. To me this should be a 15 minute job.
    We've asked IT to do a "root cause analysis" of why performance is so bad - but as part of that, the architecture group wants something more concrete than "it just feels way too slow". So does anyone have some general guidelines they can provide? Our detailed info would be:
    - three marts, each of which has a fact table around the 30-60 million row level
    - a simple "join a 30 million row staging table to look up surrogate keys" job, writing its results, is taking 2.5+ hours
    - we expect at some point to have maybe 50-100 users running queries concurrently (spread across the marts)
    - users will be performing both canned and ad-hoc analysis, and they are high-level business users who aren't going to be happy waiting 2 minutes for a simple answer
    My starting point was to swag this as requiring 6 CPUs or so, which (according to Oracle's best practice docs) would indicate needing somewhere between 1.2 GB/s and 2.4 GB/s of throughput. I'm assuming that if it takes almost a full minute to read a 1 GB table, our I/O is currently 60 to 120 times too slow. Does that make sense?
    Thanks and sorry for the lack of details...we just don't know yet.
    Thx,
    Scott

    Why don't you start by taking an AWR report covering those two hours, so you can see what the bottleneck in your system is?
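
    The back-of-the-envelope arithmetic in the question is easy to check directly. A minimal sketch (plain Python; the input values are the ones quoted above, not new measurements):

    # Implied scan throughput from the observed full-table read.
    table_gb = 1.0        # size of fact_ledger, as quoted
    scan_seconds = 60.0   # observed time for "select sum(amount)"
    observed_gbps = table_gb / scan_seconds   # ~0.017 GB/s

    # Target range from the Oracle sizing guidance cited in the question.
    target_low_gbps, target_high_gbps = 1.2, 2.4
    print(f"observed:  {observed_gbps * 1024:.0f} MB/s")
    print(f"shortfall: {target_low_gbps / observed_gbps:.0f}x"
          f" to {target_high_gbps / observed_gbps:.0f}x")
    # Prints roughly 17 MB/s and a 72x-144x shortfall, in line with
    # the "60 to 120 times too slow" estimate in the question.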

  • What are the cell boundary recommendations for a 2.4 GHz data survey in 802.11n?

    What are the cell boundary recommendations for a 2.4 GHz data survey in 802.11n?

    You can use 802.11n on 2.4 GHz; you will have to use AES encryption or an open SSID. On 5.0 GHz, channel bonding gives 40 MHz-wide channels and doubles the data rate, plus a little more, as it can reuse the edge frequencies.
    Basically you have up to MCS15 (144 Mbps) at the top end per channel. Don't bond 2.4 GHz, as there are not enough non-overlapping channels.
    As for the low end, it all depends. I would certainly disable the 802.11b data rates as a minimum, but that still leaves very large cells and potential for interference. Generally I would look at 24 Mbps and switch everything off below that, especially if you have very few 802.11b clients.

  • IPTC and XMP: is this data important for organizing photos?

    Hi, I keep encountering these two acronyms (IPTC and XMP) while working with my photos in Aperture. Are these data important? Should I be doing anything with them? Essentially what I am doing, after importing photos from iPhoto, is filing them into Folders, Projects, and Albums (some Smart). Thanks

    Astechman,
    EXIF (which you didn't ask about) stands for Exchangeable Image File Format, and those fields are generally physical attributes of your photos as recorded by your camera: things like date/time, aperture, shutter speed, etc. Some people like to think they should be able to change those, but that doesn't make any sense (except when your camera's clock is wrong).
    IPTC (International Press Telecommunications Council) data is used in digital media as metadata where the author can put things like keywords, location narratives, copyright notices, and the photographer's name; i.e., things that have nothing to do with the camera or the physical attributes of the photo, but with the subject/content of the photo or the photographer.
    XMP builds on IPTC and is often associated with a "sidecar" file, to which a digital asset management system (DAMS) saves extra metadata. (Aperture is a DAMS, but it does not use sidecar files; it keeps such data within the library.)
    As for what you should be doing with it -- that's up to you.  How much metadata do you want associated with your photos?  Fill in those fields and just ignore the rest.  I tend to fill in keywords and copyright, and that's about it, but there are many other fields that may be of interest to you.
    nathan

  • Incorrect dates imported for wage display

    Hi All
    I am working on an issue with wrong wages shown in a report; these wages are imported from memory and from database PCL2, through the RT cluster.
    For only a few employees, the dates in VERSC and RT import as previous dates, say 2004, even if we specify 2007.
    I checked the exports and found that program RPCALCU0 exports the values to database PCL2, but as far as I have checked, the values for the current period get exported, as I passed that on the selection screen.
    I am not able to find out anything, as this is happening with only a few PERNRs, say 2-3.
    Has anyone faced this problem before? If you have any suggestions that can help me out, I shall be really grateful.
    Thanks
    Gaurav

    My favorite tool for changing dates is "A Better Finder Attributes".
