Manual Check Numbers in Data Migrations

Hi Experts,
I have to migrate outgoing bank payments made via check. All these checks are manual and have manual check numbers.
On the front end there is a checkbox (Manual) in the Payment Means window, under the Check tab, which allows us to enter manual check numbers when checked.
The issue is that this field is not exposed in the DI API and hence is not available in the DTW template.
As such I am not able to migrate these manual check number values.
Please suggest how to overcome this problem.
Thanks in advance.

Please see my post of 3/17/2009 re: DTW and Outgoing Payments.
The ManualCheck field doesn't show up when you first open DTW, but it is there and you can map to it in the 'Checks' part of the Vendor Payment DTW destination.
-Cindy Lange

Similar Messages

  • Reuse voided manual check

    Dear,
    I have a problem with a manual check.
    I first used FCH5 to assign a manual check number to a payment document, but later used FCH8 to cancel the payment and void the check number.
    Under these circumstances, I would like to know whether this check number can be reused for another payment document, since physically the check itself has no problem.
    BR,
    Ivy

    Hello
    Check if this helps:
    Deleting Check Information for a Payment Run
    If the print program crashes, the print management file will be incomplete with respect to the following information:
    - the information stored in the check information file up to the time of the crash
    - the check forms generated by the print program
    Since there is also no way of verifying whether the data generated so far is consistent, you have no option but to delete the check information for the payment run.
    To do this, choose Environment → Check information → Delete → For payment run. On the next screen, enter the data for the payment run in question and choose Execute.
    In addition, you must also delete the generated print jobs from the print management files. After this you can rerun the print program.
    Deleting Information on Manually Created Checks
    You can delete information about a manually created check in the following cases:
    - you specified an incorrect check number when entering the check number
    - a check payment was unexpectedly not made
    To do this, choose Environment → Check information → Delete → Manual checks.
    Deleting Information on Unused Voided Checks
    If an unused check was incorrectly voided, you must also delete the information stored for this check as follows:
    Choose Environment → Check information → Delete → Voided checks. On the next screen, enter the paying company code and the relevant check numbers.
    Resetting Voided Check Data
    If an issued check was mistakenly voided, you can reset the data (void reason code, user, and so on) contained in the check information so that the check is designated as a valid check again and can be cashed by the recipient.
    Choose Environment → Check information → Delete → Reset data.
    Resetting Data on Cashed Checks/Extracts
    If the wrong checks were selected for manual cashing, or if an update was carried out when creating an extract although the file should only have been created for test purposes, the incorrect check information can be corrected as follows:
    Choose Environment → Check information → Delete → Reset data.
    Regards

  • Treasury Data Migration

    Friends,
    One of our clients has decided to use SAP Treasury and Risk Management, and I have been asked to lead data migration in this area. I would like to know the master and transaction data generally needed for SAP Treasury Management. Also, please let me know the various ways of migrating data into SAP TRM, e.g. BAPIs, Transaction Manager, LSMW, etc.
    Regards,
    Amey

    Hello Amey,
    Please find list of steps which might be helpful for data migration.
    Legacy Data Transfer
    1. Legacy data transfer always happens on a key date.
    2. The procedure is different for securities and OTC transactions, because they deal differently with positions on a key date.
    3. For securities, we simply need to copy the positions with all values as of the key date.
    4. For OTC, the entire transaction is copied, as there is no explicit position management in the Transaction Manager.
    Customizing the Legacy Data Transfer:
    You primarily need customizing for the update types that the flows of the legacy data transfer business transactions are to carry. You can find the settings under Treasury and Risk Management → Transaction Management → General Settings → Tools → Legacy Data Transfer → Flow Data → Update Types.
    The update types are divided into valuation-area-independent and valuation-area-dependent ones; the independent ones relate only to securities. While the legacy data transfer is usually not intended to update the general ledger, you must nevertheless mark most of the update types as posting-relevant and maintain an account determination in customizing. This is necessary to ensure that the data in position management is complete and, in particular, that the account assignment reference transfers can be performed correctly.
    Legacy Data Transfer for OTC:
    Step 1: Existing OTC transactions are entered the same way as new transactions, either manually or via the existing BAPIs. Use BAPI_FTR_FD_CREATE and BAPI_FTR_FXT_CREATE for creating FD and forex transactions respectively. There are similar BAPIs for each product; you can get the list in transaction BAPI under the node Financials → Financial Supply Chain Management. Use transaction FTR_BAPI to test-run the Treasury BAPIs. You may define a separate transaction type for legacy transactions so that settlement or posting release is not necessary.
    Step 2: Now change the status of all the historical flows created in Step 1 to POSTED without actually generating accounting documents. To do this, use transaction TBB1_LC. This is similar to transaction TBB1, but unlike TBB1 it affects all valuation areas, without the option of affecting only the operational area. You can see the flows posted by TBB1_LC in the posting journal (TPM20), without any reference to accounting documents.
    Step 3: The historical flows do not yet explain the book value of the OTC transactions, so you must first enter the historical valuation results. This is done using TPM63C, which you can find under Treasury and Risk Management → Transaction Management → General Settings → Tools → Legacy Data Transfer → Flow Data → Enter Valuation Area Dependent Data for MM. After entering the valuation results, use TPM63 to transfer the values to internal position management.
    Legacy Data Transfer for Futures & Listed Options:
    Step 1: Same as for OTC.
    Step 2: Same as for OTC.
    Step 3: Use transaction TPM63D (valuation area dependent data for futures). You must establish the relationship between the lot and the transaction on which it is based via the transaction number.
    Legacy Data Transfer for Securities:
    For a securities position, it does not matter whether it arose in the past through a single purchase or a series of business transactions. The legacy data transfer therefore builds up each position through one business transaction on the transfer key date.
    Step 1: Before entering the business transactions, first create the master data of the security positions, i.e. their position indicators. I) Use TPM61A, or the menu path Treasury and Risk Management → Transaction Management → General Settings → Tools → Legacy Data Transfer → Position Data → Enter Position Information for Securities. II) Then create the position indicators using TPM61.
    Step 2: Now enter the business transactions. I) Valuation area independent data for securities → for the external position. II) Valuation area dependent data for securities → for the internal position. Note that you need to use the transaction for setting the effects of the update type on the position in order to change the positions. If you want to enter lot positions with the legacy data transfer transactions, use the item number, and ensure that the business transaction generating the position and the one carrying the valuation results have the same item number. If you have built up lots via the normal procedures, you must specify the corresponding transaction number in the transaction column when entering the valuation-area-dependent business transaction.
    Step 3: Now generate the business transactions using TPM63. You can initially build only the valuation-area-independent data (the external position) by checking the "only valuation area independent data" option in TPM63. The external position can be checked in the securities account cash flow, i.e. the external securities position, in TPM40.
    We also created an LSMW for each product.
    Regards,
    Jain

  • Recreate post-upgrade deferred data migration job?

    Hello everyone,
    after setting up our new OMS 12c (2-system-approach, i.e. both the old one and new one are running at the moment), I switched a number of test machines to the new OMS12 and ran the accrued and deferred data migration jobs.
    The deferred data migration job "metrics" failed, and - unfortunately not checking the documentation first - I killed the failed job because I hoped this would enable me to restart it.
    Oracle Documentation:
    "To rerun a failed job, click the status icon to reach the Job Run page. On the Job Run page, click Edit. On the Edit <JobName> Job page, click Submit to rerun the job."
    Now I can neither rerun the job nor does the system seem capable of recreating it. Is there any possibility of getting that job back, or at least running it manually? The collected historical metric data was the one reason we decided on the 2-system approach in the first place, and having to go through the whole process of setting up OMS 12 again seems excessive.
    Every help much appreciated!
    Edited by: user13085691 on Feb 13, 2012 2:50 AM

    Run the following as SYSMAN to manually update the status to success, in case you have run the job manually and verified that it completed:
    BEGIN
      gc_post_ugc.setPostUgcRepoJob(p_comp_name => POST_UGC.G_METRIC_COMP,
                                    p_job_name  => POST_UGC.METRIC_REPOS_JOB,
                                    p_status    => POST_UGC.G_PASSED_STATUS);
    END;
    /
    Please use the following process next time, in case you kill a job by mistake; it will update the status correctly in the Post Upgrade Console.
    ECM Config DDMP job
    If the job is in a failed state on the Post Upgrade Console page, run the following SQL to check whether the DB session is still running behind it:
    SELECT COUNT(1) FROM v$session WHERE module = 'OracleEnterpriseManager.ConfigDataMigration';
    If the value returned is greater than zero, the DB session for the job is still running; otherwise it is not.
    If the DB session for this job is not running, follow these steps to restart the job:
    1. Delete the job from the EM job system:
       mgmt_jobs.delete_job('POST_UGC.CONFIG_REPOS_JOB','SYSMAN');
    2. Update the status to null:
       UPDATE EM_POST_UGC_REPO_JOBS SET status = NULL WHERE comp_name = 'CONFIG';
    3. Start the job from the DDMP tab on the Post Upgrade Console page.
    Metric DDM job
    If the job is in a failed state on the Post Upgrade Console page, run the following SQL to check whether the DB session is still running behind it:
    SELECT COUNT(1) FROM v$session WHERE module = 'OracleEnterpriseManager.MetricDataMigration';
    If the value returned is greater than zero, the job session is still running; otherwise it is not.
    If the DB session for this job is not running, follow these steps to restart the job:
    1. Delete the job from the EM job system:
       mgmt_jobs.delete_job('POST_UGC.METRIC_REPOS_JOB','SYSMAN');
    2. Update the status to null:
       UPDATE EM_POST_UGC_REPO_JOBS SET status = NULL WHERE comp_name = 'METRIC';
    3. Start the job from the Post Upgrade Console page.
    --------------------------

  • Data Migration for Open Purchase Order

    Hi, All,
    Does anyone know how to count the volume of open purchase orders? What is the normal strategy for the data migration and cut-over stage?
    My client wants to know how many open purchase orders exist in the legacy system, to determine whether to do manual or automatic data migration. If manual, how is it done? If automatic, how? All material, vendor, and plant numbers are different between the systems. How do we track them and match new against old?
    Thank you very much

    JC,
    Sounds a bit early to be making decisions about the realization phase.  It doesn't sound like you have finished the Blueprinting phase yet, much less the testing phase.
    Anyhow, in my experience I typically use LSMW (Legacy system migration workbench) to load MM master data (material masters), Inventory (WIP, RM, FG, etc) Purchasing Master data (Vendors, Purchase Info Records, Source Lists, Quota Arrangements), and Purchasing transactional documents (POs, PurReqs, Scheduling Agreements, etc).  Depending on the complexity and volume of data, it  may be necessary to write custom programs to load the data.  You will find this out during your requirements gathering.
    It is uncommon but possible to load all of these data manually.  I have never run across a client that wants to pay a consultant's hourly rate to sit at a terminal pecking away loading master data, so if the client intends to have his own users enter the data manually, the project manager should ensure that qualified, TRAINED client employees are available for this data entry.  I did once help manually with a portion of a conversion, of sales credits, but there were only about 30 SD docs to load.  I did this the evening before go-live day, while I was waiting for some of my LSMW projects to complete in the background.
    A good opportunity to 'practice' your data loads is right after you have completed your development and customization, and you have gotten the approval from the client to proceed from the pilot build to the full test environment.  Once you have moved your workbench and customization into the client's test environment, but before integration testing, you can mass load all, or a substantial portion of your conversion data into the qual system.  You can treat it like a dry run for go-live, and fine tune your processes, as well as your LSMW projects.
    Yes, it is good practice to generate comparisons between legacy and SAP even if the client doesn't ask for it. For Purchase orders on the SAP side, you could use any of the standard SAP Purchasing reports, such as ME2W, ME2M, ME2C, ME2L, ME2N.  If these reports do not meet the requirements of the client, you could write a query to display the loaded data, or have an ABAPer write a custom report.
    You didn't ask, but you should also do comparisons of ALL loaded data - including master data.
    It sounds like you are implying that the client wants YOU to extract the legacy data.  For an SAP consultant, this is not very realistic (unless the legacy system is another SAP system).  Most of us do not understand the workings of the myriad legacy systems.  The client is usually expected to produce one or more legacy system technical experts for you to liaise with.  You normally negotiate with the technical expert about every facet of the data migration.  In addition, you will liaise with business users, who will help you and the implementation team to logically validate that the final solution (a turnkey SAP production system, fully loaded with data) will meet the client's business needs.
    Finally, you mentioned tracking the mapping of master data between legacy and SAP.  There are many ways to do this.  I normally try to get the legacy person to do the conversion on his end, e.g., when he gives you the load file, he has already translated the master data and inserted the SAP-relevant values into the file.  If this is not possible, I usually use MS Access databases to maintain a master map, and I perform the mapping on a PC.  If your data package is small, you can probably get by with MS Excel or similar.
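    The master map described above, a lookup table that translates legacy keys into SAP numbers before the load file reaches LSMW, can be sketched outside SAP in a few lines of Python. The field names, vendor numbers, and file layout below are invented purely for illustration:

```python
import csv
import io

# Hypothetical master map: legacy vendor key -> SAP vendor number.
# In practice this map might live in MS Access or Excel, as suggested above.
vendor_map = {
    "V-1001": "0000100234",
    "V-1002": "0000100235",
}

def translate_rows(legacy_csv, mapping):
    """Replace the legacy vendor key in each row with its SAP number.

    Rows whose key has no mapping are collected separately so they can be
    investigated, instead of being silently loaded with a bad key.
    """
    mapped, unmapped = [], []
    for row in csv.DictReader(io.StringIO(legacy_csv)):
        sap_no = mapping.get(row["vendor"])
        if sap_no is None:
            unmapped.append(row)
        else:
            row["vendor"] = sap_no
            mapped.append(row)
    return mapped, unmapped

legacy_file = "vendor,amount\nV-1001,150.00\nV-9999,80.00\n"
mapped, unmapped = translate_rows(legacy_file, vendor_map)
print(mapped)    # the translated rows, ready for the load file
print(unmapped)  # V-9999 has no mapping and needs investigation
```

    Routing unmapped rows to a separate list instead of loading them is one way to get the legacy-vs-SAP comparison recommended above for free.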
    Good Luck,
    DB49

  • Questions on data migration....

    Hi all,
    I'm currently facing the problem that my colleagues from the data migration team are about to migrate legacy data to the new system using BDC/LSMW.
    1.) Is it true that the system bypasses any checks concerning number ranges (even if the range doesn't exist at all)? Example: They're about to migrate master inspection characteristics. Let's assume that the old MIC had the number 1234 and the new MIC is supposed to have the number 0001_1234. Do I need to create and assign the external number range?
    2.) If I don't need the number range described under 1.) and I do the migration accordingly (again NO number range created and/or assigned). What would happen if I access this MIC in the future and try to use it in an inspection plan. Would the system reject the usage because there's no number range or would it work normally?
    Any help on this issue is greatly appreciated. Please let me know as well, if I need to post it in a different forum.
    Regards,
    Bobby

    Hi Boban,
    Defining number ranges before you transfer the master data is a must. For numeric keys use the SAP standard number range; for alphanumeric ones use an external range.
    BR
    Shekhar

  • Data Migration from Legacy System in IS-U

    Hi All,
    We are going to implement a new IS-U project, and I have a question about LSMW (Legacy System Migration Workbench). I have some IS-U conversion objects and need to know whether data migration is possible for them. Please tell me how to find these objects and how to determine whether data migration is possible.
    Objects are like.,
    1. Accounts
    2. Actuate Reports
    3. Business Partner Relationships
    4. Active Campaigns/Campaign Content/Dispositions
    5. Connection Object
    6. Contacts
    7. Contracts
    8. Opportunities
    9. Payment Arrangement History
    10. Payments
    11. Premises
    12. Rate Changes
    13. Security Deposits
    these are few and there are some more..,
    Thanks in Advance,
    Sai.

    Hi Ram,
    Use transaction code EMIGALL. It will ask for a company code; by default the company is SAP. If you enter with company SAP you will get all the objects. Then go to the menu IS-U Migration → User Handbook, which will give you a detailed idea.
    Also Check the following Procedure
    You can find detailed documentation of EMIGALL in SAP itself. Use Transaction EQ81 to display it. It provides all the concepts and procedures to work with EMIGALL.
    Here are some points about EMIGALL :
    1. It migrates data business object by business object.
    2. It uses the direct input technique.
    3. It has more than 100 IS-U objects.
    And the steps for implementation go like this:
    1) Create a user specifically for migration, with all the authorizations related to the migration workbench, BASIS, and IS-U.
    2) Create your own company in EMIGALL. There is a default company called SAP.
    3) Company SAP contains all the business objects.
    4) Figure out which business objects you need, and copy those business objects into your company from the standard company SAP.
    5) Each object contains one or more structures, and each structure can contain one or more fields. The relation goes like this:
    Object ---> Structure ---> Field
    6) Define field rules for each required field of the object. Mark "not required" for the fields you don't need.
    7) After the field rules for a given object are set, generate the load report, i.e. the actual direct input program that will migrate the data. This program is generated on the basis of the field rules you set.
    8) After the load report is generated, prepare an input file (import file) for the migration. The import file must match the structure provided by SAP and must be in binary format. SAP provides the structure of the file according to your configuration. You have to write your own data conversion program (in any language) for this task.
    9) Take the import file as input and migrate the data using the generated load program.
    10) Finally, check the migration statistics and the error log.
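    The "data conversion program (in any language)" from step 8 simply has to emit records matching the structure EMIGALL generates for your configuration. As a rough sketch, assuming a made-up fixed-width layout (the real IS-U structures are different and come from SAP), such a writer could look like this in Python:

```python
# Illustrative only: real EMIGALL import files follow the structure SAP
# generates for your company's objects; these field names and widths are invented.
LAYOUT = [("partner", 10), ("name", 20), ("city", 15)]

def to_fixed_width(record, layout):
    """Render one record as a fixed-width line, padding or truncating each field."""
    return "".join(str(record.get(field, "")).ljust(width)[:width]
                   for field, width in layout)

rows = [
    {"partner": "4711", "name": "John Example", "city": "Berlin"},
    {"partner": "4712", "name": "A name that is far too long", "city": "Hamburg"},
]

lines = [to_fixed_width(r, LAYOUT) for r in rows]
for line in lines:
    print(repr(line))
```

    The same idea scales to whatever record layout your generated load report expects; the point is that every line has the exact width and field order of the target structure.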
    Thanks and Regards,
    Jyotishankar Dutta
    Message was edited by:
            Jyotishankar Dutta

  • I had a data migration done from old iMac to new iMac, how do I use that file to find my old email contacts, Safari bookmarks, etc ?

    I recently had a data migration from old iMac to new iMac 11.2
    How do I use that data file to get old email contacts, Safari bookmarks, etc ?

    FireWire cables are dirt cheap and that's how you should be doing the migration. You might have to start over, wiping the new machine's HD, and restoring the software as described in the Everything Mac manual that came with it. See Best Practices and Setup new Mac for more details.

  • Issue in data migration of PO contract (Updating table A215)

    Hi  All,
    I am doing data migration for PO contracts through BDC and want to update table A215.
    I can't see the fields of A215 on the PO contract transaction screen (i.e. transaction ME33K).
    Can anybody tell me how to load the data into A215 through BDC?
    Regards,
    Sagar

    How is this A215 table maintained manually? Through ME33K, or a different transaction like MEK1?
    You say you are doing a BDC; are you using an SAP-supplied report, or did you record the maintenance transaction yourself in SHDB?

  • Data Migration from CSV file to Database Table.

    Hi,
    I have checked a few answered threads on data migration issues but haven't found the solution yet.
    I am reading data from a CSV file into an internal table, but the complete row is coming in under one field of the internal table.
    How can I get the values into different fields of the internal table from the CSV file?
    Thanks & Regards.
    Raman Khurana.

    Hi,
    If you are using GUI_UPLOAD, you might have missed setting has_field_separator to 'X':
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename            = 'C:\File.csv'
        filetype            = 'ASC'
        has_field_separator = 'X'  "<= set this to 'X'
      TABLES
        data_tab            = itab.
    (Note that has_field_separator splits at the tab character; for comma- or semicolon-delimited files you may need to read the whole lines and SPLIT them at the delimiter yourself.)
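    For comparison, this is what a field separator does conceptually. A quick Python illustration (outside SAP, with an arbitrarily chosen delimiter) of why the whole row lands in one field until the line is split:

```python
line = "MAT-001;Widget;42"

# Without splitting, the entire row lands in one field -- the poster's symptom.
one_field = [line]

# Splitting at the delimiter distributes the values into separate fields,
# which is what a field separator setting achieves during upload.
fields = line.split(";")

print(one_field)  # ['MAT-001;Widget;42']
print(fields)     # ['MAT-001', 'Widget', '42']
```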
    Regards,
    Manoj Kumar P

  • Data Migration From Peoplesoft , JDEdwards To SAP.

    Hi,
    This is Kiran. We are doing data migration from PeopleSoft and JD Edwards to SAP. On the SAP side it involves master data tables related to customer, vendor, and material, and metadata tables related to SD, MM, and FI. As SAP consultants we identified fields from the above tables and marked them as required, not required, or mandatory. The PeopleSoft and JD Edwards folks came up with the same from their side. Now we want to map the fields. As I am new to data migration, can anybody suggest the steps involved and how to do data mapping in a migration? Thanks in advance.
    Thanks
    Kiran.B

    Hi Kiran,
    Good... Check out the following documentation and links
    Migrating from one ERP solution to another is a very complex undertaking. I don't think I would start with comparing data structures. It would be better to understand the business flows you have currently with any unique customizations and determine how these could be implemented in your target ERP. Once this is in place, you can determine the necessary data unload/reload to seed your target system.
    A real configuration of an ERP system will only happen when there is real data in the system. The mapping of legacy system data to a new ERP is a long difficult process, and choices must be made as to what data gets moved and what gets left behind. The only way to verify what you need to actually run in the new ERP environment is to migrate the data over to the ERP development and test environments and test it. The only way to get a smooth transition to a new ERP is to develop processes as automatic as possible to migrate the data from the old system to the new.
    Data loading is not a project that can be done after everything else is ready. Just defining the data in the legacy system is a huge horrible task. Actually mapping it to one of the ERP system schemas is a lesson in pain that must be experienced to be believed.
    The scope of a data migration project is usually a fairly large development process with a lot of proprietary code written to extract legacy data, transform and load the data into the ERP system. This process is usually called ETL (extract, transform, load.)
    How is data put into the ERP?
    There is usually a painfully slow data import facility with most ERP systems. Mashing data directly into the usually undocumented table schema is also an option, but must be carefully researched. Getting the data out of the legacy systems is usually left to the company buying the ERP. These export-import processes can be complex and slow; sometimes specialized ETL tools can help, and sometimes it is easier to use whatever your programmers are familiar with, such as C, shell, or Perl.
    An interesting thing to note is that many bugs and quirks of the old systems will be found when the data is mapped and examined. I am always amazed at what data I find in a legacy system; usually the data has no relational integrity. Note that it does not gain much more integrity once it is placed in an ERP system, so accurate and clean data going in helps to create a system that can work.
    The Business Analysts (BAs) that are good understand the importance of data migration and have an organized plan to migrate the data, allocate resources, give detailed data maps to the migrators (or help create the maps) and give space estimates to the DBAs. Bad BAs can totally fubar the ERP implementation. If the BAs and management cannot fathom that old data must be mapped to the new system, RUN AWAY. The project will fail.
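    The ETL shape described above, extract from the legacy system, transform (clean and map), then load into the target, can be reduced to a minimal Python sketch. Every field name and cleaning rule here is invented for illustration, not taken from any real system:

```python
# Minimal ETL sketch: extract -> transform -> load.
# All names and rules are illustrative, not any real system's schema.

def extract():
    """Stand-in for pulling rows from the legacy system."""
    return [
        {"cust_no": "C01", "name": "  ACME Corp ", "country": "US"},
        {"cust_no": "C02", "name": "Globex", "country": ""},
    ]

def transform(rows):
    """Clean and map rows; route records failing validation to an error list."""
    good, errors = [], []
    for row in rows:
        row = dict(row, name=row["name"].strip())
        if not row["country"]:          # the relational-integrity gaps warned about above
            errors.append(row)
        else:
            good.append(row)
    return good, errors

def load(rows):
    """Stand-in for the slow ERP import facility; here it just collects rows."""
    target = []
    target.extend(rows)
    return target

good, errors = transform(extract())
loaded = load(good)
print(loaded)   # cleaned rows that passed validation
print(errors)   # rows needing manual attention before reload
```

    Keeping the three stages separate makes it easier to rerun only the transform when mapping rules change, which tends to happen constantly during a migration project.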
    Check these links
    http://pdf.me.uk/informatica/AAHN/INFDI11.pdf
    http://researchcenter.line56.com/search/keyword/line56/Edwards%20Sap%20Migration%20Solutions/Edwards%20Sap%20Migration%20Solutions
    http://resources.crmbuyer.com/search/keyword/crmbuyer/Scm%20Data%20Migration%20On%20Peoplesoft%20Peoplesoft%20Data%20Migration/Scm%20Data%20Migration%20On%20Peoplesoft%20Peoplesoft%20Data%20Migration
    Good Luck and Thanks
    AK

  • SAP Legacy Data Migration Strategy/Process

    Hi Friends
    I would request that somebody let me know the process of legacy data migration into SAP. What steps do I have to follow while doing legacy data migration? My questions are:
    1) How to upload GL balances from legacy to SAP
    2) How to upload Vendor and Customer open items into sap
    3) Asset balances How to upload
    4) what is the use of Migration clearing accounts.
    Kindly provide if any documents for legacy data migration into sap.
    Thanks in advance
    Rao

    Dear Rao,
    Just check the below link; it explains how to upload the balances:
    http://www.saptechies.com/sap-pdf-books-download/SAP_Go_live_strategy11248141773.pdf
    Then follow the procedure below.
    First create 5 dummy GL codes (offset accounts). These GL codes should be zero at the end of uploading the balances.
    1  GL Offset Account            Dr
         Asset Offset Account            Cr
         AP Offset Account               Cr
         Sales                           Cr
         Other GLs                       Cr
    2  Cash                         Dr
       Consumption                  Dr
       Asset Offset Account         Dr
       AR Offset Account            Dr
       Material Offset Account      Dr
       Other GLs                    Dr
         GL Offset Account               Cr
    3  AP Offset Account            Dr
         Vendor 1                        Cr
         Vendor 2                        Cr
         Vendor 3                        Cr
    4  Customer 1                   Dr
       Customer 2                   Dr
       Customer 3                   Dr
         AR Offset Account               Cr
    5  Fixed Asset 1                Dr
       Fixed Asset 2                Dr
         Asset Offset Account            Cr
         Acc. Depreciation Account      Cr
    6  Stock Inventory Account      Dr
         Material Offset Account         Cr
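    A useful sanity check on the scheme above is that every document balances (debits equal credits) and that each dummy offset account nets to zero once all documents are posted. A small Python check with made-up amounts (positive = debit, negative = credit; the account names follow the posting plan above, but the figures are invented):

```python
# Each document from the upload scheme above, with illustrative amounts.
documents = [
    # 1: GL offset against asset/AP/sales openings
    [("GL Offset", 1300), ("Asset Offset", -400), ("AP Offset", -600), ("Sales", -300)],
    # 2: cash/asset/AR debits against the GL offset
    [("Cash", 200), ("Asset Offset", 800), ("AR Offset", 300), ("GL Offset", -1300)],
    # 3: vendor open items
    [("AP Offset", 600), ("Vendor 1", -600)],
    # 4: customer open items
    [("Customer 1", 300), ("AR Offset", -300)],
    # 5: fixed assets with accumulated depreciation
    [("Fixed Asset 1", 700), ("Asset Offset", -400), ("Acc. Depreciation", -300)],
]

# 1) Every document must balance (debits equal credits).
for doc in documents:
    assert sum(amount for _, amount in doc) == 0

# 2) Every offset account must net to zero across all documents,
#    which is exactly the "dummy GL codes should be zero" condition above.
totals = {}
for doc in documents:
    for account, amount in doc:
        totals[account] = totals.get(account, 0) + amount

offsets = {a: t for a, t in totals.items() if "Offset" in a}
print(offsets)  # every offset account nets to zero
```

    If any offset account is nonzero after the upload, one of the documents was posted with wrong amounts and can be located by checking the documents it appears in.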
    Regards,

  • SAP Data Migration Framework

    Dear All, I read at TechEd that the SAP data migration framework involves using BusinessObjects Data Services. SAP has provided pre-built jobs with IDoc as the target.
    All is fine, but I came to know that we can use Data Services as a staging area and modify our data in Data Services before loading it into the SAP ECC system.
    I would like to know more about using Data Services as a staging area. Does that mean that when we run jobs the data is stored in an IDoc (XML file) on DS, or does it mean we can run DS in a demo mode where we don't load data but keep it within DS memory? This is different from profiling and storing execution/DQ statistics; this is storing real data in DS, modifying/analyzing it, and then loading it into SAP ECC.
    Can someone explain whether we can run DS in demo mode without loading data?
    Thanks,

    Hi - you can get more information on using Data Services for data migration at http://service.sap.com/bp-datamigration.  Also, check out the blogs:
    /people/ginger.gatling/blog/2010/08/27/data-migration-installing-data-services-and-best-practice-content
    /people/ginger.gatling/blog/2011/03/31/data-migration-in-5-minutes

  • Credit card Open sales order data migration

    Hi All,
    My query for today is to learn from your experience how to handle legacy data migration to SAP for open transactions carrying credit card information. In legacy we have open orders with credit card information; some are approved, some rejected, and some are yet to receive a response. What is the best practice for migrating these open orders? Should we reauthorize them? Similarly, please advise on open invoices and open A/Rs.
    Thanks
    AriBis

    hello, friend.
    A standard transaction to check open orders is VA05 (or VA05N).
    If you need to check the database tables, sales order status at header level is in table VBUK; for item level, use VBUP. You will find fields such as reference status, delivery status, etc.
    regards.
    jy

  • Data Migration Opening Balances are not updating in the Report F.08

    Hi,
    We have uploaded the GL balances in 2008 for the year 2007, but they are not appearing in the F.08 report when I execute it for 2008.
    Why are they not appearing in the report? In which table can I check the data migration values in the system?
    Thanks
    Kishore

    Hi, we have uploaded all the values in 2008 with 2008 dates; the fiscal year variant is K4.
    The F.08 report for 2008 shows only the 2008 accumulated values, but not the opening balances we uploaded in 2008 for earlier years like 2007.
    Why is it showing like that?
    Thanks
    Kishore
