ODBC to export SAGE Line 100 Legacy Data

Hi guys,
I need an ODBC driver for Sage Line 100 to export legacy data into Excel for migrations using DTW.
The customer can't find the original CDs and I can't wait for them.
Does anyone know where I can get it?
Thanks

Hi Noor
Have a look at the following link:
http://www.pvx.com/downloads/#download-odbc-windows
Kind regards
Peter Juby

Similar Messages

  • GRs created for schedule line of future date

    While making goods receipts using the MIGO transaction, we are seeing that GRs can be done for scheduling agreements having schedule lines with future dates. Is there some way of restricting GRs up to the current date only? Meaning, suppose I have the following schedule lines:
    category of delivery date    delivery date    schedule quantity
    M                            01.2010          100
    M                            02.2010          200
    M                            03.2010          150
    Currently the system allows GRs of quantity = 450 (100 + 200 + 150) in the month of February, 2010.
    But I want that while doing a GR in February, 2010, the system should allow a GR of quantity 300 only (100 + 200). Please let me know how we can do this in the system.
    Regards,
    Pratima.
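    The cumulative check being asked for is simple date arithmetic; here is a minimal sketch in Python with the example figures (hypothetical data — the actual restriction would have to be implemented inside SAP at GR time):

```python
from datetime import date

# Schedule lines from the example: (delivery date, quantity)
schedule_lines = [
    (date(2010, 1, 1), 100),
    (date(2010, 2, 1), 200),
    (date(2010, 3, 1), 150),
]

def allowed_gr_quantity(lines, posting_date):
    """Sum only the schedule quantities whose delivery date falls
    on or before the goods-receipt posting date."""
    return sum(qty for due, qty in lines if due <= posting_date)

# A GR posted in February 2010 is limited to 100 + 200 = 300
print(allowed_gr_quantity(schedule_lines, date(2010, 2, 28)))
```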

    Dear Veer,
    Your answer has helped with the MIGO transaction. Please let me know if something similar is possible for inbound deliveries also. Meaning, as in the MIGO transaction, I should not be able to create inbound deliveries (in transaction VL31N) for schedule lines of future dates/months.
    Regards,
    Pratima.

  • Exporting whole database (10GB) using Data Pump export utility

    Hi,
    I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a single CD/DVD to the system vendor of our application (to analyze a few issues we have).
    When I checked online, a full export is available, but I am not able to understand how it works, as we have never used the Data Pump utility; we use the normal export method. Also, will Data Pump reduce the size of the dump file so it can fit on a DVD, or can we use a parallel full DB export to split the files across several discs? Is that possible?
    Please correct me if I am wrong, and kindly help.
    Thanks for your help in advance.

    You need to create a directory object first:
    sqlplus user/password
    create directory foo as '/path_here';
    grant all on directory foo to public;
    exit;
    Then run your expdp command.
    Data Pump can compress the dump file if you are on 11.1 and have the appropriate options. The reason for specifying FILESIZE is to limit the size of each dump file. If you have 10 GB, are not compressing, and the total dump size is 10 GB, then by specifying 600 MB you will just get 10 GB / 600 MB = 17 dump files of 600 MB each, and you will have to send 17 CDs (probably a few more, since the dump files don't get filled up 100% due to parallelism).
    Data Pump dump files are written by the server, not the client, so the dump files don't get created in the directory where the job is run.
    Dean
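    For what it's worth, Dean's file-count arithmetic can be checked quickly (a sketch; 10 GB taken as 10,000 MB, assuming FILESIZE=600M on the expdp command line):

```python
import math

dump_total_mb = 10_000  # ~10 GB of uncompressed dump data (decimal MB)
filesize_mb = 600       # FILESIZE=600M limit per dump file

# Each dump file holds at most 600 MB, so round up
n_files = math.ceil(dump_total_mb / filesize_mb)
print(n_files)  # 17
```

    Parallel workers can leave individual files partially filled, which is why the estimate says "probably a few more".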

  • Asset-Building legacy data

    Dear all
    There is a scenario for an Asset-Building in legacy data. For the building there are many line items, e.g.:
    1. The building is purchased.
    2. Then there is some addition to this asset (with another date).
    etc.
    Now, how do we upload these assets into the system?
    regards

    Hi,
    Now I'm confused...
    You want to upload the asset data into SAP from your legacy system, correct?
    If so, then you have two options:
    1. Upload the assets into SAP by creating a different asset code for each acquisition made on the same asset in your legacy system, and upload the depreciation separately for each.
    2. Consolidate the values of all the years and upload them as one asset in SAP:
    the total acquisition value and total depreciation for the whole asset.
    Regards,
    Shayam

  • Upload 3 year old legacy data into ECC

    I need to upload legacy data (purchase orders / FI line items from accounting entries) of 2007/8/9 into ECC today through LSMW. Can I do this? Do I need to open the posting periods of 2007/8/9? How do I do that?
    Does anything have to be done using MMPV or MMRV?

    Hi,
    No, this is not possible/recommended. The recommended practice is to upload the trial balance as on the cutover date.
    It means:
    1) If the cutover date coincides with the year end, then only balance sheet items which are open items as on that date.
    2) If the cutover date is other than the year end date, then all open items of balance sheet accounts and the year-to-date balances of profit and loss accounts.
    Best Regards,
    Madhu

  • Creating a report of all the errors that occurred while loading legacy data

    Hi guys,
    I am using a BAPI to load legacy data.
    How can I list all the errors that occur during the transfer?
    I want to see all the errors that occurred and create a report.
    Thanks.

    Hi, look at this code; you will get an idea. Note that a single READ TABLE only picks up the first error, so loop over the return table to collect every error for your report:
    CALL FUNCTION 'BAPI_BUPA_FS_CREATE_FROM_DATA2'
        EXPORTING
    *   BUSINESSPARTNEREXTERN              =
          partnercategory                  = c_2
          partnergroup                     = c_rp
          centraldata                      = wa_centraldata
        IMPORTING
          businesspartner                  = w_partner
       TABLES
          return                           = it_return.
    * Collect and format every error message for the report
      LOOP AT it_return INTO wa_return WHERE type = c_e.
        CALL FUNCTION 'FORMAT_MESSAGE'
          EXPORTING
            id        = wa_return-id
            lang      = sy-langu
            no        = wa_return-number
            v1        = wa_return-message_v1
            v2        = wa_return-message_v2
            v3        = wa_return-message_v3
            v4        = wa_return-message_v4
          IMPORTING
            msg       = wa_return-message
          EXCEPTIONS
            not_found = 1
            OTHERS    = 2.
        IF sy-subrc EQ 0.
          APPEND wa_return TO it_errors.   " it_errors: your report table
        ENDIF.
      ENDLOOP.

  • Is there an app for BC Sage Line 50 integration

    Is there an app for BC Sage Line 50 integration, so visitors can draw down invoices and delivery notes onto their web area?

    I am really confused, not by this stuff but by what you're saying and not reading.
    I am not sure why you're not reading things:
    You seem to think I am disagreeing with you on some things where I am not. You're glossing over the things I am saying, and I think you do not realise a number of things about open platform.
    In terms of SOAP, TheBCMan, you're actually incorrect on the authentication, and just building an app around a username and password creates a big problem in the scope of an app.
    In terms of email and password - this has to be a user. So what you're saying, in terms of an app that may be on the app store etc., is that someone has to create user access or configure the app with their details before it will work. This is only viable for a one-off app or custom solution for one site, and it either requires a user slot for the app or needs one created.
    This option is not really viable if you're creating a true open platform app; it has those problems and is overkill in terms of code access.
    Developer reference
    "To use a site token instead of username/password, send an empty username field and the site token as the password. See example below."
    This works great, and you can see it is right there in the documentation. So there is a token.
    What this means for your app, and the reason this is here, is that you can create an app through SOAP (and again, we've got apps out there doing this! lol) and NOT require a username and password. You do not have to request it in your app: no manual setup, no need to have or ensure a user exists for the app; it works with any user, survives changes to users in the admin, etc.
    This means that the app can be sold to multiple people as a proper product; it's less code and has no user requirements.
    - Liam is confusing a REST API with the SOAP XML API.
    No, I know what REST is and what SOAP is. I have also been talking to BC about open platform and testing it, and for some time. I also helped with feedback on what is most important to turn into REST; BC plan to do so, as I mentioned, from those conversations.
    What you are not reading and understanding is what works in terms of open platform and between an app and third-party server data, which is what I mentioned.
    - Even if you needed an authentication token (which you do not) it still would be possible to use Javascript with said token.
    Yep, as I mentioned. And yes, in terms of viable apps on sale you need to use the authentication token, not the username and password option - that's a bad idea and not a viable app to sell. And it is more straightforward with the authentication token, so you're also overcomplicating things.
    - YOU CAN with Javascript in a BC app (admin area) access the full SOAP XML API. NOTE: You need to put usernames and passwords in the JavaScript so there is that security consideration.
    No need for a username and password, and as I keep saying, which you're ignoring: 1. I am not talking about this in most of the bits I have said, and 2. WE DO THIS; we were the first to have this working in open platform (which has been available to us through alpha as partner advisory board partners).
    - YOU CAN access the API with plain old HTML and JS from the front end of the site (in this case ONLY a 3rd party middle man server needed because I would never put login credentials client side).
    Of course; again, Pretty offers things from email validation to more complex things, BUT to be secure this needs to be done properly with good authentication, like any decent web application and its API access etc. You're handling personal data on BC sites in the CRM etc.; not doing this properly would be lazy and stupid, as I am sure you agree. But based on your comments you could give people the wrong idea about open platform and apps.
    - YOU ARE / I AM NOT breaking any rules or guidelines set out by BC or adobe because you are using plain old Javascript and HTML.
    Not true. I have done lots of things and provided feedback to BC on what I have done, checked with them, and been advised what is valid and what is not. When I was accessing lots of elements of the admin, as I mentioned already here along with other things - this, among other reasons, is why BC introduced apps under a different domain and iframe, so they could not access the parent frame's DOM objects etc.
    Because it is HTML and JavaScript, it is also NOT true that you do not have to follow any guidelines or rules; yes, you do. In HTML apps on other platforms, like Windows, you can use JavaScript and HTML, but there is authentication, methods etc. BC has already introduced this: apps are already sandboxed, and the security features, as I have already mentioned, will be increased. Also, because these are built apps, you cannot just go in and rip people's code. Unlike a web app where the source is available, while you may view an app, if BC or the BC App Store find code ripped from other apps, this breaks that app's copyright and you will be liable.
    It is early days for open platform, but from ownership to security and data access, these things are very much in the scope of apps and will continue to be. Like I said, if you're hacking the Web App API now outside of an app, BC do not want you to do it and will close the loopholes.
    Addition: as another note, you cannot, of course, just make an app and have it work. You have to register the app with BC, you have to declare your development site, and note when the app is live so it can be deployed. Apps also fall under security access limitations. You cannot just deploy an app on a site run by another partner; neither you nor the site owner can validate the app, only the other partner. This is to ensure that it is not dodgy.
    The BC App Store has spent a lot of time and money and has access to APIs via BC to authenticate and deploy apps properly as well. If none of what I was saying were correct, none of this would be in place.
    - Security is my number 1 focus on any project every time. I would not compromise the security of any development work I did for ease of use or a cool idea. I would turn away work before compromising your or my security on a project.
    That is great to hear; I do not expect or consider anything different from someone with your experience. You should fully understand that, despite it being JavaScript and HTML (or because of it, in other regards), BC are working to introduce increased security to open platform, as they already have from the first iterations. I am not sure why you're arguing with me on this if this is how you feel?
    I am still not sure what you're having a go at me for.

  • Please send detailed steps for uploading legacy data

    Hi friends,
    Please send detailed steps for uploading legacy data.
    Thanking u in advance,
    Diwa.

    Hi, you can use LSMW to upload legacy data.
    LSMW is used for migrating data from a legacy system to SAP system, or from one SAP system to another.
    Apart from standard batch/direct input and recordings, BAPI and IDocs are available as additional import methods for processing the legacy data.
    The LSMW comprises the following main steps:
    Read data (legacy data in spreadsheet tables and/or sequential files).
    Convert data (from the source into the target format).
    Import data (to the database used by the R/3 application).
    But, before these steps, you need to perform following steps :
    Define source structure : structure of data in the source file.
    Define target structure : structure of SAP that receives data.
    Field mapping: Mapping between the source and target structure with conversions, if any.
    Specify file: location of the source file
    Of all the methods used for data migration like BDC, LSMW , Call Transaction which one is used most of the time?
    How is the decision made which method should be followed? What is the procedure followed for this analysis?
    All three methods are used to migrate data. The selection depends on the scenario and the amount of data to transfer. LSMW is a ready-made tool provided by SAP, and you follow some 17 steps to migrate master data. With BDCs, the session method is the better choice because of some advantages over call transaction, but call transaction is also very useful for immediately updating small amounts of data (with call transaction the developer has to handle errors).
    So, bottom line: choose between these methods based on the real-time requirements.
    These methods are chosen completely based on the situation you are in. The direct input method is not available for every scenario; where it is, it is the simplest one. With the batch input method, you need to record the transaction concerned. Similarly, IDoc and BAPI are available, and their use needs to be decided based on the requirement.
    Try to go through the some material on these four methods, and implement them.  You will then have a fair idea about when to use which.
    LSMW Steps For Data Migration
    How to develop a lsmw for data migration for va01 or xk01 transaction?
    You can create lsmw for data migration as follows (using session method):
    Example for xk01 (create vendor)
    Initially there will be 20 steps, but after processing the first step it will be reduced to 14 for the session method.
    1. TCode : LSMW.
    2. Enter Project name, sub project name and object name.
        Execute.
    3. Maintain object attributes.
        Execute
        select Batch Input recording
        goto->Recording overview
        create
        recording name.
        enter transaction code.
        start recording
        do recording as per your choice.
        save + back.
        enter recording name in lsmw screen.
        save + back
    Now there will be 14 steps.
    2. MAINTAIN SOURCE STRUCTURES.
        Here you have  to enter the name of internal table.
        display change
        create
        save + back
    3. MAINTAIN SOURCE FIELDS.
        display change
        select structure
        source_fields->copy fields.
        a dialogue window will appear.
        select -> from data file
        apply source fields
        enter No. of fields
        length of fields
        attach file
        save + back
    4. MAINTAIN STRUCTURE RELATIONS
        display change
        save + back
    5. MAINTAN FIELD MAPPING & CONVERSION RULE
        display change
        click on source field, select exact field from structue and enter
        repeat these steps for all fields.
        save+back
    6. MAINTAIN FIXED VALUES, TRANSACTION, USER DEFINED
        execute
        save + back
    7. SPECIFY FILES.
        display change
        click on legacy data
        attach flat file
        give description
        select tabulator
        enter
        save + back
    8. ASSIGN FILE
        execute
        display  change
        save + back
    9. IMPORT DATA.
        execute
        display  change
        save + back
    10. DISPLAY IMPORTED DATA
          enter OK; it will show the records only.
          back
    11. CONVERT DATA
          execute
          display  change
          save + back
    12. DISPLAY CONVERTED DATA
          execute
          display  change
          save + back
    13. CREATE BATCH INPUT SESSION
          tick keep batch input folder
          F8
          back
    14. RUN BATCH INPUT SESSION.
          sm35 will come
          Object name will be shown here
          select object & process
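    The flat file attached in step 7 (Specify Files) is just delimited text. A minimal sketch of generating a tab-delimited source file follows; the field names (LIFNR, NAME1, ORT01) and values are only illustrative and must match the source structure you defined in step 3:

```python
import csv

# Hypothetical vendor records matching the LSMW source structure
vendors = [
    {"LIFNR": "VEND001", "NAME1": "Acme Supplies", "ORT01": "Mumbai"},
    {"LIFNR": "VEND002", "NAME1": "Global Traders", "ORT01": "Delhi"},
]

with open("vendors.txt", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["LIFNR", "NAME1", "ORT01"], delimiter="\t"
    )
    writer.writeheader()  # tell LSMW whether a header row exists
    writer.writerows(vendors)
```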

  • How to migrate legacy data from XI 2.0 to 3.0 ?!

    Hi,
    Last year we migrated from XI 2.0 to 3.0. We set up a totally new system and I reimplemented the scenarios (to correct some unhappy solutions of the past).
    Now we want to delete the old XI system, but first we have to save the legacy data.
    Is there a way to bring the old data over to XI 3.0?
    Thank you for your help
    Thomas

    Wow, no standard way, I guess.
    But I think that with about five days of ABAP coding you could have it done.
    Basically, messages are all stored in ABAP-stack tables and fully managed by ABAP classes. Take a look (with SE24) at CL_XMS_MAIN and CL_XMS_PERSIST. You should be able to write an ABAP program in your XI 2.0 box that recursively reads all the messages you want (copy the RSXMB_SELECT_MESSAGES report, which is very good for selection, and comment out the call to screen 100; you'll have the selected messages in an internal table), serialize them (there should be a method for that; I can't find it now), and send them to XI 3.0 via RFC to a function module that deserializes them, creates a new message object, and commits it to the DB.
    That's the best (and creative) way I can think of.
    But I would first try what Stefan tells.
    Good luck.
    Alex
    Message was edited by: Alessandro Guarneri

  • Legacy data into new UCM Solution

    We are implementing a new UCM solution for one of my clients.
    The client has lots of legacy data (>2 TB) and wants to move the complete data set into the new UCM solution.
    My question is: what is the way to convert all of this data into the UCM Content Server?

    I assume that you are aware of the productised UCM-Siebel integration. Are you wanting to load legacy data and make it available through this? Even if you are, the approach is much the same.
    OK, so if the data is on a file system, then what you need to do is work out how you can derive metadata from the directory structure, OR the Siebel DB, OR the content itself - those are in order of difficulty and time taken, I would say.
    The simplest thing to do is to use this information to create batch loader scripts. Why not try creating some test scripts where you load in 100 items or similar and see if the results/performance are acceptable?
    It will take a long time to load 2 TB of data - how many separate files is this? You may need to devise a schedule for doing this in a number of smaller migration runs.
    Tim
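    A sketch of what deriving metadata and generating a batch loader control file could look like, assuming the Content Server batch loader's name=value record format terminated by <<EOD>>; the metadata fields and values below are illustrative only:

```python
import os

def batchloader_record(path, security_group="Public", doc_type="Document"):
    """Build one batch loader 'insert' record for a file, deriving the
    content ID and title from the file name (a crude metadata rule)."""
    name = os.path.splitext(os.path.basename(path))[0]
    return "\n".join([
        "Action=insert",
        f"dDocName={name}",
        f"dDocTitle={name}",
        f"dDocType={doc_type}",
        "dDocAuthor=sysadmin",
        f"dSecurityGroup={security_group}",
        f"primaryFile={path}",
        "<<EOD>>",
    ])

# One record per legacy file; in practice you would walk the 2 TB tree
# in batches (os.walk) and write one control file per migration run.
files = ["/legacy/contracts/c1.pdf", "/legacy/contracts/c2.pdf"]
control_file = "\n".join(batchloader_record(f) for f in files)
print(control_file)
```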

  • Transaction type for the legacy data for Auc

    Dear Experts,
    When I use AS91 to load the legacy data for an AuC, the system issues the error 'Trans. type 900 can only be used for legacy assets from prev. years'.
    The asset value date is '2011/04/05', the transaction type is 900, and the asset transfer date is '2011/04/25'.
    Do I have to use this transaction type for the AuC legacy data input? Can I choose another transaction type, like 100?
    Br
    Sophie

    Hi
    100 is used for uploading non-AuC assets.
    900 is used for AuC. As per the SAP recommendation, 01/01/20xx is to be used as the date when uploading AuC; I've misplaced the SAP note which says so.
    Normally, changing the asset value date impacts the depreciation calculation. Since an AuC has the 0000 depreciation key, it does not impact the depreciation calculation.
    br, Ajay M

  • Asset depreciation calculation for previous year when legacy data is transferred

    Hello,
    The asset legacy data transfer was made as on 31.12.2006, so the asset depreciation calculation for the first period of 2007 was done wrongly, but it was not noticed at that point in time. Now, after two years, the cumulative difference has become very big.
    For period 01, 2007, in the Asset Values tab of AS03, depreciation area 01 currently shows "Amt to be posted" (14.48); on the next line, for depreciation area 20, it shows "Amt to be posted" (-0.28). Hence, for the next period only the difference amount is posted on the G/L side, instead of the actual 14.76 that should be posted.
    So, because of this error, my G/L and AA sides do not match.
    Even though I post a manual reversal entry via FB01, when I run ABST2 there will be a problem.
    If I run manual unplanned depreciation, will my problem of the asset side and GL side not matching be solved?
    Also, please provide suggestions from your end to solve this issue.
    Thanking you for your help.
    Jigs.

    We cannot post any depreciation to previous fiscal year 2008 as no planned depreciation is envisaged for the previous year 2008. The transfer date for Legacy asset data transfer is maintained as 12/31/2008 and takeover values are uploaded accordingly. Hence, 2009 is a first year for Asset Accounting of the company code.
    However, the depreciation posting program checks for the previous period's log while executing for the current period. The logs are stored in the "TABA" table. For a new company code there will not be any entries in the TABA table.
    Due to missing functionality, System is checking for previous year depreciation posting logs even though the current fiscal year is first year for planned depreciation and giving error message AA 687.
    As per SAP Note 144441 this error can be rectified with the help of correction program ZACORR_TABA_ENTRY_CREATE.
    Steps to Rectify the Error:
    1. Copy correction program from SAP Note 144441 into SAP System.
    2. Execute the program with transaction code SA38
    3. Check the values in test mode
    4. If the table is updated with the document number, remove the test-mode tick and execute again.
    5. Execute the depreciation run.
    Edited by: Ramesh Reddy Nalamada on Jul 25, 2009 1:17 AM

  • How to import legacy data into apex tables

    Hi All,
    Please tell me how to import legacy data into APEX tables.
    Thanks&Regards,
    Raghu

    SQL Workshop > Utilities > Data Workshop...
    You can import data that has already been exported as text/CSV/XML.
    Note: the table name and column names should match if you are loading into an existing table.

  • Net Book Value calculation Issue Manual Legacy Data Transfer

    Dear All
    I want to upload the balances of old fixed assets, managed manually in previous years, into a newly configured system. I want to post their current written-down values and current accumulated depreciation. I go via IMG > Create Legacy Data Transfer and enter the takeover values of the assets, for example:
    Acquisition value: 100,000
    Accumulated ordinary depreciation (old depreciation from previous years): Rs. 10,000
    The system shows a net book value of Rs. 110,000
    and calculates the depreciation on 110,000 at 10% WDV, which is 11,000.
    My requirement is that it should calculate it as 100,000 - 10,000:
    depreciation should be calculated on 90,000 @ 10% WDV, which is 9,000.
    Please suggest how I can do this,
    or another path or t-code where I can do it.
    I am new, so please explain in detail.
    Regards
    Khalid Mahmood

    Dear Atif and AP,
    The issue is not with the depreciation key; that is working well according to my requirement. The issue is that the base value the system uses is wrong.
    I want the system to calculate depreciation on the cumulative acquisition value (purchase/capitalized cost price) less accumulated depreciation,
    which equals the written-down value.
    I want the system to use the WDV as the base in each year after deducting that year's depreciation, i.e. (acquisition cost - accumulated depreciation).
    When I post a new asset and assign the depreciation key, it works as required.
    But I am posting an old asset, for example Rs. 100,000 acquired in 2006, whose accumulated depreciation is, say, Rs. 30,000 at year end 2011.
    I want to post that asset in 2012 as 100,000 - 30,000, i.e. a WDV of Rs. 70,000,
    and I want the system to use 70,000 as the base in the coming year to calculate depreciation.
    Instead, my result is 100,000 + 30,000, and the system uses 130,000 as the base to calculate the ordinary depreciation for the next year.
    Please read the first thread again and guide me according to my scenario,
    or tell me how to post the old asset values and their depreciation.
    Regards
    Khalid Mahmood
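    The written-down-value arithmetic being asked for can be sketched like this (figures from the post; the actual fix lies in how the takeover values and the depreciation key's base value are configured in SAP, which this snippet does not model):

```python
def wdv_depreciation(acquisition, accumulated_dep, rate, years=1):
    """Yearly WDV depreciation: each year's charge is a fixed rate
    applied to (acquisition cost - accumulated depreciation)."""
    charges = []
    for _ in range(years):
        base = acquisition - accumulated_dep  # written-down value
        charge = base * rate
        charges.append(charge)
        accumulated_dep += charge             # base shrinks every year
    return charges

# Takeover case: cost 100,000, old depreciation 10,000, 10% WDV
# -> first year's charge should be 90,000 * 10% = 9,000
print(wdv_depreciation(100_000, 10_000, 0.10, years=2))
```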

  • Asset Legacy data transfer reconciliation with GL

    We have recently merged a deactivated company code with our operating company and transferred all finance processes, open items, master data, etc.
    We transferred assets using Legacy Data Transfer AB91 (we did not use intercompany asset transfer ABT1N).
    We then transferred the Trial Balance.
    The asset balances reconcile against the GL accounts for the transferred assets when executing S_ALR_87011964 and FS10N.
    However there is a difference in ABST2 which is the same difference as the migrated assets.
    I would expect there to be no difference between FI and AA for a legacy data transfer.
    I have also run ABST for all affected GL accounts and there is no difference when executing this report.
    Note that the company code migration occurred in January 2008, there is no effect on 2007 year end data.

    Hi
    You have to take over the GL balance using transaction OASV.
    regards
    Sibichan
