SAP FICO versus legacy financials

Hi All,
What is the advantage of implementing SAP FICO over a legacy financial system?
What are the business scenarios that would benefit from an SAP FICO implementation?
I would appreciate it if anybody could provide a brief explanation, with one or two scenarios that justify the changeover from legacy to SAP FICO.
Thanks
Dev

Hi,
Agree with Dev. System integration is one of the big advantages.
SAP FICO will provide you with the most comprehensive global financial management solution available today. With SAP FICO, you can achieve:
<b>Improved corporate performance</b>
It gives you the ability to quickly read, evaluate, and respond to changing business conditions with effective business strategies.
<b>Faster closes</b>
You can streamline consolidation, process scheduling, workflow, and collaboration.
<b>More effective corporate governance and better transparency</b>
It provides broader support of accounting standards, federal regulations, and improved administration of internal controls.
<b>Shorter days sales outstanding</b>
It automates dispute, credit, and collections management -- and offers electronic invoicing and payment capabilities that supplement traditional accounts receivable and accounts payable functions.
<b>Greater ability to manage cash globally</b>
It lets you report, analyze, and allocate cash in real time, and establish in-house banks or payment centers.
<b>Improved financial and managerial reporting</b>
It gives you the flexibility to report performance by business unit, organization, or cost center.
<b>Improved process integration between finance and treasury</b>
You can integrate risk and treasury transactions with core accounting and financial reporting processes.
<b>More competitive costs of finance</b>
It helps you innovate processes, collaborate with supply chain partners, and establish global shared-service operations.
In addition, SAP FICO also addresses a broad range of compliance and governance issues, including:
- Segregation of duties analysis
- Sarbanes-Oxley compliance
- IFRS compliance
- Basel II compliance
Hope this will help.
Regards,
Ferry Lianto

Similar Messages

  • Clean install from a Lenovo installation DVD: UEFI vs Legacy + ThinkPad Setup according to the Lenovo User Guide

    Machine: T520 bought Feb. 2012
    O.S.: Windows 7 Pro 64-bit
    I want to wipe the hard drive, buy a Lenovo Windows 7 Pro installation DVD, and do a completely clean installation.
    The Lenovo User Guide (.pdf), Ch. 8, Advanced Configuration, "Installing Windows 7", p. 295 (212 of 298) says:
    Start ThinkPad Setup, then choose UEFI versus Legacy, then insert the installation DVD.
    This obviously does not apply when the hard drive has already been wiped (necessary to eliminate 100% of all traces of irreversible damage caused by a virus, including the re-installation partition (Q: drive)).
    When an installation DVD is purchased from Lenovo, is not the first step to insert the DVD?! Then presumably there is an installation wizard resulting in a restored O.S.?!
    If not, please tell me what surprises are in store, and where I can find a reliable written resource to consult!
    Thank-you.

    You're over-thinking this and making it harder than it needs to be.  The manual isn't helping...
    The BIOS is stored in non-volatile memory on the motherboard.  One would probably call it firmware, not software.  It is there regardless of what's on the HDD - or whether there's even an HDD installed.
    Turn your machine off.  Turn it on again.  When you see the logo screen, hit F1 - multiple times if necessary.  That gets you into the motherboard's BIOS configuration utility - what the manual is calling "Setup".
    Here's a snip below from the T520 BIOS configuration utility (actually from the on-line BIOS simulator).  In the StartUp tab you can see where UEFI/Legacy can be selected.  You can configure UEFI boot there.  Or "both".  Or - probably - just select Setup Defaults, save, and exit.
    Z.

  • Differences between Hyperion Consolidation and Oracle Financials Consolidation

    I'm looking for a document, recommendations, etc. explaining when to use Hyperion Consolidation versus Oracle Financials Consolidation.
    Thanks
    Marcel

    Genie,
    I agree that a case is made on how well these two ERPs handle the day-to-day tasks before a company chooses one of the two. But my question is aimed at finance to begin with. You are going to need a general ledger for any company or government regardless of ERP, and how well you can drive the ledger to map your business is the key here.
    The way the ledger is built, around a business area, company, or country, is very tightly designed in SAP. I would like to know if there is any equivalent of the configuration items in ORACLE. I am mainly interested in the terminology of ORACLE.
    For example ,
    <b>SAP ====> ORACLE</b>
    GL ==> Book in ORACLE
    Document Number ===> Invoice Number
    Posting Period in SAP ==> Posting Period in ORACLE
    Most of the terms are finance terms, so they are common between the two systems (e.g. an account number is an account number in both ORACLE and SAP). I am interested in anything specific that is available in ORACLE but not in SAP, and vice versa.
    Thanks for the link you provided.
    It's very useful.

  • MSI R9 280 possibly dead on arrival?

    Chief complaint: no video signal from the card's DVI/HDMI. After installing Catalyst, Windows blue-screens at login until the MSI R9 280 card is taken out of the PCIe slot.
    New build with an MSI R9 280, a Gigabyte GA-B85M-D3H mobo (compatible with my card), an XFX TS 550W Gold PSU, a Core i5-4570, and Windows 8.1.
    The onboard Intel integrated graphics output works perfectly fine. When the MSI card is installed its fan runs, and if I change the BIOS to direct video to the Intel graphics, I can use the onboard HDMI port for video and still boot up. The Catalyst installer is then able to detect my card and install. At this point I plugged the HDMI into the video card before the reboot; still no video. After I rebooted, I still had no video (even after switching BIOS settings) and switched the HDMI back to the Intel gfx. Now when I boot Windows up, I get a bluescreen error.
    I suspect I have both a hardware issue with the MSI card and possible driver issues that locked windows out of loading completely.
    I have changed the Gigabyte mobo BIOS around to try out different settings without any luck (changed compatibility CSM on/off, prioritized legacy or UEFI and vice versa, disabled fast boot).
    Please, if anyone has any thoughts on what to do, I VERY MUCH APPRECIATE YOUR EXPERTISE. I want to see if I can make this work without RMA'ing it to Newegg or MSI.
    Michael
    **Also, what is the advantage of flashing your vBIOS to UEFI rather than legacy, OUTSIDE of the protection from outside attacks? I don't fully understand this, as I've never used Windows 8 before. I'm trying to read up but it's a bit tough.

    OK, so it could be one of two things:
    1: The VBIOS being used on the card is wrong. The card has a two-position switch near the CrossFire connectors: one position is a GOP VBIOS for Windows 8/8.1 and the other is legacy, so you could try the other position (move it with the system shut down!).
    2: Your motherboard's BIOS. See if Gigabyte has an updated one; with newer cards you usually need to update it to get them to work correctly, so that's worth a try!
    As for GOP/UEFI VBIOS versus Legacy: a GOP/UEFI VBIOS is used for Windows 8 fast boot; a legacy one may hold back loading when the computer starts. A legacy VBIOS is usually the one supplied on cards, since the card may be put into a motherboard that does not support GOP!
    Let me know if either of the two suggestions above works!

  • VoIP legacy versus AVVID VOIP

    Ben,
    At present I'm preparing to give a presentation on the differences between VoIP over a legacy PBX (Nortel Opt-11) and an AVVID CCM VoIP topology. I've worked a lot on VoIP and VoFR via legacy PBXs and have fair knowledge of AVVID. I need your expert advice, or a pointer to some documentation to research.
    Thanks,
    Allen L.
    Network Engineer III

    Here is some of the documentation you asked for:
    http://www.cisco.com/en/US/products/sw/custcosw/ps1001/products_white_paper09186a0080145588.shtml
    http://www.cisco.com/en/US/about/ac123/ac114/ac173/ac222/about_cisco_packet_small_and_midsized_business09186a0080158b0f.html

  • Integration of Financials with external systems

    Hi,
    I am struggling with an implementation where the client is not sure if Oracle Financials suits his business processes.
    Overview of situation on Hand:
    We do not have the product installed yet at the client site.
    Products of Oracle Financials to be used: General Ledger, Accounts Receivable, Accounts Payable, Fixed Assets, India Localization Patch
    Products of Oracle Applications NOT available for use:
    Purchasing, Inventory, Order Management
    (All these areas are being covered by a custom-developed bespoke system.)
    1. Is it possible to use Oracle India Localizations (with regard to the excise functionality, e.g. claiming of MODVAT and the various excise registers that must be maintained, e.g. RG23A, RG23C, etc.) in the above situation (without implementing Purchasing, Inventory, and Order Management)?
    2. Further, while passing the vendor's bills (in Oracle Payables), one of the criteria for PO matching is to check whether the MODVAT has been claimed. Is this functionality available in Payables with the India Localization patch?
    3. Does Oracle India Localization cater for VAT requirements?
    4. Is an open interface available to transfer purchase order data from external systems into the Oracle Purchasing tables (which are shared by Oracle Payables), namely PO_HEADERS, PO_LINES, PO_LINE_LOCATIONS, PO_DISTRIBUTIONS, PO_DISTRIBUTIONS_AP_V (a view of PO_DISTRIBUTIONS), PO_RELEASES (blanket purchase orders), and PO_LOOKUP_CODES, at transactional frequency? And if the Oracle Purchasing module is not being used, can the interface tables of Purchasing still be used?
    5. An open interface (Payables Open Interface) is available in Oracle Payables to import invoices from external systems. While importing these invoices, does the system expect the purchase order data to be present in the PO tables mentioned in the point above?
    6. Is an open interface available to transfer quantity received/accepted data from external systems into the PO_LINE_LOCATIONS table, to enable 2/3/4-way matching of purchase orders with invoices? Can the 4-way matching be carried out in AP by just importing purchase order data?
    7. Can the Credit Card Transaction Interface be used for uploading employee expense/advance settlements (not carried out via credit card) directly from a feeder system?
    8. Is it possible to use the Open Item Interface (including the import concurrent program) even though the Inventory module is not being installed? If yes, we would like to use this interface for updating the item master from the bespoke system.
    9. Can the AutoInvoice API be used to import invoices from a feeder/legacy system (via the RA interface tables) into the Oracle Receivables invoice tables? Is an order number column a prerequisite for successful completion of AutoInvoice?
    10. Where should the masters (e.g. employee master, inventory item master) be kept: in OF or in the bespoke system?
    11. What is the best strategy for keeping the master data in the bespoke system and OF in sync?
    I have got various answers to these questions, but some seem to contradict each other.
    PLEASE HELP!!
    Thanks,
    Kamana

    Dear Kamana,
    Can you send me the replies given by our other forum friends? Let me analyze the entire lot and get back to you with a single consolidated bible for all your questions.
    Gopal

  • ASCII character/string processing and performance - char[] versus String?

    Hello everyone
    I am a relative novice to Java; I have a procedural C programming background.
    I am reading many very large (many GB) comma/double-quote separated ASCII CSV text files and performing various kinds of pre-processing on them, prior to loading into the database.
    I am using Java 7 (the latest) and NIO.2.
    The IO performance is fine.
    My question is regarding the performance of char[] arrays versus the String and StringBuilder classes with their charAt() methods.
    I read a file one line/record at a time and then process it. Regex is not an option (too slow, and it cannot handle all the cases I need to cover).
    I noticed that accessing a single character of a given String (or StringBuilder) using the String.charAt(i) method is several times (5+ times?) slower than indexing into a char array.
    My question: is this a correct observation regarding the charAt() versus char[i] performance difference, or am I doing something wrong with the String class?
    What is the best way (performance-wise) to process character strings in Java if I need to process them one character at a time?
    Is there another approach that I should consider?
    Many thanks in advance
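    Here is a minimal sketch of the kind of measurement I am doing (simplified for illustration; the class and variable names are made up, and a proper harness such as JMH would give more trustworthy numbers):
    public class CharAccessBench {
        public static void main(String[] args) {
            // Build a test line similar to my CSV records (Java 7, so no String.repeat).
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 1000; i++) {
                sb.append("field1,\"field,2\",field3,field4");
            }
            String line = sb.toString();
            char[] chars = line.toCharArray();
            int len = line.length();              // hoisted local variable

            long t0 = System.nanoTime();
            long sum1 = 0;
            for (int rep = 0; rep < 1000; rep++) {
                for (int i = 0; i < len; i++) {
                    sum1 += line.charAt(i);       // accessor call per character
                }
            }
            long t1 = System.nanoTime();
            long sum2 = 0;
            for (int rep = 0; rep < 1000; rep++) {
                for (int i = 0; i < len; i++) {
                    sum2 += chars[i];             // direct array indexing
                }
            }
            long t2 = System.nanoTime();
            // Printing the checksums keeps the JIT from eliminating the loops.
            System.out.printf("charAt: %d ms, char[]: %d ms (sums %d/%d)%n",
                    (t1 - t0) / 1000000, (t2 - t1) / 1000000, sum1, sum2);
        }
    }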

    >
    Once I took that String.length() method out of the 'for loop' and used an integer length local variable, as you have in your code, the performance is very close between the array-of-char and String charAt() approaches.
    >
    You are still worrying about something that is irrelevant in the greater scheme of things.
    It doesn't matter how fast the CPU processing of the data is if it is already faster than you can write the data to the sink. The process is:
    1. read data into memory
    2. manipulate that data
    3. write data to a sink (database, file, network)
    The reading and writing of the data are going to be tens of thousands of times slower than any CPU processing you will be doing. That read/write part of the process is the limiting factor of your throughput, not the CPU manipulation of step #2.
    Step #2 can only go as fast as steps #1 and #3 permit.
    Like I said above:
    >
    The best 'file to database' performance you could hope to achieve would be loading simple, 'known to be clean' records of a file into ONE table column defined, perhaps, as VARCHAR2(1000); that is, with NO processing of the record at all to determine column boundaries.
    That performance would be the standard you would measure all others against and would typically be in the hundreds of thousands or millions of records per minute.
    What you would find is that you can perform one heck of a lot of processing on each record without slowing that 'read and load' process down at all.
    >
    Regardless of the sink (DB, file, network) when you are designing data transport services you need to identify the 'slowest' parts. Those are the 'weak links' in the data chain. Once you have identified and tuned those parts the performance of any other step merely needs to be 'slightly' better to avoid becoming a bottleneck.
    That CPU part, step #2, is only rarely, if ever, the problem. Don't even consider it for specialized tuning until you demonstrate that it is needed.
    Besides, if your code is properly designed and modularized you should be able to 'plug n play' different parse and transform components after the framework is complete and in the performance test stage.
    >
    The only thing that is fixed is that all input files are ASCII (not Unicode) characters in the range of 'space' to '~' (decimal 32-126), or common control characters like CR, LF, etc.
    >
    Then you could use byte arrays and byte processing to determine the record boundaries even if you then use String processing for the rest of the manipulation.
    That is what my framework does. You define the character set of the file and a 'set' of allowable record delimiters as Strings in that character set. There can be multiple possible record delimiters, and each one can be multi-character (e.g. you can use 'XyZ' if you want).
    The delimiter set is converted to byte arrays, and the file is read using RandomAccessFile with double-buffering and multiple mark/reset functionality. The buffers are then searched for one of the delimiter byte arrays, and the location of the delimiter is saved. The resulting byte array is then saved as a 'physical record'.
    Those 'physical records' are then processed to create 'logical records'. The distinction is due to possible embedded record delimiters, as you mentioned. One logical record might appear as two physical records if a field has an embedded record delimiter. That is resolved easily, since each logical record in the file MUST have the same number of fields.
    So a record with an embedded delimiter will have fewer fields than required, meaning it needs to be combined with one or more of the following records.
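    To give you the flavor, here is a stripped-down sketch of that delimiter scan (illustrative names only; the real framework adds the double-buffering, mark/reset, and physical-to-logical record combination described above):
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    // Splits a buffer into 'physical records' on any of a set of
    // multi-byte delimiters. Simplified: the whole buffer is in memory.
    public class DelimiterScanner {
        static List<byte[]> physicalRecords(byte[] buf, byte[][] delimiters) {
            List<byte[]> records = new ArrayList<byte[]>();
            int start = 0;
            int i = 0;
            while (i < buf.length) {
                int matched = matchLength(buf, i, delimiters);
                if (matched > 0) {
                    records.add(Arrays.copyOfRange(buf, start, i));
                    i += matched;          // skip past the delimiter
                    start = i;
                } else {
                    i++;
                }
            }
            if (start < buf.length) {      // trailing record with no delimiter
                records.add(Arrays.copyOfRange(buf, start, buf.length));
            }
            return records;
        }

        // Returns the length of the delimiter matching at pos, or 0 if none.
        static int matchLength(byte[] buf, int pos, byte[][] delimiters) {
            outer:
            for (byte[] d : delimiters) {
                if (pos + d.length > buf.length) continue;
                for (int j = 0; j < d.length; j++) {
                    if (buf[pos + j] != d[j]) continue outer;
                }
                return d.length;
            }
            return 0;
        }
    }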
    >
    My files have no metadata, some are comma delimited and some comma and double quote delimited together, to protect the embedded commas inside columns.
    >
    I didn't mean the files themselves needed to contain metadata. I just meant that YOU need to know what metadata to use. For example, you need to know that there should ultimately be 10 fields for each record. The file itself may have fewer physical fields due to TRAILING NULLCOLS, whereby consecutive NULL fields at the end of a record do not need to be present.
    >
    The number of columns in a file is variable and each line in any one file can have a different number of columns. Ragged columns.
    There may be repeated null columns in any line, like ,,, or "","","" or any combination of the above.
    There may also be spaces between delimiters.
    The files may be UNIX/Linux terminated or Windows Server terminated (CR/LF or CR or LF).
    >
    All of those are basic requirements and none of them present any real issue or problem.
    >
    To make it even harder, there may be embedded LF characters inside the double quoted columns too, which need to be caught and weeded out.
    >
    That only makes it 'harder' in the sense that virtually NONE of the standard software available for processing delimited files takes that into account. There have been some attempts (you can find them on the net) at using various 'escaping' techniques to escape those characters where they occur, but none of them ever caught on and I have never found any in widespread use.
    The main reason for that is that the software used to create the files in the first place isn't written to ADD the escape characters; it is written on the assumption that they won't be needed.
    The read/write for 'escaped' files has to be done in pairs: you need a writer that can write escapes and a matching reader to read them.
    Even the latest versions of Informatica and DataStage cannot export a simple one-column table that contains an embedded record delimiter and read it back properly. Those tools simply have NO functionality to let you even TRY to detect that embedded delimiters exist, let alone do anything about it by escaping those characters. I gave up back in the '90s trying to convince the Informatica folks to add that functionality to their tool. It would be simple to do.
    >
    Some numeric columns will also need processing to handle currency signs and numeric formats that are not valid for the database input.
    It does not feel like a job for regex (I want to be able to maintain the code, and complex regex is often 'write-only' code that a 9200 bps modem would be proud of!), and I don't think PL/SQL will be any faster or easier than Java for this sort of character-based work.
    >
    Actually, 'validating' that a string of characters conforms (or not) to a particular format is an excellent application of regular expressions. Though, as you suggest, the actual parsing of a valid string to extract the data is not well suited to regex; that is more appropriate for a custom format class that implements the proper business rules.
    You are correct that PL/SQL is NOT the language to use for such string parsing. However, Oracle does support Java stored procedures, so the work could be done in the database. I would only recommend pursuing that approach if you already needed to perform some substantial data validation or processing in the DB to begin with.
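    For example, a quick validation check along these lines (the pattern itself is illustrative; you would tailor it to the currency formats your files actually contain):
    import java.util.regex.Pattern;

    public class CurrencyCheck {
        // Validates (does not parse) a currency-formatted field, e.g. "$1,234.56".
        private static final Pattern CURRENCY =
                Pattern.compile("-?\\$?\\d{1,3}(,\\d{3})*(\\.\\d{2})?");

        public static void main(String[] args) {
            System.out.println(CURRENCY.matcher("$1,234.56").matches()); // true
            System.out.println(CURRENCY.matcher("1,23.4").matches());    // false
        }
    }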
    >
    I have no control over the format of the incoming files; they are coming from all sorts of legacy systems, many from IBM mainframes or AS/400 series, for example. Others from Solaris and Windows.
    >
    Not a problem. You just need to know what the format is so you can parse it properly.
    >
    Some files will be small, some many GB in size.
    >
    Not really relevant, except as it relates to the need to SINK the data at some point. The larger the amount of SOURCE data, the sooner you need to SINK it to make room for the rest.
    Unfortunately, the very nature of delimited data with varying record lengths and possible embedded delimiters means that you can't really chunk the file to support parallel read operations effectively.
    You need to focus on designing the proper architecture to create a modular framework of readers, writers, parsers, formatters, etc. Your concern with details like String versus array is way premature at best.
    My framework has been doing what you are proposing and has been in use for over 20 years by three different major international clients. I have never had any issues at the level of detail you have asked about in this thread.
    Throughput is limited by the performance of the SOURCE and the SINK. The processing in between has NEVER been an issue.
    A modular framework allows you to fine-tune or even replace a component at any time with just 'plug n play'. That is what interfaces are all about. Any code you write for a parser should be based on an interface contract. That allows you to write the initial code using the simplest possible method and then later, if and ONLY if a particular module becomes a bottleneck, replace that module with one that is more performant.
    Your initial code should ONLY use standard, well-established constructs until there is a demonstrated need for something else. For your use case that means String processing, not byte arrays (except for detecting record boundaries).
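    As a sketch of the kind of interface contract I mean (the names are illustrative, not taken from my actual framework):
    // A parser contract that allows 'plug n play': start with a simple
    // String-based implementation and swap in a tuned one later, ONLY if
    // profiling shows this module is the bottleneck.
    public interface RecordParser {
        String[] parse(String logicalRecord);
    }

    class SimpleCsvParser implements RecordParser {
        private final char delimiter;

        SimpleCsvParser(char delimiter) { this.delimiter = delimiter; }

        public String[] parse(String logicalRecord) {
            java.util.List<String> fields = new java.util.ArrayList<String>();
            StringBuilder field = new StringBuilder();
            boolean inQuotes = false;
            for (int i = 0; i < logicalRecord.length(); i++) {
                char c = logicalRecord.charAt(i);
                if (c == '"') {
                    inQuotes = !inQuotes;              // naive quote handling
                } else if (c == delimiter && !inQuotes) {
                    fields.add(field.toString());
                    field.setLength(0);
                } else {
                    field.append(c);
                }
            }
            fields.add(field.toString());              // last field
            return fields.toArray(new String[fields.size()]);
        }
    }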

  • Oracle GL versus Hyperion

    I need any and all the information I can get to understand the differences and the advantages of using Oracle versus Hyperion
    (Oracle Financials R11 GL consolidations versus Hyperion Reports).

    Hi John,
    Thanks for the prompt reply.
    I am using Hyperion 9.3.1 with Oracle 10g. This is the first time I am trying to do the integration with Oracle GL. My client doesn't have licenses for DIM, and they are not ready to buy them.
    Is there any way to load the Essbase cubes through Oracle tables?
    John, this is my idea (can you please advise me on it?). I am doing this in a test environment.
    1) From the Oracle GL server, I take the Trial Balance values into flat files.
    2) I create one Oracle table, TB_ORAGL (on the Planning server).
    3) On the Planning server, I import the flat files into the Oracle table (TB_ORAGL).
    4) From the Oracle table (TB_ORAGL), I import the data into the Essbase cubes using a data load.
    The vice-versa steps apply for Hyperion Planning to Oracle GL.
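    For step 3, a rough JDBC sketch of what I have in mind (the connection string, file layout, and column names are just illustrative guesses):
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // Loads a comma-separated Trial Balance flat file into the TB_ORAGL
    // staging table. Assumes a simple ACCOUNT,PERIOD,BALANCE layout.
    public class TbOraGlLoader {
        public static void main(String[] args) throws Exception {
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "planning", "secret");
            con.setAutoCommit(false);
            PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO TB_ORAGL (ACCOUNT, PERIOD, BALANCE) VALUES (?, ?, ?)");
            BufferedReader in = new BufferedReader(new FileReader("tb_oragl.csv"));
            String line;
            while ((line = in.readLine()) != null) {
                String[] f = line.split(",");
                ps.setString(1, f[0]);
                ps.setString(2, f[1]);
                ps.setBigDecimal(3, new java.math.BigDecimal(f[2]));
                ps.addBatch();
            }
            ps.executeBatch();     // one round trip for the whole batch
            con.commit();
            in.close();
            ps.close();
            con.close();
        }
    }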
    Can you guide me in this situation?
    Thanks,
    PC

  • Disco 10g with Apps 11.5.9. Financials Intelligence

    Hello,
    I'm reading through Metalink note 313418.1, "Using Discoverer 10.1.2 with Oracle E-Business Suite 11i", with a view to getting Apps 11.5.9 integrated with Disco, for both Financials Intelligence and custom business areas and folders created against the base Apps views and tables.
    I have some queries relating to SSO and step 1.6 (Implement Single Sign-On for Discoverer 10.1.2.0.2 (Optional)).
    I want to understand better when this needs to be put in place. I essentially need to be able to use the Financials Intelligence responsibility in Apps so that when I click on one of the reports, say "Project Cost Analysis", it launches Disco 10g directly, without further sign-on requests or choosing a connection from the Discoverer Connections screen.
    I would also want to launch additional custom reports based on the Apps 11.5.9 data model via Disco Plus/Viewer standalone, preferably via the Apps 11.5.9 menu. Can this be done, or do I need to launch the Discoverer Plus/Viewer URL directly to access the connections screen (presuming the default connections screen is mandatory)?
    How does Oracle Apps work with the connections screen provided with Disco 10g so that it does not appear when a report is launched? Perhaps this is where the optional step 1.6 assists, with full SSO integration with the single sign-on server, so that it uses your Apps login to work out which Disco connections are available to you.
    We are not necessarily looking at using Oracle Portal, or requiring additional SSO integration with OID, if Disco can be fully integrated with Apps from a sign on perspective via other means.
    I guess that if we can only launch custom Disco worksheets from the connections screen (http://host:port/discoverer/viewer), which has a connection already registered for the user using URL security (as you wouldn't want financials reports made public), then we may need to consider the SSO integration described in step 1.6, if it allows us to use the same login details as used for the Apps sign-on.
    Any pointers would be greatly appreciated.
    Cheers,
    John

    Thanks for the reply.
    I've looked at Section 6. I'm not sure which part you are specifically referring to as paragraph 3, but I am guessing you mean the heading titled "Verify Applications profile options in Oracle Applications".
    I have a couple of further queries.
    1. How do you integrate a new custom Disco worksheet into Apps 11i using form function security? I'm sure there is a current document on this somewhere. I found Metalink note 278095.1, "How to Create a Link to a Discoverer Workbook in Apps11i", but it said the author could not get this to work in 10g. Is there a document relevant to 10.1.2?
    2. Section 1.2 in the original note I was referring to states:
    If you are not planning to use features Discoverer Connection Management, Discoverer Portlet Provider, and Oracle SSO, you can choose to alternatively install Oracle Business Intelligence Server 10g Release 2 (10.1.2.0.2). This installation type does not require installation and association to OracleAS Identity Management Infrastructure 10g (10.1.4.0.1).
    Does this not require an infrastructure tier at all? Does anyone have a URL that points to the documentation/product download page for this software? I can find the standard Apps Server 10g page. I am wondering whether this significantly reduces the time to get Disco installed and integrated with Apps 11i versus going down a full install, as the metalink note contains a large number of steps.
    Cheers,
    John

  • Legacy uploads

    The company I currently work for has SAP, but we are acquiring a new company which does not have SAP. For the time being, they want financials to be put on SAP. The PO and GR will be done in the legacy system, and every day IRs will be created in SAP. How is the three-way match going to work in this case? How do I handle taxes? Also, we generally upload vendor line items with one side hitting the vendor and the other a conversion account; in this case, since it is a daily interface, what account needs to be hit? Any help is appreciated.

    That being said, I now see that these asset loads were done with a transfer date at the end of the last closed fiscal year.
    I was confused about the mid-year transfer all this time.
    So, based on the below, it does say:
    'In this case, you do not need to include any posted depreciation or transactions in the transfer of legacy data. You only need to transfer master data and the cumulative values as of the end of the last closed fiscal year.'
    http://help.sap.com/saphelp_erp60/helpdata/en/4f/71fd71448011d189f00000e81ddfac/frameset.htm
    So I guess there will be no entries in the transactions area of the asset (AW01N), and this is correct when assets are loaded as of the end of the last closed fiscal year.
    Can you confirm?

  • Shutting down versus always on

    We are having a "discussion" as to the benefits of allowing the machine to shut down at night (if it is not actively being used) versus always leaving it on.
    One side thinks that shutting down reduces wear and tear on the fans and allows everything to cool down until needed.
    The other side contends that the act of powering up is detrimental to the machine due to the sudden surge of power etc.
    Both sides agree on using the sleep mode when feasible.
    Opinions aside, has Apple posted a position on this?
    Thanks.

    If it is on a UPS (no matter which way you go), the inrush current happens when coming out of sleep, so no help there. 50,000 cycles for the hard drive spin-up/spin-down lifecycle seems more than adequate for how long I use drives before retiring them to backup duty.
    Boot-up is usually under a minute, but getting back to where you were: that is the main reason.
    It would help if Mac OS and the Mac Pro used a disk image of memory (a hibernation file), and if everything is backed up and maintained (the HFS file system, journal, and directory don't take kindly to outages).
    Weigh the risks and benefits. In the past, big old iron systems took a long time to boot, booting was very hard on components and drives, and SCSI and legacy drives were run 24/7. The swing from hot to cold and back, the change in temperature, is what is hardest on most components.
    Some have measured sleep, deep sleep, off, versus on, to see just how much or how little difference it might make; then multiply by the number of systems.
    The biggest cost can be cooling a room from the heat generated when running and active, but not when sleeping.
    A system's fans kick into high on wake from sleep or power-on to dispel heat built up while sitting. FBDIMMs (2008 and before) run very hot and generated so much heat that the air out the back could be hot. Now, DDR3 should be cooler. But any dual Xeon 5500 Nehalem, or last year's 5400s, etc., take a lot to cool and can run warm.
    Bottom line: I don't think it much matters one way or the other, beyond work environment, schedule, and people.
    If you have a lot of external storage arrays, that would probably tip my own hat toward shutting those down if idle for more than 6 hrs.

  • Netbeans versus Eclipse for JNI

    We have a significant amount of code to rehost on Solaris 8, from XView to new Java GUIs interfacing with legacy C code. The current development environment is vi and make. We want to evolve to JUnit, Ant, CVS, and an IDE, either NetBeans or Eclipse (maybe JBuilderX). We have used both for Java but never for JNI. We are now exploring Eclipse. Can anyone point us to a forum on NetBeans versus Eclipse for this type of effort?

    Hi Ivar, thanks for your reply.
    Actually, both NetBeans and Eclipse provide a C/C++ plug-in, so you can develop the C side as well; I'm using it in Eclipse. There is a C Ant task too, so you can build the C and Java together. (Ideally, we want to be able to step, in the source-level debugger, from the Java into the C native method, but I don't believe that has been worked out in any IDE.) I am working with Eclipse now, and you can edit Java in the Java perspective, click a C file, and automatically switch to the C perspective. That seems to work pretty well. On the other hand, NetBeans is a Sun-sponsored project, and we are on Solaris, so maybe NetBeans has some platform-dependent goodies that might tip the balance.
    You and I both know that while a tool may have a capability, that does not mean it does the job well. Somebody else must be doing the same thing that I am, so I'm looking to learn from the experience of others who are farther into the process. Maybe I can avoid having to get deeply into one tool, only to find I should have used another.

  • Migrate all open sales orders from legacy system (SAP) to SAP system using Business Objects

    Hi Experts,
    I have to migrate all open sales orders from a legacy system (SAP) to an SAP system using Business Objects, with a new SALES ORDER DOCUMENT NUMBER referencing the older one.
    I'll get all the required data, field by field, in an Excel file.
    Does any standard transaction exist for this? Or how should I go ahead with it?
    Thanks and regards,
    Jyoti Shankar

    Hi
    If you are checking for the CREATE option, then check the Sales Doc Type.
    For more info, go to transaction SWO1 -> BUS2032 -> Display -> Execute -> there SELECT the method you want to perform. There you can find the MANDATORY parameters.
    Or, in DISPLAY mode, PLACE the cursor on the required method and CLICK the PARAMETERS button on the toolbar.
    That will show the MANDATORY parameters.
    Reward if helpful....

  • Uploading data from a view in legacy system to SAP

    Hi,
    I am developing a custom table in SAP. The data in this table will be loaded from a view which exists in the client's legacy system (an Oracle DB). There will be no middleware for the data transfer.
    How can this be done in SAP? Can anybody provide a detailed procedure?
    Thanks in advance!

    Hi
    Is the legacy system capable of calling a 'BAPI' or 'RFC'?
    If yes, then you can create an RFC function module, and within the function module you can write code to populate values into the Z-table.
    When the legacy system calls the RFC with values, your Z-table will be updated.
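    If the legacy side can run Java, a call via the SAP Java Connector (JCo) would look roughly like this sketch (the destination name, the function module Z_UPLOAD_VIEW_DATA, and its parameters are made up for illustration):
    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.JCoTable;

    public class LegacyToSap {
        public static void main(String[] args) throws JCoException {
            // The destination must be configured in a SAP_DEV.jcoDestination file.
            JCoDestination dest = JCoDestinationManager.getDestination("SAP_DEV");

            // Z_UPLOAD_VIEW_DATA is a hypothetical RFC-enabled function module
            // whose ABAP code writes its table parameter into the Z-table.
            JCoFunction fn = dest.getRepository().getFunction("Z_UPLOAD_VIEW_DATA");
            if (fn == null) {
                throw new RuntimeException("RFC Z_UPLOAD_VIEW_DATA not found");
            }

            JCoTable rows = fn.getTableParameterList().getTable("IT_ROWS");
            rows.appendRow();
            rows.setValue("MATNR", "100-100");   // hypothetical field names
            rows.setValue("MENGE", "42");

            fn.execute(dest);   // RFC call; the function module updates the Z-table
        }
    }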
    Regards
    Madhan

  • Steps to prepare and upload legacy master data excel files into SAP?

    Hi abap experts,
    We have a brand-new installed ECC system, somehow configured but with no master or transaction data loaded. It is a new, empty system. We also have some legacy data in Excel files. We want to start loading some data into the SAP sandbox step by step and see how it works: test some transactions, see if the loaded data are good, and so on, as initial tests.
    A few questions are raised here:
    - Can someone tell me what the process of loading this data into the SAP system is?
    - Must this Excel file be reworked/prepared somehow (fields, columns, etc.) in order to be ready for upload to SAP?
    - Users asked me how to prepare their legacy Excel files so they are ready in SAP format for upload. Is this an ABAPer's job or a functional consultant's job?
    - Or should the Excel files be converted to .txt files and then imported into SAP? Does it really make a difference whether files are in Excel or .txt format?
    - Should the ABAPer determine the structure of those Excel files (to be ready for upload), and if yes, what are the technical rules here?
    - What tools should be used for these initial data loads? CATT, LSMW, batch input, or something else?
    - At which point should we test the data? I guess after the initial load?
    - What tools are used in all the steps before that?
    - If someone can provide me with a step-by-step scenario or guide for loading some kind of initial master data, from .xls file alignment to the real upload, that would be great.
    You can email me an upload guide or some Excel/txt file examples and screenshot documents to exercise with.
    Your help is appreciated!
    Jon

    Hi,
    For Excel sheet uploading, see:
    http://www.sap-img.com/abap/upload-direct-excel.htm
    http://www.sap-img.com/abap/excel_upload_alternative-kcd-excel-ole-to-int-convert.htm
    http://www.sapdevelopment.co.uk/file/file_upexcel.htm
    http://www.sapdevelopment.co.uk/ms/mshome.htm
