Performance and Server questions

Can anyone tell me how the performance of Dev 6 really is over
the Internet? We would really like to run our application over
the web. Also, what are the recommended OAS server requirements
(memory, etc.) for acceptable speed? Has anyone had any problems
with this? Has anyone successfully deployed an application with
the three-tier architecture? Are there any white papers that
address this that I can look at?
Thanks.

Similar Messages

  • Groupwise 8 on Server 2008R2 SP1 Performance and Defrag Questions

    I am running GroupWise 8.0.2 on an HP DL360 G7 with 24 GB of RAM and dual Xeon X5650 processors under Server 2008 R2 SP1. The Post Office is using roughly 562 GB of an 819 GB disk located on an HP P2000 G3 SAS-connected enclosure comprised of 12 x 146 GB (15k rpm) SAS disks in a RAID 10 configuration. Typically never more than 700 users are connected to the Post Office at any time. I am experiencing client slowness at regular intervals, roughly every three weeks. The solution tends to be to restart the server and give it a fresh start, which holds for a while.
    Concerns:
    1. When the server restarts I see minimal memory utilization, maybe 1.5 GB. Within half an hour the used memory climbs a bit to about 2 GB, but free memory is quickly pre-allocated to standby memory. Within about 2 hours the free memory is all consumed and I have 20+ GB of standby memory. Running RAMMap indicates that the memory is being used by Metafile and Mapped File, which tells me that the Post Office's files are being indexed and cached in RAM. Then, after a couple of weeks, the amount of active RAM exceeds 8 GB (still mostly Metafile, not so much Mapped File), and standby RAM still consumes the remaining RAM between Metafile and Mapped File, leaving no free memory. Typically, once about 8 GB of memory is actively used by the system (mostly Metafile), it's time for performance to drop for the clients.
    2. I'm also seeing increases in disk queue length for the Post Office volume: typically below 2, rarely as high as 5.
    I suspect my best solution is to start a regular defrag process, as my volume is now 29% fragmented [yeah, NTFS :( ]
    Question:
    I am concerned a defrag could take 10 hours or longer, which is too long for the agents to be down. So I was wondering if anyone has used Diskeeper or another third-party defrag utility that can defragment open files in the background, or if anyone has run defrag with the agents running to handle the defragmentable files, then shut down the agents for a second, considerably shorter pass. Any advice, or other suggestions for the issue described, would be greatly appreciated.
    Thank You!

    In article <[email protected]>, Matt Karwowski wrote:
    > I am running GroupWise 8.0.2 on an HP DL360G G7 with 24 GB of RAM and Dual Xeon X5650
    > processors under Server 2008 R2 Sp1. Post Office is using roughly 562 GB of an 819
    > GB Disk ... I am experiencing client slowness at regular intervals. Roughly every
    > three weeks. The solution tends to be to restart the server and give it a fresh start,
    A) Updating to the latest code may assist, as this could be a memory-fragmentation-type issue that has since been fixed.
    B) Perhaps even more RAM might help. How much space do the ofuser/*.db and ofmsg/*.db files take up? The more mail flows, the more of those DB files' contents are held in memory. A few versions ago you tended to need as much RAM as the total size of the DB files, and while those days are thankfully past, it is still a good number to watch.
    C) Explore Caching mode for at least some of the users, as that significantly reduces the load on the server.
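    Regarding point B, a quick way to get that number is to total the *.db files under the post office directory. This is only a sketch, assuming the standard ofuser/ofmsg layout; adjust the path and subdirectory names for your own installation:

```python
import os

def total_db_bytes(po_path, subdirs=("ofuser", "ofmsg")):
    """Sum the sizes of all *.db files under the given post office
    subdirectories, to compare against installed RAM."""
    total = 0
    for sub in subdirs:
        root = os.path.join(po_path, sub)
        if not os.path.isdir(root):
            continue  # tolerate a missing subdirectory
        for name in os.listdir(root):
            if name.lower().endswith(".db"):
                total += os.path.getsize(os.path.join(root, name))
    return total
```

    Run it against the post office volume and compare the result to the 24 GB of installed RAM.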
    > I suspect my best solution is to start a regular defrag process as my volume is now
    > 29% defragmented [yeah NTFS :( ]
    Still way better than old FAT ;)
    This is one of the reasons why GroupWise runs better on Linux with either EXT3 or NSS
    volume types, and is why SLES is provided along with your GroupWise license.
    As for running defragmentation, even running it for just a few hours a week will gradually help, especially if you can fit in one big burst near the beginning. So if you can automate the process of shutting down the agents, running defrag for X hours, stopping it, and then restarting the agents, you may clear this all up. Just having the agents down an hour a week might clear the memory usage issue for you.
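    The shut-down/defrag/restart cycle described above can be scripted. A minimal sketch: the service names and drive letter below are placeholders (check what your installation actually registers), and scheduling plus bounding the defrag run time is left to Task Scheduler:

```python
def maintenance_window_commands(volume, agent_services=("GroupWise POA", "GroupWise MTA")):
    """Ordered commands for a weekly maintenance window: stop the
    agents, defragment the post office volume, restart the agents.
    Service names are placeholders for this sketch."""
    stop = [f'net stop "{svc}"' for svc in agent_services]
    work = [f"defrag {volume} /U /V"]  # /U: show progress, /V: verbose summary
    start = [f'net start "{svc}"' for svc in reversed(agent_services)]
    return stop + work + start
```

    Feed the resulting list to a batch file or a `subprocess.run` loop invoked by a scheduled task.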
    I would be very hesitant to run any defrag on open database files of any email system
    unless the defrag tool knew explicitly about that email system. But a smart defragger that
    can keep the *.db files in their own (fastest) section of the drive and the rest off to the
    side would go a long way to making the defragmentation process much more efficient.
    I haven't directly run a GroupWise system on any flavor of Windows since OS/2, so this is more a combination of all my platform knowledge, but I hope it gets you closer to a smoothly running system. And if the other GW-on-Windows admins can pipe in, all the better.
    Andy Konecny
    KonecnyConsulting.ca in Toronto
    Andy's Profile: http://forums.novell.com/member.php?userid=75037

  • Infoset - Performance and Design Question

    Hello BW Gurus,
    I have a question regarding the infoset.
    We have a backlog report designed as follows:
    sd_c03 - with around 125 characteristics and key figures; the daily load will have around 10K records at a time.
    A custom cube - say, zcust01 - with around 50 characteristics and key figures; a daily full load of around 15K records.
    My question:
    1. We have an InfoSet on top of these 2 cubes for reporting. Can we use an InfoSet for reporting? I have used InfoSets for master data and DSO reporting. Do you see any performance issue with using an InfoSet instead of a MultiProvider? Is there any alternative to reporting from the InfoSet?
    2. I also executed the SAP_INFOCUBE_DESIGNS program for the above cube, and some dimensions are more than 25% - like 58%, 75%, even 102%. So this has to be fixed for sure. Is that correct? If we don't change the design, what will be the consequences?
    Please advise. We are in development and the objects have not yet been moved to production. Again, thanks for your help in advance.
    Senthil

    Hi,
    1. We have an InfoSet on top of these 2 cubes for reporting. Can we use an InfoSet for reporting? I have used InfoSets for master data and DSO reporting. Do you see any performance issue with using an InfoSet instead of a MultiProvider? Is there any alternative to reporting from the InfoSet?
    MultiProviders are a great tool that can be used freely to "combine" data without adding much overhead to the system. You may need to combine data from different applications (like Plan and Actual). Another good use is from a data modeling point of view: you can enable "partitioning" of the data by using separate cubes for separate years of data. This way the cubes are more manageable, and with an InfoProvider on top the data can be combined for reports if required (instead of having one huge cube with all the data).
    Also, using a MultiProvider as an "umbrella" InfoProvider, you can easily make changes to the underlying InfoProviders with minimum impact on the queries.
    You should go for a MultiProvider when the requirement is to report on combined data coming from different InfoProviders.
    MultiProviders combine data using a union operation, which means all values of the underlying InfoProviders are taken into consideration. On the other hand, an InfoSet works on the principle of a join: the join forms the intersection, and only the common data from the underlying InfoProviders is considered. In terms of performance an InfoSet can be better, since it fetches fewer records.
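    The union-versus-join distinction is easy to see with a toy example. The sketch below uses plain Python sets to stand in for two InfoProviders; the cube names and values are made up:

```python
# MultiProvider ~ union: every key from every provider appears in the result.
# InfoSet ~ join: only keys present in all joined providers survive.
cube_sales = {"K1": 100, "K2": 200}  # characteristic value -> key figure
cube_costs = {"K2": 20, "K3": 30}

union_keys = set(cube_sales) | set(cube_costs)  # MultiProvider result rows
join_keys = set(cube_sales) & set(cube_costs)   # InfoSet result rows
```

    Because the join returns only the intersection, an InfoSet query typically touches fewer records, which is the performance point being made here.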
    Check this:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f5aa43f-0c01-0010-a990-9641d3d4eef7
    2. I also executed the SAP_INFOCUBE_DESIGNS program for the above cube, and some dimensions are more than 25% - like 58%, 75%, even 102%. So this has to be fixed for sure. Is that correct? If we don't change the design, what will be the consequences?
    If the dimensions are too large, it affects query performance; it's better to change the design.
    Hope this helps.
    Regards,
    Debjani

  • Performance and Compatibility questions

    Hi everybody,
    I'm hoping no one minds me asking these basic questions.
    These past few years I've been using my Linksys WRT54GS (I'm pretty sure that's the model) and it has been cutting out almost non-stop recently, every 15 minutes or less (which tends to be annoying). So I decided to start looking into new routers, and I thought that perhaps the AirPort Extreme might be a good alternative.
    My first question: my family is mixed between Macs and PCs. Has anyone had experience using both systems on this router, and have they had any major problems because of it (cutting out, or not getting signals at all)? The other thing is, I'm usually in the basement, with the Linksys on the first floor, roughly 70 feet away.
    My next question: the Macs in my house are first-generation Intel-based Macs, so I don't believe any of them have wireless-n support (I think you need an Intel Core 2 Duo processor, which ours are not). Will that cause any problem with the wireless links? I had read that wireless-n routers have to be backwards compatible, but I don't know if that's true, so would our Macs be able to connect?
    The final question I have: will the AirPort Extreme work with a PS3? The PS3 can pick up the Linksys router (when the router isn't crashing, which is rare), but is the AirPort compatible with it as well?
    I've read some reviews and it seems to get pretty good ratings, but I figured I'd ask my specific questions before making a purchase.
    Thanks.

    My first question is, my family is mixed between Macs and PCs. I was wondering if anyone has had any experience with using both systems on the router, and if they have had any major problems because of it (cutting out, or not getting signals at all)?
    Of course everyone's situation is different, but I currently have no issues connecting both Macs and PCs to my 802.11n AirPort Extreme Base Station (AEBSn). I recently replaced a Linksys 802.11a/g wireless access point with the AEBSn. I left it in its default radio mode, which is "802.11n (b/g compatible)", and my Macs and PCs are a mix of 802.11n and g devices.
    But the other thing is, I'm usually in the basement, with the linksys on the first floor, which comes to roughly 70 feet away.
    The issue here is really wireless signal strength and the effects of Wi-Fi interference, specifically building materials that may make it difficult to get a clean signal through, so it will affect any manufacturer's router, not just the AirPorts. One test is to use a free utility like iStumbler to measure the signal strength of your current router from the location in the basement. Any new router may perform just as poorly if this signal level is very low.
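    Before buying anything, you can also sanity-check that 70-foot path with a free-space path loss estimate. This is a lower bound on attenuation (it assumes no walls or floors, which add considerably more loss); the distance and frequency below match the poster's setup:

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss in dB: attenuation over an unobstructed path.
    Real building materials add loss on top of this ideal figure."""
    c = 299_792_458  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# 70 feet (21.3 m) at 2.4 GHz
loss = free_space_path_loss_db(70 * 0.3048, 2.4e9)
```

    A measured signal (from iStumbler or similar) will always be weaker than this ideal figure suggests, which is why the in-basement measurement above is the test that matters.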
    My next question is, the Macs in my house are first-generation Intel-based Macs, so I don't believe any of them have wireless-n support (I think you need an Intel Core 2 Duo processor, which ours are not). Will that cause any problem with the wireless links?
    No, the AEBSn will support 802.11a/b/g/n clients.
    The final question I have, is will the airport extreme work with a PS3?
    The PS3 has built-in 802.11g so it should be compatible as well.

  • TOS and Server question(s)..

    #1 Is it true that Verizon has a don't-ask-don't-tell policy when it comes to hosting a web server?
    #2 As long as it's a game server, a personal server, a family-and-friends sort of operation, patched, and not up to anything illegal, it won't be actively pursued?
    As also addressed in at least these two threads:
    #1 http://www.dslreports.com/forum/r18029956-Will-this-violate-Verizon-TOS
    #2 http://www.dslreports.com/forum/remark,12835743
    I am not interested in hosting any kind of "server"; I am just wondering.
    Thanks for your time and understanding.

    Shane P wrote:
    Wes,
    I have found that DNxHD is a nice option. I used the 720p 145 preset and it was about 1 min = 1 GB... is this normal?
    I brought it into PPCC and I was able to shuttle the video beautifully without the stuttering of the native camera AVCHD.
    I would, however, like to keep the original frame size of 1080. I was looking through all of the DNxHD presets and there are no 1080p/59.94 presets. I tried to make a custom preset, but all of the adjustable parameters for the DNxHD presets are grayed out. Are they not customizable?
    I apologize if these sound like "newb" questions, as I have never used DNxHD prior to today.
    Create a new preset and choose "DNxHD MXF OP1a" from the Format popup
    Then from the Resolution popup in the Video Codec section you can choose from one of the 59.94p presets. It will set the frame rate to 59.94.
    If you have deeper questions about AME I recommend you ask on the AME forum. :-)

  • Increase Performance and ROI for SQL Server Environments

    May 2015
    Explore
    The Buzz from Microsoft Ignite 2015
    NetApp was in full force at the recent Microsoft Ignite show in Chicago, talking about solutions for hybrid cloud, and our proven solutions for Microsoft SQL Server and other Microsoft applications.
    Hot topics at the NetApp booth included:
    OnCommand® Shift. A revolutionary technology that lets you move virtual machines back and forth between VMware and Hyper-V environments in minutes.
    Azure Site Recovery to NetApp Private Storage. Replicate on-premises SAN-based applications to NPS for disaster recovery in the Azure cloud.
    These tools give you greater flexibility for managing and protecting important business applications.
    Chris Lemmons
    Director, EIS Technical Marketing, NetApp
    If your organization runs databases such as Microsoft SQL Server and Oracle DB, you probably know that these vendors primarily license their products on a "per-core" basis. Microsoft recently switched to "per-core" rather than "per-socket" licensing for SQL Server 2012 and 2014. This change can have a big impact on the total cost of operating a database, especially as core counts on new servers continue to climb. It turns out that the right storage infrastructure can drive down database costs, increase productivity, and put your infrastructure back in balance.
    In many customer environments, NetApp has noticed that server CPU utilization is low, often on the order of just 20%. This is usually the result of I/O bottlenecks: server cores have to sit and wait for I/O from hard disk drives (HDDs). We've been closely studying the impact of all-flash storage on SQL Server environments that use HDD-based storage systems. The NetApp® All Flash FAS platform delivers world-class performance for SQL Server plus the storage efficiency, application integration, nondisruptive operations, and data protection of clustered Data ONTAP®, making it ideal for SQL Server environments.
    Tests show that All Flash FAS can drive up IOPS and database server CPU utilization by as much as 4x. And with a 95% reduction in latency, you can achieve this level of performance with half as many servers. This reduces the number of servers you need and the number of cores you have to license, driving down costs by 50% or more and paying back your investment in flash in as little as six months.
    Figure 1) NetApp All Flash FAS increases CPU utilization on your SQL Server database servers, lowering costs.
    Source: NetApp, 2015
    Whether you're running one of the newer versions of SQL Server or facing an upgrade of an earlier version, you can't afford not to take a second look at your storage environment.
    End of Support for Microsoft SQL Server 2005 is Rapidly Approaching
    Microsoft has set the end of extended support for SQL Server 2005 for April 2016—less than a year away. With support for Microsoft Windows 2003 ending in July 2015, time may already be running short.
    If you're running Windows Server 2003, new server hardware is almost certainly needed when you upgrade SQL Server. Evaluate your server and storage options now to get costs under control.
    Test Methodology
    To test the impact of flash on SQL Server performance, we replaced a legacy HDD-based storage system with an All Flash FAS AFF8080 EX. The legacy system was configured with almost 150 HDDs, a typical configuration for HDD storage supporting SQL Server. The AFF8080 EX used just 48 SSDs.
    Table 1) Components used in testing.
    SQL Server 2014 servers: Fujitsu RX300
    Server operating system: Microsoft Windows 2012 R2 Standard Edition
    SQL Server database version: Microsoft SQL Server 2014 Enterprise Edition
    Processors per server: 2 x 6-core Xeon E5-2630 at 2.30 GHz
    Fibre Channel network: 8Gb FC with multipathing
    Storage controller: AFF8080 EX
    Data ONTAP version: clustered Data ONTAP® 8.3.1
    Drive number and type: 48 SSDs
    Source: NetApp, 2015
    The test configuration consisted of 10 database servers connected through fibre channel to both the legacy storage system and the AFF8080 EX. Each of the 10 servers ran SQL Server 2014 Enterprise Edition.
    The publicly available HammerDB workload generator was used to drive an OLTP-like workload simultaneously from each of the 10 database servers to storage. We first directed the workload to the legacy storage array to establish a baseline, increasing the load to the point where read latency consistently exceeded 20ms.
    That workload was then directed at the AFF8080 EX. The change in storage resulted in an overall 20x reduction in read latency, a greater than 4x improvement in IOPS, and a greater than 4x improvement in database server CPU utilization.
    Figure 2) NetApp All Flash FAS increases IOPS and server CPU utilization and lowers latency.
    Source: NetApp, 2015
    In other words, the database servers are able to process four times as many IOPS with dramatically lower latency. CPU utilization goes up accordingly because the servers are processing 4x the work per unit time.
    The All Flash FAS system still had additional headroom under this load.
    Calculating the Savings
    Let's look at what this performance improvement means for the total cost of running SQL Server 2014 over a 3-year period. To do the analysis we used NetApp Realize, a storage modeling and financial analysis tool designed to help quantify the value of NetApp solutions and products. NetApp sales teams and partners use this tool to assist with return on investment (ROI) calculations.
    The calculation includes the cost of the AFF8080 EX, eliminates the costs associated with the existing storage system, and cuts the total number of database servers from 10 to five. This reduces SQL Server licensing costs by 50%. The same workload was run with five servers and achieved the same results. ROI analysis is summarized in Table 2.
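    The licensing arithmetic above is easy to reproduce. The sketch below uses a purely illustrative per-core price (not a quoted Microsoft figure) and the 2 x 6-core server configuration from Table 1:

```python
def per_core_license_cost(servers, sockets_per_server, cores_per_socket, price_per_core):
    """Total per-core licensing cost across a server farm."""
    return servers * sockets_per_server * cores_per_socket * price_per_core

PRICE = 7_000  # illustrative per-core price, not a real list price
before = per_core_license_cost(10, 2, 6, PRICE)  # 10 servers, 2 x 6-core each
after = per_core_license_cost(5, 2, 6, PRICE)    # same workload on 5 servers
savings = before - after  # halving the servers halves the licensing bill
```

    Whatever the actual per-core price, the structure of the formula is what matters: cutting the server count in half cuts the core count, and therefore the licensing cost, in half.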
    Table 2) ROI from replacing an HDD-based storage system with All Flash FAS, thereby cutting server and licensing costs in half.
    ROI: 65%
    Net present value (NPV): $950,000
    Payback period: six months
    Total cost reduction: more than $1 million saved over a 3-year analysis period compared to the legacy storage system
    Savings on power, space, and administration: $40,000
    Additional savings due to nondisruptive operations benefits (not included in ROI): $90,000
    Source: NetApp, 2015
    The takeaway here is that you can replace your existing storage with All Flash FAS and get a big performance bump while substantially reducing your costs, with the majority of the savings derived from the reduction in SQL Server licensing costs.
    Replace your existing storage with All Flash FAS and get a big performance bump while substantially reducing your costs.
    Maximum SQL Server 2014 Performance
    In addition to the ROI analysis, we also measured the maximum performance of the AFF8080 EX with SQL Server 2014. A load-generation tool was used to simulate an industry-standard TPC-E OLTP workload against an SQL Server 2014 test configuration.
    A two-node AFF8080 EX achieved a maximum throughput of 322K IOPS at just over 1ms latency. For all points other than the maximum load point, latency was consistently under 1ms and remained under 0.8ms up to 180K IOPS.
    Data Reduction and Storage Efficiency
    In addition to performance testing, we looked at the overall storage efficiency savings of our SQL Server database implementation. The degree of compression that can be achieved is dependent on the actual data that is written and stored in the database. For this environment, inline compression was effective. Deduplication, as is often the case in database environments, provided little additional storage savings and was not enabled.
    For the test data used in the maximum performance test, we measured a compression ratio of 1.5:1. We also tested inline compression on a production SQL Server 2014 data set to further validate these results and saw a 1.8:1 compression ratio.
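    Turning a compression ratio into consumed capacity is a one-line calculation. The 10 TB database size below is only an example; the 1.5:1 and 1.8:1 ratios are the ones measured above:

```python
TB = 1024 ** 4  # bytes per tebibyte

def physical_bytes(logical_bytes, compression_ratio):
    """Capacity actually consumed after inline compression."""
    return logical_bytes / compression_ratio

# hypothetical 10 TB database at the two measured ratios
at_1_5 = physical_bytes(10 * TB, 1.5)  # test data set ratio
at_1_8 = physical_bytes(10 * TB, 1.8)  # production data set ratio
```

    At 1.5:1 a 10 TB database consumes roughly 6.7 TB of physical capacity; at 1.8:1, roughly 5.6 TB.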
    Space-efficient NetApp Snapshot® copies provide additional storage efficiency benefits for database environments. Unlike snapshot methods that use copy-on-write, there is no performance penalty; unlike full mirror copies, NetApp Snapshot copies use storage space sparingly. Snapshot copies only consume a small amount of storage space for metadata and additional incremental space is consumed as block-level changes occur. In a typical real-world SQL Server deployment on NetApp storage, database volume Snapshot copies are made every two hours.
    First introduced more than 10 years ago, NetApp FlexClone® technology also plays an important role in SQL Server environments. Clones are fully writable, and, similar to Snapshot copies, only consume incremental storage capacity. With FlexClone, you can create as many copies of production data as you need for development and test, reporting, and so on. Cloning is a great way to support the development and test work needed when upgrading from an earlier version of SQL Server. You'll sometimes see these types of capabilities referred to as "copy data management."
    A Better Way to Run Enterprise Applications
    The performance benefits that all-flash storage can deliver for database environments are significant: more IOPS, lower latency, and an end to near-constant performance tuning.
    If you think the performance acceleration that comes with all-flash storage is cost prohibitive, think again. All Flash FAS doesn't just deliver a performance boost, it changes the economics of your operations, paying for itself with thousands in savings on licensing and server costs. In terms of dollars per IOPS, All Flash FAS is extremely economical relative to HDD.
    And, because All Flash FAS runs NetApp clustered Data ONTAP, it delivers the most complete environment to support SQL Server and all your enterprise applications with capabilities that include comprehensive storage efficiency, integrated data protection, and deep integration for your applications.
    For complete details on this testing look for NetApp TR-4303, which will be available in a few weeks. Stay tuned to Tech OnTap for more information as NetApp continues to run benchmarks with important server workloads including Oracle DB and server virtualization.
    Learn more about NetApp solutions for SQL Server and NetApp All-flash solutions.

    If you think the performance acceleration that comes with all-flash storage is cost prohibitive, think again. All Flash FAS doesn't just deliver a performance boost, it changes the economics of your operations, paying for itself with thousands in savings on licensing and server costs. In terms of dollars per IOPS, All Flash FAS is extremely economical relative to HDD.
    And, because All Flash FAS runs NetApp clustered Data ONTAP, it delivers the most complete environment to support SQL Server and all your enterprise applications with capabilities that include comprehensive storage efficiency, integrated data protection, and deep integration for your applications.
    For complete details on this testing, look for NetApp TR-4303, which will be available in a few weeks. Stay tuned to Tech OnTap for more information as NetApp continues to run benchmarks with important server workloads, including Oracle DB and server virtualization.
    Learn more about NetApp solutions for SQL Server and NetApp All-flash solutions.

  • Performance issue and functional question regarding updates on tables

    A person at my site wrote some code to update a custom field on the MARC table that was being copied from the MARA table.  Here is what I would have expected to see as the code.  Assume that both sets of code have a parameter called p_werks which is the plant in question.
    data : commit_count type i.
    select matnr zfield from mara into (wa_marc-matnr, wa_marc-zfield).
      update marc set zfield = wa_marc-zfield
         where werks = p_werks and matnr = wa_marc-matnr.
      commit work and wait.
    endselect.
    I would have committed every 200 rows instead of every single row, but here's the actual code, and my question isn't about the commits but about something else. In this case an internal table was built with two elements - MATNR and WERKS - could have done that above too, but that's not my question.
                DO.
                  " Lock the record that needs to be update with material creation date
                  CALL FUNCTION 'ENQUEUE_EMMARCS'
                    EXPORTING
                      mode_marc      = 'S'
                      mandt          = sy-mandt
                      matnr          = wa_marc-matnr
                      werks          = wa_marc-werks
                    EXCEPTIONS
                      foreign_lock   = 1
                      system_failure = 2
                      OTHERS         = 3.
                  IF sy-subrc <> 0.
                    " Wait, if the records not able to perform as lock
                    CALL FUNCTION 'RZL_SLEEP'.
                  ELSE.
                    EXIT.
                  ENDIF.
                ENDDO.
                " Update the record in the table MARC with material creation date
                UPDATE marc SET zzdate = wa_mara-zzdate
                           WHERE matnr = wa_mara-matnr AND
                                 werks = wa_marc-werks.    " IN s_werks.
                IF sy-subrc EQ 0.
                  " Save record in the database table MARC
                  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
                    EXPORTING
                      wait   = 'X'
                    IMPORTING
                      return = wa_return.
                  wa_log-matnr   = wa_marc-matnr.
                  wa_log-werks   = wa_marc-werks.
                  wa_log-type    = 'S'.
                  " text-010 - 'Material creation date has updated'.
                  wa_log-message = text-010.
                  wa_log-zzdate  = wa_mara-zzdate.
                  APPEND wa_log TO tb_log.
                  CLEAR: wa_return,wa_log.
                ELSE.
                  " Roll back the record(un save), if there is any issue occurs
                  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'
                    IMPORTING
                      return = wa_return.
                  wa_log-matnr   = wa_marc-matnr.
                  wa_log-werks   = wa_marc-werks.
                  wa_log-type    = 'E'.
                  " 'Material creation date does not updated'.
                  wa_log-message = text-011.
                  wa_log-zzdate  = wa_mara-zzdate.
                  APPEND wa_log TO tb_log.
                  CLEAR: wa_return, wa_log.
                ENDIF.
                " Unlock the record from data base
                CALL FUNCTION 'DEQUEUE_EMMARCS'
                  EXPORTING
                    mode_marc = 'S'
                    mandt     = sy-mandt
                    matnr     = wa_marc-matnr
                    werks     = wa_marc-werks.
              ENDIF.
    Here's the question - why did this person enqueue and dequeue explicit locks like this? They claimed it was to prevent issues - what issues??? Is there something special about updating tables that we don't know about? We've actually seen the system run out of these ENQUEUE locks.
    Before you all go off the deep end and ask why not just do the update, keep in mind that you don't want to update a million + rows and then do a commit either - that locks up the entire table!

    The ENQUEUE lock ensures that another program, run by another user, will not update the data at the same time, preventing loss of database coherence. For example, another user in a standard SAP transaction may have read the record and locked it; updating it from your program would lose that user's modifications, or you could overwrite modifications made by another user in another LUW.
    You cannot use a COMMIT WORK in a SELECT - ENDSELECT, because COMMIT WORK closes each and every open database cursor, so your first idea would dump after the first update. (That is why the internal table is mandatory.)
    Go through some documentation like [Updates in the R/3 System (BC-CST-UP)|http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCCSTUP/BCCSTUP_PT.pdf]
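    A minimal, untested sketch of the batched alternative discussed above: read the keys into an internal table first, then UPDATE and COMMIT WORK in blocks of 200 outside any open SELECT cursor. The names lt_mara, ls_mara and c_block_size are illustrative; zzdate and p_werks are taken from the thread (zzdate is assumed to be a date field), and the explicit ENQUEUE/DEQUEUE calls are omitted for brevity:

    ```abap
    TYPES: BEGIN OF ty_mara,
             matnr  TYPE matnr,
             zzdate TYPE dats,
           END OF ty_mara.

    DATA: lt_mara  TYPE STANDARD TABLE OF ty_mara,
          ls_mara  TYPE ty_mara,
          lv_count TYPE i.

    CONSTANTS c_block_size TYPE i VALUE 200.

    " No cursor stays open after this SELECT, so COMMIT WORK is safe below.
    SELECT matnr zzdate FROM mara INTO TABLE lt_mara.

    LOOP AT lt_mara INTO ls_mara.
      UPDATE marc SET zzdate = ls_mara-zzdate
             WHERE matnr = ls_mara-matnr
               AND werks = p_werks.
      lv_count = lv_count + 1.
      IF lv_count >= c_block_size.
        COMMIT WORK.   " release database locks every 200 rows
        CLEAR lv_count.
      ENDIF.
    ENDLOOP.
    COMMIT WORK.       " commit the final partial block
    ```

    This keeps each database LUW small without taking an explicit application lock per row; whether that is acceptable depends on who else can change these records concurrently, as discussed above.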
    Regards

  • Difference between Avg. Performance and Avg. Server Time

    When you go to Create Reports on the Graph 1 tab, if you have Show All checked, a lot of different statistics are shown per script. I don't really know the meaning of most of the items in the list. For this thread:
    Do you know what is the difference between Avg. Performance and Avg. Server Time?

    Avg. Performance includes think time; Avg. Server Time does not (as we understand it). So, for example, a step with 2 seconds of server time and 3 seconds of think time would count as 5 seconds of performance but only 2 seconds of server time.

  • Adding OSX server to AD--binding questions and server type

    Currently my Macs are simply bound to AD, but I am going to add ML server to the mix so I can simplify management. After all my reading I am still unsure what to do with my current clients. I know the server is bound to both AD and OD, but does each client need to be bound to both?
    Regarding the server type: should my OD be a master, replica, or relay?

    Yes, you have to bind both the client and the server to Open Directory and Active Directory. The configuration is commonly called a "magic triangle". You'll find lots of examples online under that moniker.

  • SQL Performance and Security

    Help needed here, please. I am new to this concept and I am working on a tutorial based on SQL performance and security. I have worked my head around this but now I am stuck.
    Here are the questions:
    1. Analyse possible performance problems, and suggest solutions for each of the following transactions against the database
    a) A manager of a project needs to inspect total planned and actual hours spent on a project broken down by activity.
    e.g     
    Project: xxxxxxxxxxxxxx
    Activity Code          planned     actual (to date)
         1          20          25
         2          30          30
         3          40          24
    Total               300          200
    Note that actual time spent on an activity must be calculated from the WORK UNIT table.
    b)On several lists (e.g. list or combo boxes) in the on-line system it is necessary to identify completed, current, or future projects.
    2. Security: Justify and implement solutions at the server that meet the following security requirements
    (i)Only members of the Corporate Strategy Department (which is an organisation unit) should be able to enter, update and delete data in the project table. All users should be able to read this information.
    (ii)Employees should only be able to read information from the project table (excluding the budget) for projects they are assigned to.
    (iii)Only the manager of a project should be able to update (insert, update, delete) any non-key information in the project table relating to that project.
    Here is the project tables
    set echo on
    -- Changes
    -- 4.10.00
    -- manager of employee on a project included in the employee on project table
    -- activity table now has compound key, based on ID dependence between project
    -- and activity
    drop table org_unit cascade constraints;
    drop table project cascade constraints;
    drop table employee cascade constraints;
    drop table employee_on_project cascade constraints;
    drop table employee_on_activity cascade constraints;
    drop table activity cascade constraints;
    drop table activity_order cascade constraints;
    drop table work_unit cascade constraints;

    -- org_unit
    -- type - for example in lmu might be FACULTY, or SCHOOL
    CREATE TABLE org_unit (
        ou_id            NUMBER(4)    CONSTRAINT ou_pk PRIMARY KEY,
        ou_name          VARCHAR2(40) CONSTRAINT ou_name_uq UNIQUE
                                      CONSTRAINT ou_name_nn NOT NULL,
        ou_type          VARCHAR2(30) CONSTRAINT ou_type_nn NOT NULL,
        ou_parent_org_id NUMBER(4)    CONSTRAINT ou_parent_org_unit_fk REFERENCES org_unit
    );

    -- project
    CREATE TABLE project (
        proj_id                NUMBER(5)    CONSTRAINT project_pk PRIMARY KEY,
        proj_name              VARCHAR2(40) CONSTRAINT proj_name_uq UNIQUE
                                            CONSTRAINT proj_name_nn NOT NULL,
        proj_budget            NUMBER(8,2)  CONSTRAINT proj_budget_nn NOT NULL,
        proj_ou_id             NUMBER(4)    CONSTRAINT proj_ou_fk REFERENCES org_unit,
        proj_planned_start_dt  DATE,
        proj_planned_finish_dt DATE,
        proj_actual_start_dt   DATE
    );

    -- employee
    CREATE TABLE employee (
        emp_id       NUMBER(6)    CONSTRAINT emp_pk PRIMARY KEY,
        emp_name     VARCHAR2(40) CONSTRAINT emp_name_nn NOT NULL,
        emp_hiredate DATE         CONSTRAINT emp_hiredate_nn NOT NULL,
        ou_id        NUMBER(4)    CONSTRAINT emp_ou_fk REFERENCES org_unit
    );

    -- activity
    -- note each activity is associated with a project
    -- act_type is the type of the activity, for example ANALYSIS, DESIGN, BUILD,
    -- USER ACCEPTANCE TESTING ...
    -- each activity has a people budget, in other words an amount to spend on
    -- wages
    CREATE TABLE activity (
        act_id               NUMBER(6),
        act_proj_id          NUMBER(5)    CONSTRAINT act_proj_fk REFERENCES project
                                          CONSTRAINT act_proj_id_nn NOT NULL,
        act_name             VARCHAR2(40) CONSTRAINT act_name_nn NOT NULL,
        act_type             VARCHAR2(30) CONSTRAINT act_type_nn NOT NULL,
        act_planned_start_dt DATE,
        act_actual_start_dt  DATE,
        act_planned_end_dt   DATE,
        act_actual_end_dt    DATE,
        act_planned_hours    NUMBER(6)    CONSTRAINT act_planned_hours_nn NOT NULL,
        act_people_budget    NUMBER(8,2)  CONSTRAINT act_people_budget_nn NOT NULL,
        CONSTRAINT act_pk PRIMARY KEY (act_id, act_proj_id)
    );

    -- employee on project
    -- when an employee is assigned to a project, an hourly rate is set
    -- remember that the person's manager depends on the project they are on,
    -- the implication being that the manager needs to be assigned to the project
    -- before the 'managed'
    CREATE TABLE employee_on_project (
        ep_emp_id      NUMBER(6)   CONSTRAINT ep_emp_fk REFERENCES employee,
        ep_proj_id     NUMBER(5)   CONSTRAINT ep_proj_fk REFERENCES project,
        ep_hourly_rate NUMBER(5,2) CONSTRAINT ep_hourly_rate_nn NOT NULL,
        ep_mgr_emp_id  NUMBER(6),
        CONSTRAINT ep_pk PRIMARY KEY (ep_emp_id, ep_proj_id),
        CONSTRAINT ep_mgr_fk FOREIGN KEY (ep_mgr_emp_id, ep_proj_id) REFERENCES employee_on_project
    );

    -- employee on activity
    CREATE TABLE employee_on_activity (
        ea_emp_id        NUMBER(6),
        ea_proj_id       NUMBER(5),
        ea_act_id        NUMBER(6),
        ea_planned_hours NUMBER(3) CONSTRAINT ea_planned_hours_nn NOT NULL,
        CONSTRAINT ea_pk PRIMARY KEY (ea_emp_id, ea_proj_id, ea_act_id),
        CONSTRAINT ea_act_fk FOREIGN KEY (ea_act_id, ea_proj_id) REFERENCES activity,
        CONSTRAINT ea_ep_fk FOREIGN KEY (ea_emp_id, ea_proj_id) REFERENCES employee_on_project
    );

    -- activity order
    -- only need a prior activity. If activity A is followed by activity B then
    -- (B is the prior activity of A)
    CREATE TABLE activity_order (
        ao_act_id       NUMBER(6),
        ao_proj_id      NUMBER(5),
        ao_prior_act_id NUMBER(6),
        CONSTRAINT ao_pk PRIMARY KEY (ao_act_id, ao_prior_act_id, ao_proj_id),
        CONSTRAINT ao_act_fk FOREIGN KEY (ao_act_id, ao_proj_id)
            REFERENCES activity (act_id, act_proj_id),
        CONSTRAINT ao_prior_act_fk FOREIGN KEY (ao_prior_act_id, ao_proj_id)
            REFERENCES activity (act_id, act_proj_id)
    );

    -- work unit
    -- remember that DATE includes time
    CREATE TABLE work_unit (
        wu_emp_id   NUMBER(5),
        wu_act_id   NUMBER(6),
        wu_proj_id  NUMBER(5),
        wu_start_dt DATE CONSTRAINT wu_start_dt_nn NOT NULL,
        wu_end_dt   DATE CONSTRAINT wu_end_dt_nn NOT NULL,
        CONSTRAINT wu_pk PRIMARY KEY (wu_emp_id, wu_proj_id, wu_act_id, wu_start_dt),
        CONSTRAINT wu_ea_fk FOREIGN KEY (wu_emp_id, wu_proj_id, wu_act_id)
            REFERENCES employee_on_activity (ea_emp_id, ea_proj_id, ea_act_id)
    );

    /* enter data */
    start ouins
    start empins
    start projins
    start actins
    start aoins
    start epins
    start eains
    start wuins
    start pmselect
    I have the data scripts (ouins and the rest). Email me on [email protected] if you want to have a look at the tables.

    The answer to your 2nd question is easy. Create database roles for the various groups of people who are allowed to access the data or perform the various DML actions.
    Then assign the various users to these roles. The users will be restricted to whatever the roles allow.
    Look up roles if you are not familiar with them.
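    To make question 1(a) concrete, here is a hedged sketch of the kind of query involved, written against the schema posted above. The bind variable :proj_id and the hours arithmetic are my assumptions, not part of the original post; in Oracle, subtracting two DATE values yields days, so the difference is multiplied by 24 to get hours:

    ```sql
    -- Planned vs. actual hours per activity for one project (sketch).
    SELECT a.act_id                                        AS activity_code,
           a.act_planned_hours                             AS planned,
           NVL(SUM((w.wu_end_dt - w.wu_start_dt) * 24), 0) AS actual_to_date
    FROM   activity a
           LEFT JOIN work_unit w
                  ON  w.wu_act_id  = a.act_id
                  AND w.wu_proj_id = a.act_proj_id
    WHERE  a.act_proj_id = :proj_id
    GROUP  BY a.act_id, a.act_planned_hours
    ORDER  BY a.act_id;
    ```

    One possible performance problem to discuss: the WORK_UNIT primary key leads with wu_emp_id, so this join cannot use it efficiently; an additional index on (wu_proj_id, wu_act_id) is one candidate solution.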
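    As a hedged illustration of the roles approach for requirement 2(i), plus a view-based approach for 2(ii), assuming Oracle and a hypothetical role name corp_strategy (how Corporate Strategy membership maps to database users is left open):

    ```sql
    -- 2(i): full DML on PROJECT for Corporate Strategy, read-only for all.
    CREATE ROLE corp_strategy;
    GRANT SELECT, INSERT, UPDATE, DELETE ON project TO corp_strategy;
    GRANT SELECT ON project TO PUBLIC;
    -- GRANT corp_strategy TO <user>;  -- repeat per department member

    -- 2(ii): a view that omits the budget and shows only the caller's
    -- projects (assumes the DB user name matches EMP_NAME; adapt as needed).
    CREATE VIEW my_projects AS
      SELECT p.proj_id, p.proj_name, p.proj_ou_id,
             p.proj_planned_start_dt, p.proj_planned_finish_dt,
             p.proj_actual_start_dt
      FROM   project p
             JOIN employee_on_project ep ON ep.ep_proj_id = p.proj_id
             JOIN employee e             ON e.emp_id      = ep.ep_emp_id
      WHERE  e.emp_name = USER;
    GRANT SELECT ON my_projects TO PUBLIC;
    ```

    Requirement 2(iii) could be handled the same way, with an updatable view or a stored procedure that checks ep_mgr_emp_id before allowing the change.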

  • BPC 5.1 Performance and Dashboards

    Hi:
    I have BPC 5.1 Installation and cannot find the Performance feature installed.
    I have seen this in Demo VMware which is located at http://<<server>>/osoft/performance/default.aspx.
    I could not find this and when I checked my server installation I do not have performance folder in my web directory.
    Is there a separate installation for performance and dashboards?
    Thanks,
    Subramania

    I asked this question of product support a few months ago, and received the following response.
    Response (Alexandru Berindei) 12/06/2007 08:19 AM 
    Tim,
    I am sorry to say but it is not part of the product, it is technically a prototype/custom solution. It will not be supported in 5 nor become part of the commercial product.
    Regards,
    Laura Di Tomasso
    Solution Management, Corporate Performance Management

  • Abap proxies ( Client and Server proxies)

    Hi Team
    Good day to you. I have now started doing some example scenarios on ABAP proxies (i.e. client and server proxies). After going through the blogs which are available, I am writing this question to you for clarification.
    As per my understanding, the following predefined settings are required in my landscape to generate ABAP proxies (i.e. client and server proxies).
    My landscape includes the below systems.
    System A : SAP XI 3.0 system and
    System B : SAP R/3 on WAS 620
    SAP R/3 predefined Steps
    1. Create an HTTP connection in the business system.
    2. Configure the business system as a local Integration Engine.
    3. Connect the business system to the System Landscape Directory.
    4. Maintaining the SAP J2EE Connection Parameters for LCRSAPRFC and SAPSLDAPI in SAP J2EE engine
    (Here in the step 4, i found the below needs to be done)
    1. Goto J2EE Engine
    2. Choose Cluster --> Server --> Services. JCo RFC provider
    3. Under RFC destination specify the following:
         Program ID: LCRSAPRFC
         Gateway Host: <Integration Server host>
         Gateway Service: <Integration Server gateway service>
          Number of processes: 3
    4. Under Repository specify the following:
    5. Choose Set.
    Application Server: <Integration Server host>
    (I am not able to perform the steps under point 4, so please guide me on how to go to the J2EE Engine and configure it accordingly.)
    5.  Maintain SLD access details in Transaction SLDAPICUST.
    As per my understanding, I need to do the above predefined configuration steps in the SAP R/3 system (i.e. the business system) for ABAP client or server proxies.
    In the meantime, I would like to know whether I need to do any predefined configuration steps in the XI 3.0 system as well. Please check and advise accordingly.
    Once I get clarification on the predefined configuration steps, I will proceed with the example scenarios on client and server proxies which are already on SDN.
    Thanks in advance.
    Regards
    Raj

    Hello Pavan
    Thanks for your response. You said that for connection type 'H' we need to provide values for GATEWAY HOST and GATEWAY SERVICE, but here I need to create an RFC destination of type 'T'. So please tell me whether I need to give values for GATEWAY HOST and GATEWAY SERVICE for connection type 'T' as well. The second thing: please tell me the difference between an application system and a business system according to the landscape I mentioned in my question.
    I am a little confused because, as per the requirements for ABAP proxies in the blog, all these settings should be done in the business system (i.e. SAP R/3), but you are saying the application system. So please clarify.
    The landscape which I am going to use for ABAP proxy generation:
    System A: XI 3.0
    System B: R/3
    Here, which is the business system and which is the application system? Please clarify.
    Thanks in advance.
    Regards
    Raj

  • I want a SAP Financial Accounting and Controlling question,

    hi

    Hi
    Can you send me your personal email ID? I will forward it again.
    Enterprise Structure
    What is a Company Code and what are the basic organizational
    assignments to a company code?
    Company Code is a legal entity for which financial statements like Profit
    and Loss and Balance Sheets are generated. Plants are assigned to the
    company code, Purchasing organization is assigned to the company code,
    and Sales organization is assigned to the company code.
    What is the relation between a Controlling Area and a Company
    code?
    A Controlling area can have the following 2 type of relationship with a
    Company code
    a. Single Company code relation
    b. Cross Company code relation
    This means that one single controlling area can be assigned to several
    different company codes. Controlling can have a one-to-one
    relationship or a one-to-many relationship with different company
    codes.
    Controlling Area is the umbrella under which all controlling activities of
    Cost Center Accounting, Product Costing, Profit Center and Profitability
    Analysis are stored.
    In a similar way Company Codes is the umbrella for Finance activities.
    How many Chart of Accounts can a Company code have?
    A single Company code can have only one Chart of Account assigned to
    it. The Chart of Accounts is nothing but the list of General Ledger
    Accounts.
    What are the options in SAP when it comes to Fiscal years?
    Fiscal year is nothing but the way financial data is stored in the system.
    SAP provides you with the combination of 12 normal periods and also
    four special periods. These periods are stored in what is called the fiscal
    year variant.
    There are two types of Fiscal Year Variant
    · Calendar Year – e.g. Jan-Dec
    · Year-dependent Fiscal Year
    What is a year dependent fiscal year variant ?
    In a year-dependent fiscal year variant, the number of days in a month
    does not follow the calendar month. Let us take an example: for the year
    2005 the period January ends on the 29th, February ends on the 27th, and March ends on the
    29th. For the year 2006, January ends on the 30th, February ends on the 26th, and March
    ends on the 30th. This is applicable in many countries, especially the USA. Every
    year this fiscal year variant needs to be configured in such a case.
    How does posting happen in MM (Materials Management) during
    special periods?
    There is no posting which happens from MM in special periods. Special
    periods are only applicable for the FI module. They are required for
    making any additional posting such as closing entries, provisions. which
    happen during quarter end or year end.
    How many currencies can be configured for a company code?
    A company code can have 3 currencies in total: the local currency
    (i.e. company code currency) and 2 parallel currencies. This gives the
    company the flexibility to report in the different currencies.
    Do you require to configure additional ledger for parallel currencies?
    Where only 2 currencies are configured (Company code currency and a
    parallel currency) there is no need for an additional ledger. In case the
    third parallel currency is also configured and if it is different than the
    second currency type, you would then need to configure additional
    ledger.
    If there are two company codes with different chart of accounts how
    can you consolidate their activities?
    In this case you either need to write an ABAP program or you need to
    implement the Special Consolidation Module of SAP. If both the company
    codes use the same chart of accounts then standard SAP reports give
    you the consolidated figure.
    FI-GL
    Give some examples of GL accounts that should be posted
    automatically through the system and how is this defined in the
    system.
    Stock and Consumption accounts are instances of GL accounts that
    should be posted automatically. In the GL account master record, a
    check box exists wherein the automatic posting option is selected, called
    "Post Automatically Only".
    What is an Account group and where is it used?
    An Account group controls the data that needs to be entered at the time
    of creation of a master record. Account groups exist for the definition of a
    GL account, Vendor and Customer master. It basically controls the fields
    which pop up during master data creation in SAP.
    What is a field status group?
    Field status groups control the fields which come up when the user does
    the transactions. There are three options for field selection. They are:
    Display only
    Suppressed
    Mandatory
    So basically you can have any field either for display only or you can
    totally suppress it or make it mandatory.
    The field status group is stored in the FI GL Master Record.
    What is the purpose of a “Document type” in SAP?
    A Document type is specified at the Header level during transaction entry
    and serves the following purposes:
    · It defines the Number range for documents
    · It controls the type of accounts that can be posted to eg
    Assets, Vendor, Customer, Normal GL account
    · Document type to be used for reversal of entries
    · Whether it can be used only for Batch input sessions
    Document Type is created for differentiating business transactions. Eg
    Vendor Invoice, Credit Memo, Accrual Entries,Customer Invoice. It is a
    two digit character.
    What is a Financial Statement Version?
    A FSV (Financial Statement Version) is a reporting tool and can be used
    to depict the manner in which the financial accounts like Profit and Loss
    Account and Balance Sheet needs to be extracted from SAP. It is freely
    definable and multiple FSV's can be defined for generating the output for
    various external agencies like Banks and other Statutory authorities.
    How are input and output taxes taken care of in SAP?
    A tax procedure is defined for each country and tax codes are defined
    within this. There is flexibility to either expense out the Tax amounts or
    Capitalize the same to Stocks.
    What are Validations and Substitutions?
    Validations/Substitutions in SAP are defined for each functional area
    e.g. FI-GL, Assets, Controlling etc at the following levels
    1. Document level
    2. Line item level
    These need to be specifically activated and setting them up are complex
    and done only when it is really needed. Often help of the technical team
    is taken to do that.
    Is it possible to maintain plant wise different GL codes?
    Yes. To be able to do so the valuation group code should be activated.
    The valuation grouping code is maintained per plant and is configured in
    the MM module. Account codes should be maintained per valuation
    grouping code after doing this configuration.
    Is Business area at company code Level?
    No. Business area is at client level. What this means is that other
    company codes can also post to the same business area.
    What are the different scenarios under which a Business Area or a
    Profit Center may be defined?
    This question is usually very disputable. But both Business Areas and
    Profit centers are created for internal reporting. Each has its own pros
    and cons but many companies nowadays go for Profit center as there is a
    feeling that business area enhancements would not be supported by SAP
    in future versions.
    There are typical month end procedures which need to be executed for
    both of them and many times reconciliation might become a big issue. A
    typical challenge in both of them is in cases where you do not know the
    Business Area or Profit Center of the transaction at the time of posting.
    What are the problems faced when a Business area is configured?
    The problem of splitting of account balance is more pertinent in case of
    tax accounts.
    Is it possible to default certain values for particular fields? For e.g.
    company code.
    Yes it is possible to default values for certain fields where a parameter id
    is present.
    Step 1 Go to the input field to which you want to make defaults.
    Step 2 Press F1, then click the Technical Info push button. This opens
    a window that displays the corresponding parameter ID (if one has been
    allocated to the field) in the field data section.
    Step 3 Enter this parameter ID using the following path on the SAP Easy
    Access screen: System → User Profile → Own Data.
    Step 4 Click on the Parameters tab. Enter the parameter ID code and enter the
    value you want as the default. Save the user settings.
    Which is the default exchange rate type which is picked up for all
    SAP transactions?
    The default exchange rate type picked up for all SAP transactions is M
    (average rate)
    Is it possible to configure the system to pick up a different exchange
    rate type for a particular transaction?
    Yes it is possible. In the document type definition of GL, you need to
    attach a different exchange rate type.
    What are the master data pre-requisites for document clearing?
    The GL account must be managed on an open item basis. The
    checkbox for this in the General Ledger master record is called Open Item
    Management. It helps you to manage your accounts in terms of cleared
    and uncleared items. A typical example would be the GR/IR account in SAP
    (Goods Received/Invoice Received account).
    Explain the importance of the GR/IR clearing account.
    GR/IR is an interim account. In a client's legacy system, if the goods
    are received and the invoice has not been received, a provision is made for the
    same.
    In SAP, at the goods receipt stage the system passes an accounting entry
    debiting the Inventory and crediting the GR/IR account. Subsequently,
    when an invoice is received, this GR/IR account is debited and the Vendor
    account is credited. That way, until the invoice is received,
    the GR/IR amounts are shown as uncleared items.
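    As a worked illustration of the two postings just described (the amount of 100 is made up for the example):

    ```
    Goods receipt (invoice not yet received), value 100:
        Debit   Inventory        100
        Credit  GR/IR clearing   100

    Invoice received later for the same goods:
        Debit   GR/IR clearing   100
        Credit  Vendor           100
    ```

    Until the second posting arrives, the GR/IR credit of 100 remains an uncleared item; after both postings, the GR/IR line items net to zero and can be cleared.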
    How many numbers of line items in one single entry you can have?
    The number of line items in one document you can accommodate is 999
    lines.
    A Finance Document usually has an assignment field. This field
    automatically gets populated during data entry. Where does it get
    its value?
    This value comes from the Sort key entered in the Gl master record.
    How do you maintain the number range in Production environment?
    Do you directly create it in the Production box or do you do it by
    means of transport?
    Number range is to be created in the production client. You can
    transport it also by way of request but creating in the production client is
    more advisable.
    In customizing, what does "company code productive" mean? What does
    it denote?
    Once the company code is live (real-time transactions have started), this
    check box helps prevent data from being deleted accidentally by many programs. This
    check box is activated just before go-live.
    What is done by GR/IR regrouping program?
    The balance in a GR/IR account is basically because of 2 main types of
    transactions:-
    Goods delivered but invoice not received – Here the Goods receipt is
    made but no invoice has yet been received from the vendor. In such a
    scenario GR/IR account will have a credit balance.
    Invoiced received but goods not delivered – Here the Invoice is
    received from the vendor and accounted for, but goods have not been
    received. In such a scenario GR/IR account will have a debit balance.
    The GR/IR account would contain the net value of the above two types of
    transactions. The GR/IR regrouping program analyses the above
    transactions and regroups them to the correct adjustment account. The
    balance on account of first transactions will be regrouped to another
    liability account and the balance on account of second transactions will
    be regrouped to an asset account.
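The regrouping logic above can be sketched as a small Python function; the sign convention (credits negative) and the input structure are assumptions made purely for illustration:

```python
def regroup_grir(item_balances):
    """Classify open GR/IR balances per PO item the way the regrouping
    program described above does (credits negative, debits positive)."""
    delivered_not_invoiced = 0.0   # credit balances -> liability account
    invoiced_not_delivered = 0.0   # debit balances  -> asset account
    for balance in item_balances:
        if balance < 0:            # goods received, invoice outstanding
            delivered_not_invoiced += -balance
        elif balance > 0:          # invoice received, goods outstanding
            invoiced_not_delivered += balance
    return delivered_not_invoiced, invoiced_not_delivered

# Two open PO items: one awaiting an invoice (credit 500),
# one awaiting goods (debit 200)
assert regroup_grir([-500.0, 200.0]) == (500.0, 200.0)
```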
    What are the functionalities available in the financial statement
    version?
    In the financial statement version the most important functionality
    available is the debit credit shift. This is more important in case of
    Bank overdraft accounts which can have a debit balance or a credit
    balance. Thus in case of a debit balance you would require the overdraft
    account to be shown on the Asset side. In case of credit balance you
    would require the account to be shown on the Liability side.
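The debit credit shift is just a sign-dependent placement rule; a minimal sketch (side names and the sign convention are assumptions for illustration):

```python
def fsv_position(balance):
    """Debit/credit shift for an overdraft-type account: show it on the
    side matching the sign of its balance (debits positive)."""
    return ("Assets", balance) if balance >= 0 else ("Liabilities", -balance)

assert fsv_position(800.0) == ("Assets", 800.0)        # debit balance
assert fsv_position(-800.0) == ("Liabilities", 800.0)  # credit balance
```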
    Is it possible to print the financial statement version on a SAPscript
    form?
    Yes. It is possible to print the financial statement version on a SAPscript
    form.
    How do you configure the SAPscript form financial statement
    version?
    It is possible to generate a form from the financial statement version and
    print the financial statements on a SAPscript form. In the customizing for
financial statement version select the FSV you created and choose Goto
→ Generate form → One column or Two column form.
    You can also copy form from the standard system.
    Is it possible to generate a financial statement form automatically?
    Yes. It is possible to generate a form automatically.
    Is it possible to keep the FI posting period open only for certain GL
    codes?
    Yes. It is possible to keep open the FI posting period only for certain GL
    codes.
    How do you keep the FI posting period open only for certain GL
    codes?
    In transaction code OB52 click on new entries and maintain an interval
    or a single GL code for the account type S with the posting period
    variant. If the GL codes are not in sequence then you need to maintain
    further entries for the posting period variant and account type S.
    Can posting period variant be assigned to more than 1 company
    code?
    Yes. Posting period variant can be assigned to more than one company
    code.
    Accounts Receivable and Accounts
    Payable
    At what level are the customer and vendor codes stored in SAP?
    The customer and vendor code are at the client level. That means any
    company code can use the customer and vendor code by extending the
    company code view.
    How are Vendor Invoice payments made?
    Vendor payments can be made in the following manner:
    Manual payments without the use of any output medium like cheques
    etc.
    Automatic Payment program through cheques, Wire transfers, DME etc.
    How do you configure the automatic payment program?
    The following are the steps for configuring the automatic payment
    program:-
    Step 1 Set up the following:
    Co. code for Payment transaction
    Define sending and paying company code.
    Tolerance days for payable
    Minimum % for cash discount
    Maximum cash discount
    Special GL transactions to be paid
    Step 2 Set up the following:
    Paying company code for payment transaction
    Minimum amount for outgoing payment
    No exchange rate diff
    Separate payment for each ref
    Bill/exch payment
    Form for payment advice
    Step 3 Set up the following:
    Payment method per country
    Whether Outgoing payment
    Check or bank transfer or B/E
    Whether allowed for personnel payment
    Required master data
    Doc types
    Payment medium programs
    Currencies allowed
    Step 4 Set up the following:
    Payment method per company code for payment transactions
    Set up per payment method and co. code
    The minimum and maximum amount.
    Whether payment per due day
    Bank optimization by bank group or by postal code or no
    optimization
    Whether Foreign currency allowed
    Customer/Vendor bank abroad allowed
    Attach the payment form check
    Whether payment advice required
    Step 5 Set up the following:
    Bank Determination for Payment Transactions
    Rank the house banks as per the following
    Payment method, currency and give them ranking nos
    Set up house bank sub account (GL code)
    Available amounts for each bank
    House bank, account id, currency, available amount
    Value date specification
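The bank determination in Step 5 amounts to a ranked lookup: walk the house banks in ranking order and take the first one whose available amount covers the payment. A minimal sketch, where the bank IDs and the data structure are assumptions:

```python
def select_house_bank(banks, amount):
    """Pick the highest-ranked house bank with enough available amount;
    a simplified model of the bank determination described above."""
    for bank in sorted(banks, key=lambda b: b["rank"]):
        if bank["available"] >= amount:
            return bank["id"]
    return None  # no bank can cover the payment

banks = [
    {"id": "HB1", "rank": 1, "available": 10_000.0},
    {"id": "HB2", "rank": 2, "available": 50_000.0},
]
assert select_house_bank(banks, 25_000.0) == "HB2"  # HB1 lacks funds
assert select_house_bank(banks, 5_000.0) == "HB1"
```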
    Where do you attach the check payment form?
    It is attached to the payment method per company code.
    Where are Payment terms for customer master maintained?
Payment terms for the customer master can be maintained at two places,
i.e. in the accounting view and the sales view of the customer master
record.
    Which is the payment term which actually gets defaulted when the
    transaction is posted for the customer (accounting view or the sales
    view)?
    The payment term in the accounting view of the customer master comes
    into picture if the transaction originates from the FI module. If an FI
    invoice is posted (FB70) to the customer, then the payment terms is
    defaulted from the accounting view of the customer master.
    The payment term in the sales view of the customer master comes into
    picture if the transaction originates from the SD module. A sales order is
    created in the SD module. The payment terms are defaulted in the sales
    order from the sales view of the customer master.
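The defaulting rule above (and the analogous vendor rule that follows) is a simple dispatch on the originating module; a sketch where the view names and the payment-term codes are purely hypothetical:

```python
def default_payment_terms(master, origin):
    """Return the payment term defaulted for a transaction, based on
    which module it originates from, per the rule described above."""
    view = "accounting" if origin == "FI" else "sales"
    return master[view]

# Hypothetical customer master with different terms in each view
customer = {"accounting": "NT30", "sales": "NT45"}
assert default_payment_terms(customer, "FI") == "NT30"  # e.g. FB70 invoice
assert default_payment_terms(customer, "SD") == "NT45"  # e.g. sales order
```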
    Where are Payment terms for vendor master maintained?
    Payment terms for Vendor master can be maintained at two places i.e. in
    the accounting view and the purchasing view.
    Which is the payment term which actually gets defaulted in
    transaction (accounting view or purchasing view)?
    The payment term in the accounting view of the vendor master comes
    into picture if the transaction originates from the FI module. If an FI
    invoice is posted (FB60) to the Vendor, then the payment terms is
    defaulted from the accounting view of the vendor master.
    The payment term in the purchasing view of the vendor master comes
    into picture if the transaction originates from the MM module. A
    purchase order is created in the MM module. The payment terms are
    defaulted in the purchase order from the purchasing view of the vendor
    master.
    Explain the entire process of Invoice verification from GR to Invoice
    verification in SAP with accounting entries?
The steps are as follows:
A goods receipt in SAP for a purchased material is prepared with
reference to a purchase order.
    When the goods receipt is posted in SAP the accounting entry passed is:-
    Inventory account Debit
    GR/IR account credit
    A GR/IR (which is Goods receipt/Invoice receipt) is a provision account
    which provides for the liability for the purchase. The rates for the
    valuation of the material are picked up from the purchase order.
    When the invoice is booked in the system through Logistics invoice
    verification the entry passed is as follows:-
    GR/IR account debit
    Vendor credit
    How are Tolerances for Invoice verification defined?
    The following are instances of tolerances that can be defined for Logistic
    Invoice Verification.
a. Small differences
b. Moving average price variances
c. Quantity variances
d. Price variances
    Based on the client requirement, the transaction can be “Blocked” or
    Posted with a “Warning” in the event of the Tolerances being exceeded.
    Tolerances are nothing but the differences between invoice amount and
    payment amount or differences between goods receipt amount and
    invoice amount which is acceptable to the client.
    Can we change the reconciliation account in the vendor master?
    Yes. Reconciliation account can be changed in the vendor master
    provided that the authority to change has been configured. Normally we
    should not change the reconciliation account.
    What is the impact on the old balance when the reconciliation
    account in the vendor master is changed?
Any change you make to the reconciliation account is prospective and
not retrospective. The old items and balances do not reflect the new
account; only the new transactions reflect it.
    There is an advance given by the customer which lies in a special GL
    account indicator A. Will this advance amount be considered for
    credit check?
    It depends on the configuration setting in the special GL indicator A. If
    the “Relevant to credit limit” indicator is switched on in the Special GL
    indicator A the advances will be relevant for credit check, otherwise it will
    not be relevant.
    In payment term configuration what are the options available for
    setting a default baseline date?
    There are 4 options available:-
    1) No default
    2) Posting date
    3) Document date
    4) Entry date
    What is generally configured in the payment term as a default for
    baseline date?
    Generally document date is configured in the payment term as a default
    for base line date.
    How do you configure a special GL indicator for Customer?
    You can use an existing special GL indicator ID or create a new one.
    After creating a special GL indicator id, update the chart of accounts and
    the Reconciliation account. Also as a last step you need to update the
    special GL code.
    The special GL code should also be marked as a Reconciliation account.
    Switch on the relevant for credit limit and commitment warning
    indicators in the master record.
    Bank Accounting:
    How is Bank Reconciliation handled in SAP?
    The bank reco typically follows the below procedure:
    First, the payment made to a Vendor is posted to an interim bank
    clearing account. Subsequently, while performing reconciliation, an entry
    is posted to the Main Bank account. You can do bank reconciliation
    either manually or electronically.
    How do you configure check deposit?
    The following are the steps for configuring check deposit:-
    Step1: Create account symbols for the main bank and incoming check
    account.
    Step2: Assign accounts to account symbols
    Step3: Create keys for posting rules
    Step4: Define posting rules
    Step5: Create business transactions and assign posting rule
    Step6: Define variant for check deposit
    What is the clearing basis for check deposit?
In the variant for check deposit we need to set up the following fields:
a) Document number (which is the invoice number)
b) Amount
c) Short description of the customer.
    The document number and the invoice amount acts as the clearing
    basis.
    How do you configure manual bank statement?
    The following are the steps for configuring manual bank statement:-
    Step1: Create account symbols for the main bank and the sub accounts
    Step2: Assign accounts to account symbols
    Step3: Create keys for posting rules
    Step4: Define posting rules
    Step5: Create business transaction and assign posting rule
    Step6: Define variant for Manual Bank statement
    How do you configure Electronic bank statement?
The steps for the electronic bank statement are the same except for a
couple of additional steps, which you will see below:
    Step1: Create account symbols for the main bank and the sub accounts
    Step2: Assign accounts to account symbols
    Step3: Create keys for posting rules
    Step4: Define posting rules
    Step5: Create transaction type
    Step6: Assign external transaction type to posting rules
    Step7: Assign Bank accounts to Transaction types
    Fixed Assets
    What are the organizational assignments in asset accounting?
    Chart of depreciation is the highest node in Asset Accounting and this is
    assigned to the company code.
    Under the Chart of depreciation all the depreciation calculations are
    stored.
    How do you go about configuring Asset accounting?
    The configuration steps in brief are as follows:-
    a) Copy a reference chart of depreciation areas
    b) Assign Input Tax indicator for non taxable acquisitions
    c) Assign chart of depreciation area to company code
    d) Specify account determination
    e) Define number range interval
    f) Define asset classes
    g) Define depreciation areas posting to general ledger
    h) Define depreciation key
    Explain the importance of asset classes. Give examples?
    The asset class is the main criterion for classifying assets. Every asset
    must be assigned to only one asset class. Examples of asset class are
Plant & Machinery, Furniture & Fixtures, Computers etc. The asset class
also contains the GL accounts which are debited when any asset is
procured. It also contains the GL accounts for depreciation calculation,
scrapping etc.
    Whenever you create an asset master you need to mention the asset
    class for which you are creating the required asset. In this manner
whenever any asset transaction happens the GL accounts attached to the
asset class are automatically picked up and the entry is passed.
    You can also specify certain control parameters and default values for
    depreciation calculation and other master data in each asset class.
    How are depreciation keys defined?
    The specifications and parameters that the system requires to calculate
    depreciation amounts are entered in Calculation methods. Calculation
    methods replace the internal calculation key of the depreciation key.
    Depreciation keys are defaulted in Asset Master from the asset class.
    Refer to the configuration for more details of how depreciation is
    calculated.
    A company has its books prepared based on Jan –Dec calendar year
    for reporting to its parent company. It is also required to report
    accounts to tax authorities based on April- March. Can assets be
    managed in another depreciation area based on a different fiscal
    year variant?
No. The Asset Accounting module cannot manage fiscal year variants
which have a different start date (January for book depreciation and April
for tax depreciation) and a different end date (December for book
depreciation and March for tax depreciation). In this case you need to
implement the special purpose ledger.
    What are the special steps and care to be taken in Fixed asset data
    migration into SAP system especially when Profit center accounting
    is active?
    Data migration is slightly different from a normal transaction which
    happens in Asset accounting module.
Normally, in asset accounting the day-to-day transactions are posted
with values through FI bookings and at the same time the asset
reconciliation is updated online in real time. In data migration, however,
the asset master is updated with values through a transaction code
called AS91. The values updated on the master are the opening gross
value and the accumulated depreciation. The reconciliation GL account
is not automatically updated at this point of time.
The reconciliation accounts (GL codes) are updated manually through
another transaction code called OASV.
    If profit center is active, then after uploading assets through AS91 you
    should transfer the asset balances to profit center accounting through a
    program.
    Thereafter you remove the Asset GL code (reconciliation accounts) from
    the 3KEH table for PCA and update the Asset reconciliation account (GL
    code) through OASV.
    After this step you again update the Asset reconciliation account in the
    3KEH table.
    The reason you remove the Asset reconciliation code from 3KEH table is
    that double posting will happen to PCA when you update the Asset
    reconciliation manually.
    Is it possible to calculate multiple shift depreciation? Is any special
    configuration required?
    Yes it is possible to calculate multiple shift depreciation in SAP for all
    types of depreciation except unit of production. No special configuration
    is required.
    How do you maintain multiple shift depreciation in asset master?
    The following steps are needed to maintain multiple shift depreciation:
    1. The variable depreciation portion as a percentage rate is to be
    maintained in the detail screen of the depreciation area.
    2. The multiple shift factor is to be maintained in the time dependent
    data in the asset master record. This shift factor is multiplied by
    the variable portion of ordinary depreciation.
    Once you have done the above the SAP system calculates the total
    depreciation amount as follows:-
    Depreciation amount = Fixed depreciation + (variable depreciation * shift
    factor)
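The formula above is straightforward to sketch; the figures below are invented purely to illustrate the arithmetic:

```python
def shift_depreciation(fixed, variable, shift_factor):
    """Total depreciation per the formula above:
    fixed portion + (variable portion * shift factor)."""
    return fixed + variable * shift_factor

# e.g. 1,000 fixed portion, 500 variable portion, plant running 2 shifts
assert shift_depreciation(1000.0, 500.0, 2) == 2000.0
# single shift: the variable portion is counted once
assert shift_depreciation(1000.0, 500.0, 1) == 1500.0
```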
    Let’s say you have changed the depreciation rates in one of the
    depreciation keys due to changes in legal requirements. Does
    system automatically calculate the planned depreciation as per the
    new rate?
    No. System does not automatically calculate the planned depreciation
    after the change is made. You need to run a program for recalculation of
    planned depreciation.
    What are evaluation groups?
    The evaluation groups are an option for classifying assets for reports or
    user defined match code (search code). You can configure 5 different
    evaluation groups. You can update these evaluation groups on to the
    asset master record.
    What are group assets?
    The tax requirements in some countries require calculation of
    depreciation at a higher group or level of assets. For this purpose you
    can group assets together into so-called group assets.
    What are the steps to be taken into account during a depreciation
    run to ensure that the integration with the general ledger works
    smoothly?
    For each depreciation area and company code, specify the following:
1. The frequency of posting depreciation (monthly, quarterly etc.)
2. CO account assignment (cost center)
3. For each company code you must define a document type for
automatic depreciation posting; this document type requires its
own external number range.
4. You also need to specify the accounts for posting (account
determination).
    Finally to ensure consistency between Asset Accounting and Financial
    Accounting, you must process the batch input session created by the
    posting report. If you fail to process the batch input session, an error
    message will appear at the next posting run.
    The depreciation calculation is a month end process which is run in
    batches and then once the batch input is run the system posts the
    accounting entries into Finance.
    How do you change fiscal year in Asset Accounting?
Run the fiscal year change program, which opens new annual value
fields for each asset, i.e. for the next year.
- The earliest you can start this program is in the last posting period of
the current year.
- You have to run the fiscal year change program for your whole
company code.
- You can only process a fiscal year change in a subsequent year if the
previous year has already been closed for business.
    Take care not to confuse the fiscal year change program with year-end
    closing for accounting purposes. This fiscal year change is needed only in
    Asset Accounting for various technical reasons.
    Is it possible to have depreciation calculated to the day?
    Yes it is possible. You need to switch on the indicator “Dep to the day” in
    the depreciation key configuration.
    Is it possible to ensure that no capitalization be posted in the
    subsequent years?
    Yes it is possible. You need to set it in the depreciation key
    configuration.
    How are Capital Work in Progress and Assets accounted for in SAP?
    Capital WIP is referred to as Assets under Construction in SAP and are
    represented by a specific Asset class. Usually depreciation is not charged
    on Capital WIP.
    All costs incurred on building a capital asset can be booked to an
    Internal Order and through the settlement procedure can be posted onto
    an Asset Under Construction. Subsequently on the actual readiness of
    the asset for commercial production, the Asset Under Construction gets
    capitalized to an actual asset.
    The company has procured 10 cars. You want to create asset
    masters for each of this car. How do you create 10 asset masters at
    the same time?
While creating the asset master there is a field on the initial create screen
called Number of Similar Assets. Update this field with 10. When you
    finally save this asset master you will get a pop up asking whether you
    want to maintain different texts for these assets. You can update
    different details for all the 10 cars.
    FI-MM-SD Integration
How do you go about setting up the FI-MM account determination?
    FI MM settings are maintained in transaction code OBYC. Within these
    there are various transaction keys to be maintained like BSX, WRX,
GBB, PRD etc. In each of these transaction keys you specify the GL
accounts which get automatically posted at the time of entry.
A few examples:
BSX - stands for Inventory Posting (Debit)
GBB - stands for Goods Issue/Scrapping/Delivery of goods etc.
PRD - stands for Price Differences.
    At what level is the FI-MM, FI-SD account determination settings?
    They are at the chart of accounts level.
    What are the additional settings required while maintaining or
    creating the GL codes for Inventory accounts?
    In the Inventory GL accounts (Balance sheet) you should switch on the
    ‘Post automatically only’ tick. It is also advisable to maintain the
aforesaid setting for all FI-MM and FI-SD accounts. This helps
preserve the sanctity of those accounts and prevents any differences
arising between FI and MM, and between FI and SD.
    What is Valuation and Account assignment in SAP?
    This is actually the link between Materials Management and Finance.
    The valuation in SAP can be at the plant level or the company code level.
    If you define valuation at the plant level then you can have different
prices for the same material in the various plants. If you keep it at the
company code level you can have only one price across all plants.
Valuation also involves the price control. Each material is assigned to a
material type in Materials Management and every material is valuated
either at Moving Average Price or at Standard Price in SAP. These are
the two types of price control available.
    What is Valuation Class?
    The Valuation Class in the Accounting 1 View in Material Master is the
    main link between Material Master and Finance. This Valuation Class
    along with the combination of the transaction keys (BSX,WRX,GBB,PRD )
    defined above determine the GL account during posting.
We can group together different materials with similar properties by
valuation class, e.g. Raw Material, Finished Goods, Semi-Finished Goods.
    We can define the following assignments in customizing :
    All materials with same material type are assigned to just one valuation
    class.
    Different materials with the same material type can be assigned to
    different valuation classes.
    Materials with different material types are assigned to a single valuation
    class.
    Can we change the valuation class in the material master once it is
    assigned?
    Once a material is assigned to a valuation class in the material master
    record, we can change it only if the stocks for that material are nil. If the
    stock exists for that material, then we cannot change the valuation class.
    In such a case, if the stock exists, we have to transfer the stocks or issue
    the stocks and make the stock nil for the specific valuation class. Then
    only we will be able to change the valuation class.
    Does the moving average price change in the material master during
    issue of the stock assuming that the price control for the material is
    Moving Average?
    The moving average price in the case of goods issue remains unchanged.
Goods issues are always valuated at the current moving average price. It
    is only in goods receipt that the moving average price might change. A
    goods issue only reduces the total quantity and the total value in relation
    to the price and the moving price remains unchanged. Also read the
    next question to learn more about this topic.
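The behavior above can be checked with a tiny model of moving average valuation; the class and field names are hypothetical, chosen only to illustrate the arithmetic:

```python
class Material:
    """Simplified moving-average-price (MAP) behavior described above."""
    def __init__(self, qty, value):
        self.qty, self.value = qty, value

    @property
    def map_price(self):
        return self.value / self.qty if self.qty else 0.0

    def goods_receipt(self, qty, price):
        """A receipt at a different price can shift the MAP."""
        self.qty += qty
        self.value += qty * price

    def goods_issue(self, qty):
        """An issue is valuated at the current MAP, so the MAP is unchanged."""
        self.value -= qty * self.map_price
        self.qty -= qty

m = Material(qty=10, value=100.0)   # MAP = 10.0
m.goods_issue(5)
assert m.map_price == 10.0          # issue leaves the MAP unchanged
m.goods_receipt(5, 20.0)            # receipt at a higher price
assert m.map_price == 15.0          # (50 + 100) / 10 -> MAP moved
```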
    If the answer to the above question is ‘Yes’, then list the scenario in
    which the moving average price of the material in the material
    master changes when the goods are issued.
    The moving average price in the material master changes in the scenario
    of Split Valuation which is sometimes used by many organizations. If the
material is subject to split valuation, the material is managed as several
partial stocks and each partial stock is valuated separately.
    In split valuation, the material with valuation header record will have ‘v’
    moving average price. This is where the individual stocks of a material
    are managed cumulatively. Here two valuation types are created, one
    valuation type can have ‘v’ (MAP) and the other valuation type can have
    ‘s’(standard price).
    In this case, whenever the goods are issued from the respective valuation
    types, always the MAP for the valuation header changes.
    What is the accounting entry in the Financial books of accounts
    when the goods are received in unrestricted use stock? Also
    mention the settings to be done in the ‘Automatic postings’ in SAP
    for the specific G/L accounts.
    On receipt of the goods in unrestricted-use stock, the Inventory account
    is debited and the GR/IR account gets credited. In customization, in the
    automatic postings, the Inventory G/L account is assigned to the
    Transaction event key BSX and the GR/IR account is assigned to the
    Transaction event key WRX.
If a material has no material code in SAP, can you default the G/L
account in the Purchase order or does it have to be entered manually?
If a material has no material code in SAP, we can still default the G/L
    account with the help of material groups. We can assign the valuation
    class to a material group and then in FI-automatic posting, we can
    assign the relevant G/L account in the Transaction event key. The
    assignment of a valuation class to a material group enables the system to
    determine different G/L accounts for the individual material groups.
    What is the procedure in SAP for Initial stock uploading? Mention
    the accounting entries also.
Initial stock uploading in SAP from the legacy system is done with
inventory movement type 561 (an MM transaction).
    Material valuated at standard price: For a material valuated at
    standard price, the initial entry of inventory data is valuated on the basis
    of standard price in the material master. If you enter an alternative value
    at the time of the movement type 561, then the system posts the
    difference to the price difference account.
    Material valuated at moving average price: The initial entry of
    inventory data is valuated as follows : If you enter a value when
    uploading the initial data, the quantity entered is valuated at this price.
    If you do not enter a value when entering initial data, then the quantity
    entered is valuated at the MAP present in the material master.
    The accounting entries are: Inventory account is debited and Inventory
    Historical upload account is credited.
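The valuation rules for movement type 561 can be sketched as one function. This is a simplified model under the assumptions that price control is passed as "S" (standard) or "V" (moving average) and that the function returns the stock posting and any price-difference posting:

```python
def valuate_initial_stock(price_control, master_price, qty, entered_value=None):
    """Return (stock posting, price-difference posting) for an initial
    stock entry, per the rules described above (simplified sketch)."""
    if price_control == "S":
        # Standard price: stock valued at the master-record price;
        # any alternative value entered goes to the price difference account.
        stock = qty * master_price
        diff = (entered_value - stock) if entered_value is not None else 0.0
        return stock, diff
    # Moving average: an entered value wins, otherwise the master-record MAP.
    value = entered_value if entered_value is not None else qty * master_price
    return value, 0.0

# Standard price 10, 100 pieces, but 1,100 entered -> 100 to price differences
assert valuate_initial_stock("S", 10.0, 100, entered_value=1100.0) == (1000.0, 100.0)
# Moving average, no value entered -> valuated at the master-record MAP
assert valuate_initial_stock("V", 10.0, 100) == (1000.0, 0.0)
```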
    How do you configure FI-SD account determination?
    The FI-SD account determination happens through an access sequence.
    The system goes about finding accounts from more specific criteria to
    less specific criteria.
    This is the sequence it would follow:
1) It will first access and look for the combination of Customer
account assignment grp/Material account assignment grp/
Account key.
    2) If it does not find the accounts for the first combination it will look
    for Customer account assignment grp and account key
    combination.
3) Furthermore, if it does not find accounts for the first 2 criteria,
then it will look for Material account assignment grp/Account key.
4) If it does not find accounts for any of the earlier criteria, it will
finally look for the Account key alone and assign the GL code.
Thus the posting of sales invoices into FI is effected on the basis of a
combination of sales organization, account key and the Customer and
Material account assignment groups. The following options are
available:
a. Customer AAG/Material AAG/Account key
b. Material AAG/Account key
c. Customer AAG/Account key
For each of these options you can define a GL account. The system
uses this GL account to automatically pass the entries.
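The most-specific-first search described above is a classic fallback lookup; a minimal sketch in which the lookup keys and the GL codes are simplified assumptions:

```python
def determine_fi_sd_account(table, cust_aag, mat_aag, acct_key):
    """Walk the access sequence from most to least specific and return
    the first GL account found (simplified model of the search above)."""
    for key in [(cust_aag, mat_aag, acct_key),   # most specific
                (cust_aag, None, acct_key),
                (None, mat_aag, acct_key),
                (None, None, acct_key)]:         # account key alone
        if key in table:
            return table[key]
    return None

# Hypothetical determination table with two entries
table = {
    ("01", "02", "ERL"): "800010",
    (None, None, "ERL"): "800000",
}
assert determine_fi_sd_account(table, "01", "02", "ERL") == "800010"
# No specific entry for this combination -> falls back to the account key
assert determine_fi_sd_account(table, "03", "05", "ERL") == "800000"
```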
    Logistics Invoice Verification
    Can you assign multiple G/L accounts in the Purchase order for the
    same line item?
    Yes, we can assign multiple G/L accounts in the Purchase order for the
    same line item. The costs can be allocated on a percentage or quantity
    basis. If the partial goods receipt and partial invoice receipt has already
    taken place, then the partial invoice amount can be distributed
    proportionally, i.e. evenly among the account assigned items of a
    Purchase order. Alternatively the partial invoice amount can be
    distributed on a progressive fill-up basis, i.e. the invoiced amount is
    allocated to the individual account assignment items one after the other.
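The "progressive fill-up" distribution described above can be sketched as follows; the item amounts are invented for illustration:

```python
def distribute_progressive(invoice_amount, item_amounts):
    """Allocate a partial invoice amount to account-assignment items one
    after the other (progressive fill-up), per the description above."""
    allocated, remaining = [], invoice_amount
    for cap in item_amounts:
        take = min(cap, remaining)   # fill this item up to its share
        allocated.append(take)
        remaining -= take
    return allocated

# 700 invoiced against three items of 500/300/200: the first item is
# filled completely, the second partially, the third gets nothing
assert distribute_progressive(700.0, [500.0, 300.0, 200.0]) == [500.0, 200.0, 0.0]
```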
    What is Credit memo and subsequent debit in Logistics Invoice
    verification?
    The term credit memo refers to the credit memo from the vendor.
    Therefore posting a credit memo always leads to a debit posting on the
    vendor account. Credit memos are used if the quantity invoiced is higher
    than the quantity received or if part of the quantity was returned.
    Accounting entries are : Vendor account is debited and GR/IR account is
    credited.
    Subsequent debit : If a transaction has already been invoiced and
    additional costs are invoiced later, then subsequent debit is necessary. In
    this case you can debit the material with additional costs, i.e. GR/IR
    account debit and Vendor account credit. When entering the Subsequent
    debit, if there is no sufficient stock coverage, only the portion for the
    available stock gets posted to the stock account and rest is posted to the
    price difference account.
    What do you mean by Invoice parking, Invoice saving and Invoice
    confirmation?
    Invoice parking : Invoice Parking is a functionality which allows you to
create incomplete documents; the system does not check whether the
entries are balanced or not. An accounting document is also not created
when the invoice is in parked mode.
    Thus you can create incomplete documents and then post it later to
    accounting when you feel it is complete. You can even rectify the Parked
    invoice. This feature is used by many companies as on many occasions
    all data relating to the invoice might not be available.
    Invoice saving : This is also called Invoice processing or Invoice posting.
    The accounting document gets created when the invoice is posted in SAP.
    Invoice confirmation : There is no terminology in SAP as Invoice
    confirmation.
What are Planned delivery costs and Unplanned delivery costs?
Planned delivery costs are entered at the time of the purchase order. At
goods receipt, a provision is posted to the freight or customs clearing
account.
For example, FRE is the account key for the freight condition, so the
system can post freight charges to the relevant freight clearing account,
and FR3 is the account key for customs duty, so the system can post
customs duty to the relevant G/L account.
These account keys are assigned to the specific condition types in the
MM pricing schema.
In terms of invoice verification: if the freight vendor and the material
vendor are the same, we can choose the option "Goods/service items
+ planned delivery costs". If the freight vendor is different from the
material vendor, then for crediting only the delivery costs we can choose
the option "Planned delivery costs".
Unplanned delivery costs are costs that are not specified in the
purchase order and are only entered when you enter the invoice.
What is the basis on which the apportionment of unplanned delivery
costs is done?
Unplanned delivery costs are either distributed among the items or
posted to a separate G/L account.
For a material subject to moving average price, the unplanned delivery
costs are posted to the stock account, provided sufficient stock coverage
exists.
For a material subject to standard price, the unplanned delivery costs
are posted to the price difference account.
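The posting logic can be sketched as a simplified Python model. The distribution key (here, in proportion to item value) and the account names are assumptions for illustration:

```python
def apportion_unplanned_costs(cost, items):
    """Distribute unplanned delivery costs across invoice items (here in
    proportion to item value) and pick the posting account from the
    material's price control."""
    total = sum(i["value"] for i in items)
    postings = []
    for i in items:
        share = round(cost * i["value"] / total, 2)
        if i["price_control"] == "V" and i["stock_covered"]:
            account = "Stock"             # moving average price, stock exists
        else:
            account = "Price difference"  # standard price (or no coverage)
        postings.append((i["material"], account, share))
    return postings
```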
There are cases where invoice verification is done before the
goods receipt is made for the purchase order. In these cases, with
what value is the goods receipt posted?
Since invoice verification was done first, the goods receipt is
valued at the invoice value.
FI Month End Closing Activities
What are the Month End Closing Activities in Finance?
1. Recurring Documents
a) Create recurring documents
b) Create a batch input session for posting recurring documents
c) Run the batch input session
2. Post accruals or provision entries at month end
3. Manage the GR/IR account - run the GR/IR automatic clearing
4. Foreign currency open item revaluation - revalue open items in
AR/AP
5. Maintain exchange rates
6. Run balance sheets - run the financial statement version
7. Reclassify payables and receivables if necessary
8. Run the depreciation calculation
9. Fiscal year change of Asset Accounting if it is year end
10. Run the bank reconciliation
11. Open the next accounting period
Controlling Module
Explain the organizational assignment in the Controlling module.
Company codes are assigned to the controlling area. A controlling area
is assigned to the operating concern.
The controlling area is the umbrella under which all controlling activities
of Cost Center Accounting, Product Costing, Profitability Analysis and
Profit Center Accounting are stored.
The operating concern is the highest node in Profitability Analysis.
What is a primary cost element and a secondary cost element?
Every profit and loss G/L account that needs to be controlled has to be
defined as a cost element in SAP. Just as General Ledger accounts exist
in FI, cost elements exist in Controlling.
Each FI General Ledger account that is a profit and loss account is also
created as a cost element in SAP.
Primary cost elements are those which are created from FI General
Ledger accounts and impact the financial accounts, e.g. travelling
expenses and consumption accounts; in fact, any profit and loss G/L
account.
Secondary cost elements are those which are created only in
Controlling and do not affect the financials of the company. They are
used for internal reporting only. Postings to these accounts do not affect
the profit or loss of the company.
The following categories exist for secondary cost elements:
21 Internal Settlement:
Cost elements of this category are used to settle order costs to objects
in Controlling such as cost centers, PA segments, etc.
31 Order/Results Analysis:
Used to calculate WIP on the order/project.
41 Overhead:
Used to allocate indirect costs from cost centers to orders.
42 Assessment:
Used to allocate costs during assessment.
43 Internal Activity Allocation:
Used to allocate costs during internal activity allocation, such as
machine or labour hours.
What are cost objects?
A cost object is a cost or revenue collector wherein all the costs or
revenues for a particular object are collected. Examples are a cost
center, production order, internal order, project or sales order.
So whenever you look at any controlling function, the basic question to
ask yourself is: what is the cost element (expense) I want to control, and
what is the cost object (i.e. the production order, sales order, internal
order) I am using to control this cost element? If this sounds confusing,
read it again; it is very simple.
Controlling is all about knowing the cost element and the cost
object. Every time, pose this question to yourself: what is the cost
element, what is the cost object?
At the end of the period, all costs or revenues in the cost object are
settled to their respective receivers, which could be a G/L account, a
cost center, profitability analysis or an asset.
It is very important that you understand this; otherwise you will
struggle to understand Controlling.
Cost Center Accounting:
How is cost center accounting related to profit centers?
In the master data of the cost center there is a provision to enter the
profit center. This way, all costs which flow to the cost center are also
captured in the profit center.
Cost centers are basically created to capture costs, e.g. an admin cost
center, a canteen cost center, etc.
Profit centers are created to capture costs and revenue for a particular
plant, business unit or product line.
What is a cost element group?
A cost element group is simply a group of cost elements that helps you
track and control costs more effectively. You can create as many cost
element groups as you feel necessary by combining various logical cost
elements.
What is a cost center group?
Along similar lines, a cost center group is a group of cost centers that
helps you track and control the costs of a department more effectively.
You can create as many cost center groups as you feel necessary by
combining various logical cost centers.
In fact, you can use various combinations of cost center groups with
cost element groups to track and control your costs per department or
across departments.
What is the difference between Distribution and Assessment?
Distribution uses the original cost element when allocating costs from
the sender cost center to the receivers. Thus on the receiving cost
center we can see the original cost element from the sender cost center.
Distribution only allocates primary costs.
Assessment uses an assessment cost element (category 42, defined
above) to allocate costs. Thus various costs are summarized under a
single assessment cost element. In the receiver cost center, the original
cost breakup from the sender is not available. Assessment allocates both
primary and secondary costs.
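The contrast can be sketched in a few lines of Python (illustrative only; the cost-element names are made up):

```python
def distribute(sender_lines):
    """Distribution: the receiver sees the original primary cost elements."""
    return list(sender_lines)

def assess(sender_lines, assessment_element="ASSESSMENT"):
    """Assessment: everything is summarized under one secondary
    (category 42) assessment cost element; the original breakup is lost."""
    return [(assessment_element, sum(amount for _, amount in sender_lines))]
```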
What are the other activities in Cost Center Accounting?
If you have a manufacturing set-up, entering activity prices per cost
center/activity type is an important exercise undertaken in Cost Center
Accounting.
What is an Activity Type?
Activity types classify the activities produced in the cost centers.
Examples of activity types are machine hours, labour hours and utilities.
You want to calculate the activity price through the system. What
are the requirements for that?
In the activity type master you need to select price indicator 1 (plan
price, automatically based on activity).
When the activity price is calculated through the system, is it
shown as fixed or variable?
Normally, when the activity price is calculated by the system it is shown
as a fixed activity price, since primary costs are planned as activity-
independent costs.
What is required if the activity price is to be shown as both
fixed and variable?
In this case you need to plan both activity-independent costs, which are
shown as fixed costs, and activity-dependent costs, which are shown as
variable costs.
Is it possible to calculate the planned activity output through the
system?
Yes. It is possible to calculate the planned activity output through the
system by using the Long Term Planning process in the PP module.
Explain the process of calculating the planned activity output
through Long Term Planning.
In the Long Term Planning process, the planned production quantities
are entered for the planning year in a particular scenario. Long Term
Planning is then executed for the scenario. This generates the planned
activity requirements, taking the activity quantities from the routing and
multiplying them by the planned production.
The activity requirements are then transferred to the Controlling module
as scheduled activity quantities. Thereafter you execute a plan activity
reconciliation, which reconciles the scheduled activity and the activity
you have planned manually. The reconciliation program updates the
scheduled activity quantity as the planned activity in the Controlling
module.
You want to revalue the production orders using actual activity
prices. Is there any configuration setting?
Yes, there is a configuration setting to be done.
Where is the configuration setting made for carrying out
revaluation at actual activity prices in the various cost objects?
The configuration setting is made in the Cost Center Accounting
version maintenance for the fiscal year. This has to be maintained for
version 0. You need to select the revaluation option, using either "own
business transaction" or "original business transaction".
At month end you calculate actual activity prices in the system.
You want to revalue the production orders with these actual activity
prices. What options are available in the system for revaluation?
The options available are as follows:
You can revalue the transactions using the periodic price, average price
or cumulative price.
Further, you can revalue the various cost objects as follows:
Own business transaction - differential entries are posted.
Original business transaction - the original business transaction is
changed.
    Internal orders
What is the purpose of defining internal orders?
An example will help us understand this much better.
Let's say an organization runs various events, such as trade fairs and
training seminars, during the year. Now let's assume that these trade
fairs are organized by the marketing cost center of the organization. The
marketing cost center is therefore responsible for all the trade fair
costs, and all of those costs are posted to it. Now, if management wants
an analysis of the cost incurred for each individual trade fair
organized by the marketing cost center, how would the marketing
manager get this information across to them? The cost center report
would not provide it: if you go through all the cost center reports, this
information is not readily available, since all the costs are posted to the
cost center as a whole.
This is where internal orders step in. SAP provides the facility of
internal orders, which comes in really handy in such situations. In the
above scenario, the controlling department would create an internal
order for each trade fair organized. The costs incurred for each trade
fair are posted to its internal order during the month. At month end,
the costs collected in each internal order are settled from these orders
to the marketing cost center. The controlling person is now in a position
to analyze the cost of each trade fair separately.
Thus an internal order is used to monitor the costs of short-term events
and activities. It provides more information than is available on the cost
centers, and it can be widely used for various purposes.
How can you default certain items while creating internal order
master data?
You can do so by creating a model order and then updating the fields
which you want to default in this model order. Finally, attach this model
order to the internal order type in the field "reference order".
Once the above is done, whenever you create an internal order of this
order type the field entries will be copied from the model order.
What is the configuration setting for releasing an internal order
immediately after creation?
You have to check the "release immediately" check box in the
internal order type.
Product Costing
What are the important terminologies in Product Costing?
Results Analysis Key - this key determines how the work in progress is
calculated.
Cost Components - the breakup of the costs which get reflected in
the product costing, e.g. material cost, labour cost, overhead, etc.
Costing Sheets - used to calculate the overhead in Controlling.
Costing Variant - for all manufactured products, the recommended
price control is standard price. To come up with this standard price for
the finished good, the material has to be costed. This is done using a
costing variant. Further questions below explain this concept better.
What are the configuration settings maintained in the costing
variant?
The costing variant forms the link between the application and
Customizing, since all cost estimates are carried out and saved with
reference to a costing variant. The costing variant contains all the
control parameters for costing.
The configuration parameters are maintained for the costing type,
valuation variant, date control and quantity structure control.
In the costing type we specify which field in the material master should
be updated.
In the valuation variant we specify the following:
a) the sequence in which the system should access prices from the
material master (planned price, standard price, moving average
price, etc.);
b) which price should be considered for the activity price calculation;
and
c) how the system should select the BOM and routing.
How does SAP go about costing a product having multiple bills of
material within it?
SAP first costs the lowest-level product, arrives at its cost, then costs
the next-highest level, and finally arrives at the cost of the final
product.
What does the concept of cost roll up mean in the product costing
context?
The purpose of the cost roll up is to include the cost of goods
manufactured of all materials in a multilevel production structure at the
topmost level of the BOM (bill of material).
The costs are rolled up automatically using the costing levels:
1) The system first calculates the costs for the materials with the
lowest costing level and assigns them to cost components.
2) The materials in the next-highest costing level (such as semi-finished
materials) are then costed. The costs of the materials costed first are
rolled up and become part of the material costs of the next-highest
level.
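The roll-up is essentially a bottom-up traversal of the BOM. A minimal Python sketch, with hypothetical material names and prices:

```python
def rolled_up_cost(material, bom, prices):
    """Cost a material by recursively costing its BOM components first.
    `bom` maps a material to [(component, quantity), ...]; materials
    not in `bom` are raw materials priced directly from `prices`."""
    if material not in bom:
        return prices[material]
    return sum(qty * rolled_up_cost(comp, bom, prices)
               for comp, qty in bom[material])
```

With a finished good made of 2 semi-finished units and 1 raw material, and the semi-finished unit itself made of 3 units of another raw material, the semi-finished cost is computed first and then rolled up into the finished good.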
What is a settlement profile and why is it needed?
All the costs or revenues which are collected in, for example, a
production order or sales order have to be settled to a receiver at the
end of the period. This receiver could be a G/L account, a cost center,
profitability analysis or an asset. (Also read the question "What are cost
objects?" in the Controlling section.)
In order to settle the costs of the production order or sales order, a
settlement profile is needed.
In a settlement profile you define a range of control parameters for
settlement. You must define the settlement profile before you can enter
a settlement rule for a sender.
The settlement profile is maintained in the order type and defaults
during creation of the order.
A settlement profile includes:
1) the retention period for the settlement documents;
2) the valid receivers: G/L account, cost center, order, WBS element,
fixed asset, material, profitability segment, sales order, cost object,
order item, business process;
3) the document type;
4) the allocation structure and PA transfer structure, e.g. A1.
The settlement profile created is then attached to the order type.
What is the Transfer or Allocation structure?
The transfer structure is what helps in settling the cost from one cost
object to the receiver. It is maintained in the settlement profile defined
above.
The transfer structure has two parts:
a) the source: the cost elements you want to settle;
b) the target: the receiver, whether it is a profitability segment, fixed
asset or cost center.
So basically, for settling the costs of a cost object you need to define
the transfer structure, where you specify which costs you want to settle
and the target receiver for them. This information is entered in the
settlement profile, which contains various other parameters, and this
settlement profile is defaulted in the order type. So every time an order
is created, the relevant settlement rule is stored, and at month end, by
running the order settlement transaction, all the cost is passed on to the
receiver.
To put it in simple terms:
a) you define your cost object, which could be, e.g., a production order
or a sales order;
b) you collect costs or revenues for it;
c) you determine where you want to pass these costs or revenues; e.g.,
if the sales order is the cost object, all the costs or revenues of the
sales order could be passed to Profitability Analysis.
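The settlement mechanics can be sketched as a toy Python model. The cost-element ranges and receiver names below are invented for illustration; real transfer structures are configured in Customizing:

```python
def settle(order_costs, transfer_structure):
    """Settle collected order costs to receivers. `order_costs` maps a
    cost element to an amount; `transfer_structure` maps a (low, high)
    cost-element range to a receiver such as a cost center or PA segment."""
    postings = {}
    for element, amount in order_costs.items():
        for (low, high), receiver in transfer_structure.items():
            if low <= element <= high:
                postings[receiver] = postings.get(receiver, 0) + amount
                break
    return postings
```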
What do you mean by primary cost component split?

  • Possible to segment traffic between 2 interfaces? And other questions...

I would like to set my G5 up as a server utilizing a second connection, and to keep traffic separated between this server connection and my regular internet connection (which would be wireless). I'm pretty sure this alone is fairly straightforward and can be accomplished by setting up the new interface and moving it down to the bottom of the connection list, with wireless at the top. That should keep all non-specific traffic from flowing out the ethernet/server connection - I think.
If the above works the way I stated, then I would also want to firewall ONLY the ethernet/server connection (the wireless has its own hardware firewall). AND - this is the tricky part - I also want to add a fake interface that has a fake IP, and bind that to the "real" ethernet/server connection. The reason for that is that I need a static IP to bind the service to. I know that if the connection-list trick directs the traffic and I had an external router on the server connection, this wouldn't be needed: I'd already have a fake IP to bind to and I wouldn't have to run the firewall on the Mac. But I don't, and I'd rather not have to buy one.
So can this be done through the Network/Sharing preference panes? If so, are there any "gotchas" I should be aware of? If not, is there any software tool out there that would make setting this up easier/faster? I'm not opposed to doing it all via the command line, but I'm a bit rusty with my Linux/Unix admin knowledge. Plus I'm not 100% certain how to set all that up command-line-wise without screwing up OS X!
Thanks.

I'm not sure I fully understand what you are attempting to accomplish. Let's see if I have the general idea.
You have a single G5 that you want to use as both your desktop machine and also to provide specific services, such as web, email, etc.
You have some type of hardware firewall/security appliance.
You have some type of wireless access point.
You don't seem to have any type of router or switch in your configuration.
You want all of your server-based traffic to be sent and received on its own Ethernet port. You want your personal Internet traffic to be sent and received on your wireless connection.
So my questions are:
Where is the server traffic going to and coming from? Who is accessing the server - users on the Internet, or just computers on your own LAN (which you didn't mention)?
If your server is to receive data from or send data to the Internet, then you need a way to route the traffic there. Do you have more than one method of accessing the Internet, or will all traffic, both personal and server, be going through the same Internet access pipe?
If it is all going through the same pipe, and you only have the single computer, I don't understand why you wish to segment the traffic.
If, on the other hand, you have multiple computers on your LAN, then segmenting traffic may make sense. This would allow access to your server and keep your LAN well secured.
Anyway, to get to specifics, you'll need to use the Terminal app to bind specific services to specific IPs and ports on your Mac. You will also need to manually configure the firewall to be able to select specific connection ports and bindings. However, while I think it can be done, I'm not sure it makes a great deal of sense.
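On the binding point: most services can be tied to a single address rather than all interfaces. A minimal Python sketch; the loopback address below is a stand-in for the static IP you would assign to the server interface:

```python
import socket

# Stand-in for the static IP assigned to the server interface.
SERVER_IP = "127.0.0.1"

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind((SERVER_IP, 0))   # bind to one interface only, not 0.0.0.0; port 0 = any free port
srv.listen(5)
print("listening on", srv.getsockname())
srv.close()
```

A service bound this way never answers on the other interface, which is the traffic separation being asked about (the firewall still has to be configured separately).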
I would be more inclined to suggest a router or switch that can provide VLAN support, or a router that provides true DMZ support, as a good way to go.
Anyway, a little more info would be helpful.
Oh, and if I have this totally wrong in what I think you're doing... my mistake.
    Tom N.

  • SharePoint Foundation and Server 2013

Hi all,
I have SCOM 2012 SP1 on Windows Server 2008 R2. SQL Reporting Services (MSSQL 2012 SP1) is installed and configured on this server.
The OM, OMDW and SCOM ReportServer DBs are on another Server 2008 R2 machine (MSSQL 2012 SP1).
I am trying to install the SharePoint Foundation and Server 2013 management pack (following
http://www.enduria.eu/the-sharepoint-2013-management-pack-and-my-experience-installing-it/).
In the SCOM console the task status is success, but the server (SharePoint 2013) shows up under unidentified machines.
In the log on the SP2013 server I get the error:
Cannot identify which SharePoint farm this server is associated with. Check the management pack guide for troubleshooting information.
In the log on the SCOM management server I get the same error every 10 minutes:
Failed to deploy reporting component to the SQL Server Reporting Services server. The operation will be retried.
Exception 'DeploymentException': Failed to deploy reports for management pack with version dependent id 'edf9e0b9-65aa-df29-6729-d16f0005e820'. Failed to deploy linked report 'Microsoft.SharePoint.Server_Performance_Report'. Failed to convert management pack element reference '$MPElement[Name="Microsoft.SharePoint.Foundation.2013.Responsetime"]$' to guid. Check if MP element referenced exists in the MP. An object of class ManagementPackElement with ID 75668869-f88c-31f3-d081-409da1f06f0f was not found.
One or more workflows were affected by this.
Workflow name: Microsoft.SystemCenter.DataWarehouse.Deployment.Report
Instance name: d2781f95-d488-4e35-8226-a9e1d7127149
Instance ID: {96379E66-14B2-B413-F73F-1F39806AF714}
Management group: scom-service_group
What could be the problem?
Thanks

Same issue... running SCOM 2012 RTM with UR3 on Windows 2008 R2 and SharePoint 2013. I got the discovery to work after editing the .config file. All appeared to be working with the exception of the Configuration Databases and Content Databases, which are currently in a "Not monitored" state (this could be by design?).
I believe this is a bug with the SharePoint 2013 Management Pack. I tried importing the SharePoint 2010 MP to validate my SCOM environment and Report Server and did not get any errors, and the reports imported fine. Has anyone opened a case with Microsoft, or does anyone know if this is a known issue? Thanks!
    Error:
Data Warehouse failed to deploy reports for a management pack to SQL Reporting Services Server. Failed to deploy reporting component to the SQL Server Reporting Services server. The operation will be retried.
Exception 'DeploymentException': Failed to deploy reports for management pack with version dependent id 'edf9e0b9-65aa-df29-6729-d16f0005e820'. Failed to deploy linked report 'Microsoft.SharePoint.Server_Performance_Report'. Failed to convert management pack element reference '$MPElement[Name="Microsoft.SharePoint.Foundation.2013.Responsetime"]$' to guid. Check if MP element referenced exists in the MP. An object of class ManagementPackElement with ID 75668869-f88c-31f3-d081-409da1f06f0f was not found.
    One or more workflows were affected by this.
    Workflow name: Microsoft.SystemCenter.DataWarehouse.Deployment.Report
    Instance name: 9afddb34-5f35-40fa-8564-1de5d33773d3
    Instance ID: {4BF8AF22-E5E4-1FC6-3951-F731DCA65317}
    Only two SharePoint 2013 reports are showing up in SCOM:
    SharePoint Foundation Performance
    SharePoint Server Performance
    Missing the following reports (according to the product guide):
    Server Alert
    This report renders all alerts raised by specified server(s).
    Service Alert
    This report renders all alerts raised by specified service(s).
    Server Event
    This report renders all events raised on specified server(s).
    Service Event
    This report renders all events raised on specified service(s).
    Top Server Events
    This report renders top 20 events raised on specified server(s).
    Top Service Events
    This report renders top 20 events raised on specified service(s).
    Top Alerts
    This Most Common Alert Report helps to identify high volume alerts, the volume a distinct alert contributes to the total number of
    alerts and the resolution times. This report helps in tuning the alerts.
    Server Performance
    This report renders performance data for specified server(s).
    Entity State
    This report renders entity state for specified SharePoint object(s) over time.
