BW testing strategy

Hi All,
I have a requirement to create a BW testing strategy for any CRs (change requests) we receive.
It has to cover the general step-by-step procedure for testing and documenting all tests conducted on changes made in BW.
As the QA team lead, I have to prepare this plan for the team to follow.
We work on BW 3.5 and use Mercury Quality Center.
I am stuck on how and where to start. Please provide me with any test templates or general guidelines for testing.
Your help would be appreciated and rewarded with points.
Thanks in advance
Pavan

Hi,
We have not used a tool like Mercury Quality Center for testing. In our case we created test scripts and proceeded based on those.
I will give you some standard tests we run in Quality.
All testing is done in Quality before anything moves to Prod.
Integration testing and unit testing are both done in Quality.
You need to test that the extractions are working fine, that data is getting loaded into the targets, and that post-load activities such as activation, rollup and compression are successful.
We also test the process chains in Quality.
We schedule / trigger the chains and monitor them, ensuring that all process types complete successfully.
One major area we concentrate on is identifying dependencies in the chains. This results in fewer errors and also reduces chain runtimes to a great extent. Dependencies should be kept to a minimum wherever possible.
In cases where events or third-party scheduling tools are used to trigger BW chains, we check those as well.
We execute reports to check that the definitions are correct.
The volume of data will be smaller in Quality than in Prod.
Activities like change runs, deletion and selective deletion are also tested.
Almost all the activities performed in Prod are rehearsed in Quality.
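To make the step-by-step procedure more concrete, here is a minimal sketch of how such a checklist could be captured as a structured test-case template. Python is used purely for illustration; the case IDs, areas and steps (including the RSA3 extractor check) are assumptions, not Quality Center fields:

from dataclasses import dataclass, field

@dataclass
class BWTestCase:
    """One test case for a BW change request (illustrative template only)."""
    case_id: str
    area: str              # e.g. extraction, load, post-load, process chain, reporting
    steps: list = field(default_factory=list)
    expected: str = ""
    status: str = "NOT RUN"   # NOT RUN / PASSED / FAILED

# Hypothetical checklist derived from the steps described above
quality_checklist = [
    BWTestCase("CR1234-01", "extraction",
               ["Run the extractor in RSA3 for a small selection"],
               "Records extracted without errors"),
    BWTestCase("CR1234-02", "load",
               ["Load the request into the target", "Check the load monitor"],
               "Request green, record counts match the source"),
    BWTestCase("CR1234-03", "post-load",
               ["Activate ODS data", "Roll up to aggregates", "Compress the cube"],
               "All post-load steps finish successfully"),
    BWTestCase("CR1234-04", "process chain",
               ["Trigger the chain", "Monitor each process type"],
               "All process types complete; no unnecessary dependencies"),
    BWTestCase("CR1234-05", "reporting",
               ["Execute the affected queries"],
               "Query definitions and results are correct"),
]

for case in quality_checklist:
    print(f"{case.case_id} [{case.area}] -> {case.status}")

A template like this can be filled in per CR and the pass/fail status recorded as documentation of the tests conducted.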
Refer to these threads and documents:
Re: Quality System
Re: bw developer_unit_integration_testing
Integration Testing
http://www.sap-img.com/general/role-of-sap-consultant-in-testing.htm
integration testing
SAP testing
SAP Testing
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e5d8d390-0201-0010-e284-e418e3f0e150
Can any one explain about the SAP Testing process in Implementation Project
BI 7.0 Testcases
/thread/796460 [original link is broken]
Hope this helps.
Thanks,
JituK

Similar Messages

  • Testing strategy after legal merge of two company codes

    Hi All,
    This is my first integration project. Please help me out with my query:
    My company is going through a merger and we have merged two company codes.
    For example, company code 1 belongs to company ABC and company code 2 to company XYZ, and ABC is buying XYZ. This means that company code 1 will remain and company code 2 will merge into it. The open line item AR/AP data from company code 2 will be brought into company code 1.
    Any of you who have worked on an integration project before, please guide me to some documents online or suggest a testing strategy to adopt after the legal merge has taken place. I am mostly looking for a strategy to test the intercompany process.
    Thanks,
    MS

    Hi
    There are a lot of issues that you need to take care of:
    1. Is there any relation between the two company codes before the merger? Have they done any transactions between them? If yes, what would be the accounting treatment for the same? Would it be transferred at cost or at cost plus price (transfer price)?
    2. If you are in India, there would be effects of taxation such as excise and VAT calculations.
    3. One of the most important issues is that of the new fiscal year. What strategy would you follow for the merged entity? In SAP, changing a fiscal year is a critical activity. Please refer to SAP Help for the same; there are notes available on this.
    4. What would be the strategy for asset valuations? Are different asset valuation procedures followed in the two company codes?
    Thanks & regards
    Sanil K Bhandari

  • Generic testing Strategy in ODI

    Can somebody help me with any document related to generic testing strategy in ODI?

    Hi ,
    ODI provides the option to export / import ODI objects as XML files, so you can partially export the objects you want and then import them into the production environment. In my experience, this sometimes causes issues with the internal object IDs and can leave the production repository in an inconsistent state, because all ODI objects are linked together by internal IDs that reference each other.
    With this method you cannot track the differences between the prod and dev environments until you create a version of the object and use the compare-version utility in the Designer.
    Hope this helps,
    Somchai
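    Somchai's point about not being able to see differences between environments can also be checked outside ODI by diffing the exported XML files directly. A minimal sketch, with hypothetical file paths for the same interface exported from DEV and PROD:

    import difflib
    from pathlib import Path

    # Hypothetical paths to the same interface exported from DEV and PROD
    dev_xml = Path("exports/dev/INT_LOAD_SALES.xml").read_text(encoding="utf-8").splitlines()
    prod_xml = Path("exports/prod/INT_LOAD_SALES.xml").read_text(encoding="utf-8").splitlines()

    # Unified diff of the two exports; internal-ID attributes will show up here too,
    # so expect some noise that is not a functional difference.
    for line in difflib.unified_diff(dev_xml, prod_xml, fromfile="dev", tofile="prod", lineterm=""):
        print(line)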

  • Testing Strategy

    Hi,
    I'd like to know the best practices for preparing a test strategy document for CRM On Demand. I have done this for Siebel, but I understand it should be quite a bit simpler in On Demand. Any comments or documents that could be shared, please.
    Thanks,
    Smita

    Smita, click on the Training and Support link in the upper right, then enter "Rollout Center" in the Search. This gives a step-by-step plan, including testing, for your CRM On Demand implementation.

  • Testing Strategy HCM

    Hi All,
    I'm preparing a Testing Strategy document. To enhance it, I need some documents related to testing strategy. Can anybody send me a template or document related to this?
    with regards,
    satish

    Hi,
    Go through this link.
    http://help.sap.com/saphelp_sm40/helpdata/en/f4/3f9ef359a711d1bc84080009b4534c/frameset.htm

  • Apex Testing Strategy

    Folks,
    We are looking for some help defining best practice for testing APEX applications.
    Are any pre-written strategy documents available? Any and all input is much appreciated.
    Thanks
    Andy

    Hey Tulley
    Any particular class of testing?
    e.g. Scalability, Performance, Database Objects & Logic, Functional, Security?
    If your answer is 'all of the above', I'd document each of these categories individually as the first step of your best practice.
    Within each, there are likely to be a number of techniques you can use, and a number of relevant commercial and open-source tools that will save you time applying them, for instance LoadRunner for scalability and performance, and utPLSQL for PL/SQL unit testing.
    That's how I'd write a best practice document anyway. I'm afraid I'm not aware of anything that's already been written.
    Rgds
    Ben

  • SAP BW on HANA test strategy

    All,
    Can someone guide me to best-practice test scenarios for a typical BW on HANA project?
    I need some information such as: why do we need testing?
    How does testing help for BW on HANA?
    What are the test scenarios?
    Thanks,
    HA

    Hi HA,
    you may want to take a look at this nice document:
    Automatic Test of Queries with RS Trace Tool
    - Lars

  • Testing Process for Gathering Single Object stats.

    Hello Oracle Experts,
    I work on a critical system, and due to some high stakes every change is very heavily scrutinized here, whatever the level. One such change currently under scrutiny is gathering object stats for single objects. Just to give you some background, it is an Oracle eBusiness site, so FND_STATS is used instead of the usual DBMS_STATS, and we have an in-house job that gathers stats on objects depending on their staleness using FND_STATS (RDBMS: 10.2.0.4, Apps Release 12i).
    Now, we've seen that occasionally the job misses some objects that should ideally be gathered, so they need to be gathered individually, and our senior technical management wants a process around this gathering of single-object stats (I know!). I should mention explicitly that the need to gather stale object stats arose because one of the plans went very bad (from 2 ms to 90 minutes); the SQL tuning task states that the stats are stale, and in our PROD copy environment (where the issue exists) gathering stats reverts to the original good plan. So we are not gathering just because the stats are stale, but because that staleness is actually causing a real-time problem.
    Anyway, my point is that stats have been gathered multiple times in the past on that object, and they might also be gathered at any time by the automatic job (run nightly). Their arguments are:
    i. There may be several hundred SQL plans depending on that object; we never know how many, nor to what those plans will change, and they could change for the worse, causing unexpected issues in the service.
    ii. There may be related objects whose stats have gone stale as well (for example, sales and inventory tables that both see a related amount of change on the column stock_level), and if we gather stats on only one of them, since the two could be highly related (in queries etc.), that may mess up the join cardinality estimates and hence the plans.
    Now, you see, they know Oracle as well!
    My Oracle (and optimizer) knowledge clearly suggests that these arguments are baseless, BUT I want to keep an open mind. So my questions are:
    i. Do the risks highlighted above stand any ground, and what probability do you think there is of any of the above happening?
    ii. Any other points I can make to convince the management?
    iii. Or, if those guys are right, do you use or recommend any testing strategy/process that you can suggest to us, please?
    Another interesting point is that they are not even very clear at this stage how they are going to 'test' this whole thing, as a paid option like RAT (Real Application Testing) is out of the question, and developing an in-house testing tool still needs analysis in terms of effort, worth and reliability.
    In the end, can I request the top experts from the 'Oak Table' network to comment, so that I can take their backing? Well, I am hoping they will back me up, but that may not necessarily be the case, and I obviously want an honest expert assessment of the situation, not merely my backing.
    Thanks so much in advance!

    > I work on a critical system, and due to some high stakes every change is very heavily scrutinized here, whatever the level.
    > Another interesting point is that they are not even very clear at this stage how they are going to 'test' this whole thing, as a paid option like RAT (Real Application Testing) is out of the question, and developing an in-house testing tool still needs analysis in terms of effort, worth and reliability.
    Unfortunately your management's opinion of their system as expressed in the first paragraph is not consistent with the opinion expressed in the second paragraph.
    Getting a stable strategy for statistics is not easy, requires careful analysis, and takes a lot of effort for complex systems.
    > In the end, can I request the top experts from the 'Oak Table' network to comment, so that I can take their backing?
    The ideal with stats collection is to do something simple to start with, and then build on the complex bits that are needed - something along the lines suggested by Dan Morgan works: a table-driven approach to deal with the special cases, which are usually the extreme indexes, the flag columns, the time-based/sequential columns, the occasional histogram, and new partitions. Unfortunately you can't get from where you are to where you need to be without some risk (after all, you don't know which bits of your current strategy are causing problems).
    You may have to progress by letting mistakes happen - in other words, when some very bad plans show up, work out WHY they were bad (missing histogram, excess histogram, out-of-date high values) and work out the minimum necessary fix. Put a defensive measure in place (add it to the table of special cases) and run with it.
    As a direction to aim at: I avoid histograms unless really necessary, I like introducing function-based indexes where possible, and I'm perfectly happy to write small programs to fix column stats (low/high/distinct) or index stats (clustering_factor/blevel/distinct_keys) and create static histograms.
    Remember that Oracle saves old statistics when you create new ones, so any new stats that cause problems can be reversed out very promptly.
    Regards
    Jonathan Lewis
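    Jonathan's last point, that prior statistics are retained and can be restored, can be illustrated with DBMS_STATS.RESTORE_TABLE_STATS (Oracle keeps statistics history, 31 days by default). A minimal sketch, assuming the python-oracledb driver and hypothetical connection details and object names:

    import datetime
    import oracledb  # python-oracledb driver (assumed to be installed)

    # Hypothetical connection details
    conn = oracledb.connect(user="apps", password="secret", dsn="dbhost/EBSPROD")

    with conn.cursor() as cur:
        # Restore the statistics that were in effect yesterday for one table.
        # The retained statistics history is what makes this quick rollback possible.
        as_of = datetime.datetime.now() - datetime.timedelta(days=1)
        cur.execute(
            """
            BEGIN
                DBMS_STATS.RESTORE_TABLE_STATS(
                    ownname         => :owner,
                    tabname         => :table_name,
                    as_of_timestamp => :as_of);
            END;
            """,
            owner="INV", table_name="MTL_SYSTEM_ITEMS_B", as_of=as_of,
        )
    conn.commit()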

  • Testing (all_rows/first_rows)

    What is the best optimiser mode to use, ALL_ROWS or FIRST_ROWS, in the following scenario:
    experimental testing of a new (relatively small) database before it is rolled out, i.e. testing where indexes can improve performance in general (no specific bottlenecks known)?
    I'm under the impression that FIRST_ROWS should only be used when a query can be singled out that takes a relatively long time to run, whereas in the situation where we are testing a database, or optimising it in general, ALL_ROWS would be the better testing strategy?
    Any comments? (other than 'use both')

    An excellent discussion (with lots of real life examples) on the topic can be found in the following thread...
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::p11_question_id:3310002340673
    vr,
    Sudhakar B.
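    A quick way to see how the two modes differ for a specific query is to switch optimizer_mode at session level and compare the plans. A minimal sketch, assuming the python-oracledb driver, a hypothetical connection and a hypothetical query (PLAN_TABLE must exist, as usual for EXPLAIN PLAN):

    import oracledb  # python-oracledb driver (assumed)

    conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/ORCL")   # hypothetical
    query = "SELECT * FROM orders WHERE order_date > DATE '2024-01-01'"          # hypothetical

    with conn.cursor() as cur:
        for mode in ("ALL_ROWS", "FIRST_ROWS_10"):
            cur.execute(f"ALTER SESSION SET optimizer_mode = {mode}")
            cur.execute("EXPLAIN PLAN FOR " + query)
            print(f"--- plan under {mode} ---")
            # DBMS_XPLAN.DISPLAY formats the plan just written to PLAN_TABLE
            for (line,) in cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"):
                print(line)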

  • Weblogic stress test

    Hi,
    I have a Forms 11g application installed on an HP ProLiant DL360 G7 (64 GB)
    with Oracle WebLogic 10.3.3 64-bit / Oracle Fusion Middleware 11.1.1.3 64-bit (RHEL 5.5 64-bit).
    I guess the machine is not bad for 300-400 concurrent users.
    Some tests have been done by a few users (10-20), and all works fine so far.
    We are planning to do 1-2 hour tests with more users, around 100.
    Could someone give some advice about which checks should be done or what should be monitored during the tests?
    Any ideas are welcome.
    Best Regards
    Frane

    Hi Frane,
    The list is quite long, but below are the most critical performance items you should monitor during your load test exercise:
    ## Physical server and OS
    - CPU utilization
    - Physical memory
    - Disk performance and paging activity
    - Java process memory size of your 64-bit VM (also monitor the C heap size of your Java process, check for any native memory leak over time, etc.) and look at your physical RAM requirement per Java process under peak load
    ## JRE
    - Java heap utilization / footprint
    - PermGen space utilization / footprint (if using the HotSpot VM)
    - Native memory / C heap utilization (the monitoring technique will depend on your OS)
    - Garbage collection performance / elapsed time / frequency etc.
    ## WebLogic side
    - Thread utilization
    - JDBC data source utilization and leak verification (if applicable)
    - HttpSession utilization (if applicable)
    - EJB/MDB pool bean utilization (if applicable)
    - JMS queue utilization (if applicable)
    - Cluster performance, e.g. look for any packet loss etc. (if applicable)
    - Socket / file descriptor utilization (via the netstat / lsof commands)
    ## Other recommendations
    - A few thread dump snapshots should be captured during peak load and analysed, especially if high thread utilization is observed
    - A heap dump could be generated before, during and after the load test in order to understand your application's Java heap footprint and ensure there is no leak
    - If using an Oracle database, AWR reports should be generated during your load test and any SQL contention / tuning opportunity investigated, such as sub-optimal execution plans, lack of proper indexes etc.
    - Negative testing should be added for 1 or 2 load-testing cycles in order to simulate transaction slowdown and physical downtime of one or more of your downstream systems, along with timeout validation (ensure no infinitely hanging threads and no domino effect)
    On top of this, you should track your application's business response time and compare it between cycles (a small OS-level monitoring sketch follows after this post).
    Your load-testing strategy should include an initial run to capture a baseline; each tuning step or change applied should then be compared to your initial baseline in order to validate any improvement or degradation of each performance figure.
    Regards,
    P-H
    http://javaeesupportpatterns.blogspot.com/
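    To complement the checklist above, OS-level metrics such as CPU, memory and paging can be sampled with a small script during the load test, and JVM thread dumps captured at intervals. A minimal sketch, assuming the psutil package is installed and the JDK's jstack tool is on the PATH (the PID and file names are hypothetical):

    import subprocess
    import time
    import psutil  # third-party package, assumed installed

    WEBLOGIC_PID = 12345          # hypothetical managed-server JVM PID
    SAMPLE_SECONDS = 15           # sampling interval
    DUMP_EVERY_N_SAMPLES = 20     # take a thread dump every ~5 minutes

    sample = 0
    while True:
        cpu = psutil.cpu_percent(interval=SAMPLE_SECONDS)   # blocks for the interval
        mem = psutil.virtual_memory()
        swap = psutil.swap_memory()
        jvm_rss_mb = psutil.Process(WEBLOGIC_PID).memory_info().rss / 1024 / 1024

        print(f"cpu={cpu:5.1f}%  mem_used={mem.percent:5.1f}%  "
              f"swap_used={swap.percent:5.1f}%  jvm_rss={jvm_rss_mb:8.1f} MB")

        sample += 1
        if sample % DUMP_EVERY_N_SAMPLES == 0:
            # Capture a JVM thread dump for later analysis (high thread utilization etc.)
            with open(f"threaddump_{int(time.time())}.txt", "w") as out:
                subprocess.run(["jstack", str(WEBLOGIC_PID)], stdout=out, check=False)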

  • Oracle 9.2.0.1.0 to 10g upgrade testing

    Hi All,
    We are upgrading our database from Oracle 9i to 10g.
    Our development environment is the Ascential DataStage ETL tool.
    The project was developed by another team, and now only the database upgrade is happening.
    I need some clarification and solutions:
    1) How can we confirm that all the database objects have been properly upgraded?
    2) What is the best testing strategy for testing all these database objects, such as tables, functions, procedures, views, sequences etc.? If a similar document has been prepared by any project, please send it to my mail id.
    3) Is there anything we need to change in any of the objects to make them compatible with 10g, e.g. functions that have to be replaced or datatype mismatches?
    4) What type of testing needs to be done for the different database objects?

    Anyone familiar with the way English is spoken in America, excluding perhaps one small enclave in North Carolina, should understand a couple of things:
    1. I understand a bit of Hindi, so I understood some of the background conversation while on the phone with the spammers.
    2. The individuals speaking to me were as dumb as a bag of hammers.
    Now, according to Wikipedia, the planet has about 370 million people who speak Hindi as a first language and another 580 million who speak it as a second language. 950 million people minus two seems a reasonable ratio.
    Anyone looking too hard to find an insult can likely find it looking at their own reflection in the mirror. Grow up and get over it, folks.
    BTW, Don: you are the one that referred people to your website claiming it contained the list that you use. Taking you at face value, it appears that your list does not include doing a backup. Perhaps you are so expert in the practice that you 'understand' that, but if you are going to refer novices to your site you need to offer them a complete list along with the ubiquitous advertising.
    Anyone wanting my list can get it for free ... here it is:
    1. Read the relevant Oracle documents
    2. Go to metalink and research any problems others have had with the procedure
    3. Create an RDA, zip it, and have it ready to upload to metalink
    4. Perform a level 0 backup using RMAN
    5. Verify the backup is valid and can be used for a restore
    6. Start flashback logging if not already running
    7. Create a guaranteed restore point
    8. Follow the directions at Step 1 except as modified by docs read in Step 2.
    Not one of those steps is in your list, Don. And, quite frankly, I think people would be ill-advised to consider steps on what to do for "SLOW PERFORMANCE AFTER UPGRADE" to be any guidance on performing the upgrade itself.
    But heck ... any time you can send people to a page full of advertisements, why miss the chance.
    People interested in tuning would be well advised to go to one of these URLs:
    http://www.jlcomp.demon.co.uk/faq/ind_faq.html#Performance_and_Tuning
    http://www.hotsos.com/oop.html
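    Regarding question 1 above (confirming that objects survived the upgrade), a common first check is to compare object counts per schema and type between the 9i source and the upgraded 10g database, and to look for objects left invalid after recompilation (utlrp.sql). A minimal sketch, assuming the python-oracledb driver and hypothetical connection details:

    import oracledb  # python-oracledb driver (assumed)

    conn = oracledb.connect(user="system", password="secret", dsn="dbhost/ORA10G")  # hypothetical

    with conn.cursor() as cur:
        # Object counts per schema and type - compare this output against the
        # same query run on the 9i source before the upgrade.
        cur.execute("""
            SELECT owner, object_type, COUNT(*)
            FROM   dba_objects
            WHERE  owner NOT IN ('SYS', 'SYSTEM')
            GROUP  BY owner, object_type
            ORDER  BY owner, object_type
        """)
        for owner, obj_type, cnt in cur:
            print(f"{owner:<20} {obj_type:<20} {cnt}")

        # Anything left INVALID after utlrp.sql is a candidate for closer testing.
        cur.execute("""
            SELECT owner, object_type, object_name
            FROM   dba_objects
            WHERE  status = 'INVALID'
            ORDER  BY owner, object_type, object_name
        """)
        invalid = cur.fetchall()
        print(f"\nInvalid objects after upgrade: {len(invalid)}")
        for row in invalid:
            print("  ", *row)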

  • SAP Testing for BI Lifecycle

    Hi,
    Could you guide me to SAP BI testing strategies and approaches?
    Regards
    Satish Talikota

    Hi
    Unit testing: here we do all the technical tests, such as testing the data flow, the logic in transformations, data loads, query properties etc.
    Integration testing: here we perform the functional test to check that the query is giving output which is functionally correct.
    Regression testing: if we are making changes to an existing flow or query, in this testing we verify that no existing functionality has changed because of the newly added changes.
    Performance testing: sometimes we write ABAP code in transformations; under this category you may want to test the data load performance. If the data load is taking a long time, you may want to optimize the code and check the performance again.
    Authorization testing: here you verify that all the roles have been set up properly and that data-level access is also working correctly at query level.
    Please check the below,
    https://scn.sap.com/docs/DOC-28811
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/60981d00-ca87-2910-fdb8-d4a2640d69d4?quicklink=index&overridelayout=true
    Note: try to search the forum before posting a new thread.
    Regards
    NK
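    As an illustration of the regression testing described above, query results extracted before and after a change (e.g. saved from the query monitor as flat files) can be compared automatically. A minimal sketch, assuming both result sets were exported as CSV files with identical column layouts (the file names are hypothetical):

    import csv

    def load_rows(path):
        """Read a CSV export of the query result into a set of rows for comparison."""
        with open(path, newline="", encoding="utf-8") as f:
            return {tuple(row) for row in csv.reader(f)}

    before = load_rows("query_ZSD_SALES_before_change.csv")  # hypothetical export
    after = load_rows("query_ZSD_SALES_after_change.csv")    # hypothetical export

    missing = before - after   # rows that disappeared after the change
    extra = after - before     # rows that appeared after the change

    print(f"Rows only in 'before': {len(missing)}")
    print(f"Rows only in 'after' : {len(extra)}")
    if not missing and not extra:
        print("Regression check passed: query output is unchanged.")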

  • Testing web applications?

    Hi,
    What do you recommend for testing web applications? The best I could find is Jakarta Cactus but it seems like a half-dead project (releases once every 2-3 years). Is there anything better out there?
    Thank you,
    Gili

    cowwoc wrote:
    > Turns out neither Selenium nor HtmlUnit will do what I want. Both focus exclusively on testing the client-side, browser-oriented output of webapps,
    Selenium does indeed use the browser to generate tests, but they can be saved as JUnit and run either individually or using JMeter. I think of it as more of a test automation tool. Once I capture a test with Selenium I don't have to keep entering values in a browser. Nice for business acceptance test automation.
    > whereas I need to focus on testing the server-side, XML-oriented output of my servlets.
    Sounds more like unit testing to me.
    > For example, I want to launch a new server instance, populate the database, issue a request and verify the resulting database contents and/or output to the client. Selenium and HtmlUnit simply don't handle that sort of thing.
    I think you're better off with more unit testing in that case. You shouldn't have to launch a server to test; that's what POJOs are for. Seeding a database with test data, and rolling it back when you're done, is something you can do with DbUnit or with Spring's transactional tests.
    Maybe your test strategy needs some refinement. Or I don't understand it. But I'd want a test strategy that didn't depend on servlets or app servers. Those should just be deployment issues, not part of determining correct behavior.
    %
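    The "seed the database, then throw the data away when you're done" idea mentioned above (DbUnit, Spring transactional tests) looks roughly the same in any stack with transactions. A minimal sketch using Python's built-in sqlite3 and unittest purely for illustration; the schema and the function under test are hypothetical:

    import sqlite3
    import unittest

    def count_open_orders(conn):
        """Hypothetical function under test: counts orders still marked OPEN."""
        return conn.execute("SELECT COUNT(*) FROM orders WHERE status = 'OPEN'").fetchone()[0]

    class OpenOrderTest(unittest.TestCase):
        def setUp(self):
            # Seed an in-memory database with known test data before each test.
            self.conn = sqlite3.connect(":memory:")
            self.conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
            self.conn.executemany(
                "INSERT INTO orders (id, status) VALUES (?, ?)",
                [(1, "OPEN"), (2, "SHIPPED"), (3, "OPEN")],
            )

        def tearDown(self):
            # Discard the seeded data after each test - the equivalent of a rollback.
            self.conn.close()

        def test_open_orders_are_counted(self):
            self.assertEqual(count_open_orders(self.conn), 2)

    if __name__ == "__main__":
        unittest.main()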

  • Re: Testing issues

    Hello Neil,
    Well, why don't you use different repositories for testing your packages? First, you make a delivery workspace in your development repository and export it into a referential/versioning directory.
    Then you import your workspace into the test repository.
    For the suppliers, you can do the same if you want to manage different releases between packages.
    If you are talking about integration testing, there is no way to manage versions in the repository, and managing it via workspaces is very dangerous, also for your development repository (space side effects, for example, or workspaces which can stray far from the baseline).
    All of this can be managed by scripts and so be somewhat automated.
    When a correction is made, it should be done in the development repository and delivered with a new release of the package (it can span several projects).
    I have worked this way for 2 years in Forte and for more than 6 years in other languages, with no problems and a real separation between integration, performance testing and development. Just remember that during your integration testing the programmers continue to develop.
    Last thing: all this can be done in the same environment, and your testing or integration repositories can be local.
    Hope this helps.
    Daniel Nguyen
    Freelance Forte Consulting.
    Neil Gendzwill wrote:
    >
    We're in the middle of our first big Forte project and the issue has
    arisen for package testing. By "package" we mean a collection of classes
    which provide some function or service or just logically belong together.
    If package X is under test and needs package Y, X's tester often
    cannot use the real Y. He will need to stub out Y in places to provide
    the inputs his test needs, or perhaps development on Y is incomplete and
    there are compile errors or missing pieces.
    How is this done with Forte? Does anyone have a good testing strategy?
    If this were C++, we could just link with the stubs rather than the real
    deal. But because of the repository, we're forced to copy things back and
    forth from plans to avoid linking with the real classes.
    So far, we've come up with the idea of creating a test plan and then
    taking copies of everything needed. But that's kind of ugly: for one
    thing, you need to keep making your changes in the original plan and
    importing them into the test plan, because if you do it the other way
    around you tend to break inheritance (at least, at the class level - I
    suppose you could copy individual changed methods). Another idea is to
    make copies of all of our plans which hold versions of the classes, then
    have them supply the test plan rather than the normal plans. That way
    people can share stubs. If there's some common, elegant way to do this
    I'd like to hear about it.
    ========================================================================
    Neil Gendzwill, Senior Software Engineer, SED Systems, Saskatoon, Canada
    E-MAIL: [email protected] PHONE: (306) 933-1571 FAX: (306) 933-1486
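    Forte's repository mechanics aside, the underlying idea of substituting a stub for package Y while testing package X is what mocking frameworks do in most modern languages. A minimal sketch using Python's standard unittest.mock; the module and function names are hypothetical stand-ins for packages X and Y:

    import unittest
    from unittest import mock

    # Hypothetical "package X" code under test: it depends on package Y's rate lookup.
    def price_with_tax(amount, region, rate_source):
        """Part of package X: needs package Y's lookup_rate() to compute a price."""
        return amount * (1 + rate_source.lookup_rate(region))

    class PriceWithTaxTest(unittest.TestCase):
        def test_uses_stubbed_rate(self):
            # Stand-in for package Y: returns a fixed, known rate, so the test
            # does not depend on Y being complete or even compilable.
            stub_y = mock.Mock()
            stub_y.lookup_rate.return_value = 0.10

            self.assertAlmostEqual(price_with_tax(100.0, "SK", stub_y), 110.0)
            stub_y.lookup_rate.assert_called_once_with("SK")

    if __name__ == "__main__":
        unittest.main()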

              "Brian Mitchell" <[email protected]> wrote:
              >
              >Lionel <[email protected]> wrote:
              >>Are we doing something incorrectly?
              >
              >Lionel,
              >
              >Have you attempted to view the JNDI tree of Managed Servers from Admin
              >console?
              Yes, and the War deployment doesn't show up on the managed server JNDI trees.
              However the application shows as having been successfully deployed.
              > Which protocol being used to access the PROVIDER_URL? (t3, t3s, etc.)
              >
              >
              I've tried both t3://orionCluster:7001 and t3://beaServerA:5001,beaServerB:5003
              (same IPAddress, different ports), with no luck.
              >Brian J. Mitchell
              >Systems Administrator, TRX
              >email: [email protected]

  • Scheduling Agreement

    Hello gurus, this is very urgent; please advise what to do.
    This problem occurs in program Y_EX34_LVED4FZZ_001. The problem is that the current code assumes that VBEP-ZZSID will never be repeated by the customer, so all deliveries in the history can be checked when deciding whether or not to retain a schedule line. New information has confirmed that Nissan repeats their ZZSID numbers annually, which leads the current code to delete open schedule lines that had deliveries against them in prior years. Here are the recommended steps to correct this:
    1. After obtaining the VBEP data and before selecting from ZZDLVRANSID, find out when the relevant schedule line's delivery schedule was posted, using the following steps:
       a. Find the record in VBEH for VBEP-VBELN, VBEP-POSNR, and VBEP-ETENR that has the smallest nonzero value in the internal delivery schedule number field (VBEH-ABRLI). If none is found, take the record with ABRLI = '0000'.
       b. Find the record in VBLB for VBELN and POSNR that matches ABRLI.
       c. Take fields VBLB-ERDAT and VBLB-ABRDT_ORG.
       d. If ABRDT_ORG is not 00/00/0000 or 12/31/9999, use it as the comparison date for this logic. Otherwise, use ERDAT.
    2. When selecting later from LIKP, also get the planned and actual goods issue dates (LIKP-WADAT and LIKP-WADAT_IST). If WADAT_IST is not 00/00/0000, use it as the comparison date for this logic. Otherwise, use WADAT.
    3. Before incrementing XLTOT_PGI or XLTOT_DEL, ensure that the delivery you are working with was shipped after the schedule line you are working with was received (the VBLB date from step 1d is less than or equal to the shipment date from step 2). If it is, proceed with the calculations. If it isn't, don't add the quantities from this delivery into the totals. (A small sketch of this date-selection logic follows the list below.)
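    As referenced in step 3, here is a small sketch of the date-selection and comparison logic described in the steps above, written in Python purely to illustrate the algorithm; the real fix would of course be ABAP in Y_EX34_LVED4FZZ_001, the field names follow the SAP tables mentioned, and the handling of the 00/00/0000 and 12/31/9999 placeholder dates is an assumption:

    from datetime import date

    UNSET_DATES = {None, date(9999, 12, 31)}  # stand-ins for 00/00/0000 and 12/31/9999

    def schedule_line_received_on(vblb_erdat, vblb_abrdt_org):
        """Step 1d: prefer ABRDT_ORG when it is a real date, otherwise fall back to ERDAT."""
        return vblb_abrdt_org if vblb_abrdt_org not in UNSET_DATES else vblb_erdat

    def delivery_shipped_on(likp_wadat, likp_wadat_ist):
        """Step 2: prefer the actual goods issue date, otherwise the planned one."""
        return likp_wadat_ist if likp_wadat_ist not in UNSET_DATES else likp_wadat

    def include_delivery_in_totals(received_on, shipped_on):
        """Step 3: only count deliveries shipped after the schedule line was received."""
        return received_on <= shipped_on

    # Example: a delivery shipped in a prior year against a reused ZZSID is excluded.
    received = schedule_line_received_on(date(2008, 3, 1), None)       # falls back to ERDAT
    shipped = delivery_shipped_on(date(2007, 6, 15), None)             # falls back to WADAT
    print(include_delivery_in_totals(received, shipped))               # False -> excluded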
    Testing Strategy
    - In CTA, we will need to coordinate with the EDI team to find out whether any customers exist that use replace logic, for which data can be pushed in and tested.
    - In FTA, scheduling agreement 300027861 is an excellent example of this issue. We can get the EDI team to re-send data for that scheduling agreement (or for other scheduling agreements under the same customer) and process the IDocs individually to test that:
      - deliveries that were shipped prior to the open demand's delivery schedule creation date are excluded from the assessment of whether or not to delete the schedule line;
      - deliveries that were shipped after the delivery schedule came in are included in the assessment.
    regards
    sapman

    Only an ABAPer will be able to give a good answer, but nobody is taking the risk.
    This is a very urgent case.
    sapman
