Oracle Test Centre for OCP SCM exam, with 100% discount voucher.

I am from Surat, India, and have received a 100% discount voucher from Oracle. The Ahmedabad testing centre does not have the facility to let me take the exam with the free discount voucher. Please help me locate a testing centre near Surat, preferably in Bombay, that allows exams to be taken with free discount vouchers.

I have booked my test through a Pearson VUE authorized testing center.

Similar Messages

  • Generating 2 test reports for the same test - XML and HTML

    Hi All,
    I have a special test requirement.
    I need to generate 2 test reports for the same test: one in HTML and the other in XML format.
    Is there any direct method for that? How can it be implemented?
    Thanks in advance.
    SajK

    Hi SajK,
    To accomplish this task, you will need to modify the process model. For example, if you are using the sequential process model, you can concentrate on the TestReport sequence call and the various report related steps in the Single Pass and Test UUTs sequences.
    The TestReport sequence (also a callback) relies on Parameters.ReportOptions.Format to decide which of the following sequences to use when generating the report:
    <TestStand>\Components\NI\Models\TestStandModels\reportgen_txt.seq
    <TestStand>\Components\NI\Models\TestStandModels\reportgen_html.seq
    <TestStand>\Components\NI\Models\TestStandModels\reportgen_xml.seq
    In your case, instead of choosing one of the sequences, you could hard-code it to use both the HTML and XML sequences. Be aware that for this to work, you must also modify and/or duplicate most of the report-related steps in the Single Pass and Test UUTs sequences. For instance, the Set Report Format step becomes unnecessary since you are hard-coding the report format. On the other hand, the Write UUT Report step needs to be duplicated: one step to write to an HTML file location, and another to write to the XML file location.
    Please note that modifying the process model in this way will render some of the settings in the Report Options dialog box (such as the report format) ineffective. For more information on TestStand report generation, please refer to the following DevZone article:
    Report Generation Explained
    Regards,

  • SO with 100% Discount

    Hi Gurus,
    I work with FICO. Recently I got a requirement from a client: a sales order (SO) with a 100% discount.
    Via SD account determination I got the billing document to post to the revenue account and the discount account. However, the client wants to see a posting on the customer account as well.
    I am thinking about creating 2 billing docs for one delivery:
    1. Customer invoice: customer acct to revenue acct
    2. Customer credit memo: discount acct to customer acct.
    Is this possible? Is there a user exit I could use? Will the second billing doc (credit memo) clear the first customer invoice?
    Thank you so much for any tips!
    Zoe

    Dear Zoe,
    As per your comments below:
    I am thinking about creating 2 billing docs for one delivery:
    1. Customer invoice: customer acct to revenue acct
    2. Customer credit memo: discount acct to customer acct
    I would not recommend following those steps, as that is not the right practice. Instead, I would suggest the steps below.
    If you want to post an SO with a 100% discount, please follow these steps.
    Let's assume you have a price (revenue) condition type in your pricing procedure and you want one more condition type for the 100% discount (sales deduction). Ensure the discount condition type is maintained at the same calculation step as the price, and maintain a 100% condition record for the discount as well.
    While processing the SO, the system will show the price and the discount with the same value, and when the invoice is posted to accounting, the system will post the revenue and the discount to separate G/L accounts. For example, a $100 price with a 100% discount posts $100 to revenue and $100 to the discount account, for a net value of zero. This way you can achieve the solution.
    But kindly test this scenario in a sandbox or development system first!
    Regards
    Murali

  • Simulating Oracle I/O for storage layer testing

    There's an open-source utility called Flexible I/O (fio) that is used by kernel and driver developers to test I/O.
    We would like to use fio to create a very typical Oracle I/O load (async I/O, 8 KB block reads, etc.). The idea is that we can use this to test specific storage driver versions, driver parameters and kernel configuration options to determine stability, robustness and, of course, performance, without having to deal with installing and setting up the Oracle software layer. We can also test new driver releases this way, without having to go to the effort of duplicating an Oracle instance, database and workloads on that database.
    It will also enable us to provide this as a test harness to storage vendors and driver developers for simulating a typical Oracle I/O load, and it should trigger the same problems and issues that Oracle would if it were doing the I/O.
    Feasible? Or are there potential issues with this approach to be aware of?
    Any idea what Oracle uses for testing I/O, for example with Exadata Storage Cells and OFED drivers? What are other shops using to test storage systems, drivers and so on? (There can be a number of moving parts in the storage layer, and a test harness for this makes sense.)
    Any comments as to what fio parameters should be used to represent typical Oracle I/O?
    I will appreciate input on this. Thanks.

    Our intention is not really benchmarking; it is testing technical aspects of the I/O fabric layer. For example, fio can use shared memory, private memory, huge pages and so on as the memory buffer for its I/O. I've seen kernel panics with huge pages and heavy I/O, so this can be tested for a specific kernel and driver version combination. What about scatter/gather (sg) reads: does Oracle use this method? The idea is to simulate the exact type of I/O that Oracle does, push it, and determine how well the I/O subsystem holds up.
    So we're not really looking at simulating database I/O; we're looking at simulating what happens lower down when the database makes an I/O call. Do we need to set the table size for sg reads? Do huge pages impact stability? This will only be useful data if the actual I/O calls used match very closely those made by Oracle.
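    A minimal sketch of a fio job file that approximates this kind of load, assuming the libaio engine on Linux; the device path, queue depths and job counts are illustrative guesses, not an Oracle-published profile:

         # oracle-like.fio - rough sketch only; adjust before use.
         [global]
         # async I/O engine, as Oracle uses on Linux
         ioengine=libaio
         # bypass the page cache, like direct-path datafile I/O
         direct=1
         # hypothetical scratch device - change this before running!
         filename=/dev/sdX
         runtime=300
         time_based=1
         group_reporting=1

         [db-random-read]
         # 8 KB random reads approximate single-block datafile reads
         rw=randread
         bs=8k
         iodepth=32
         numjobs=4

         [db-multiblock-read]
         # larger sequential reads approximate multiblock/scattered reads
         rw=read
         bs=1m
         iodepth=8
         numjobs=1

    Running "fio oracle-like.fio" while watching for driver errors or panics is the kind of stress test described above; a write-heavy job (e.g. rw=randwrite) could be added to mimic datafile and redo writes.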

  • Security Certificate issue in Application Testing Suite for EBS/Forms Test Script

    I am running an OpenScript EBS/Forms test in OATS and my test is failing when it tries to load the first screen. It appears to be hitting a page stating that there is a problem with this website's security certificate. I have not had this problem before and would like some advice on fixing the issue. Thanks.

    Hi,
    I am facing the same problem. If anyone has a way out / workaround for the same, please share.
    Thanks in advance for your support

  • Oracle XML Parser for PL/SQL - troubles with charset

    Hi,
    I'm using the Oracle XML Parser for PL/SQL and have some trouble with the character set of the results of the xmldom.writeToBuffer and xmldom.writeToCLOB procedures.
    Some tags in my DOM documents contain text values in Russian (the server NLS charset is CL8ISO8859P5). When I write the document into a VARCHAR2 variable, the buffer content is in the UTF8 charset (converting UTF8 -> CL8ISO8859P5 works OK).
    xmldom.setCharset(doc, 'ISO-8859-5') just after xmldom.newDOMDocument has no effect.
    xmldom.setCharset(doc, 'CL8ISO8859P5') has no effect either.
    Explicitly specifying the charset in the third parameter of the
    xmldom.writeToBuffer and xmldom.writeToCLOB procedures has no effect.
    When I write the document into a CLOB and then read part of the CLOB into a VARCHAR2 buffer, the result contains '?' in place of all the Russian text characters.
    What is the problem?
    How can I force the XML Parser to write XML in the server charset?
    Oracle XML Parser for PL/SQL v 1.0.2

    I have the same problem, but in my case I am only allowed to use the XML Parser for PL/SQL.
    The character set 'WE8ISO8859P1' is used, and the language is Latvian.
    After parsing an XML document and printing its contents, all Latvian characters are replaced by "f".
    xmldom.setCharset(doc, 'WE8ISO8859P1') has no effect.
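    A minimal sketch of the call sequence being described, assuming the xmlparser/xmldom packages from the XML Parser for PL/SQL are installed; the document content and charset names are placeholders:

         DECLARE
           p    xmlparser.Parser;
           doc  xmldom.DOMDocument;
           buf  VARCHAR2(32767);
         BEGIN
           p := xmlparser.newParser;
           xmlparser.parseBuffer(p, '<root><name>sample text</name></root>');
           doc := xmlparser.getDocument(p);

           -- attempt to force the output charset (reported above as having no effect)
           xmldom.setCharset(doc, 'ISO-8859-5');

           -- third parameter explicitly requests the output charset
           xmldom.writeToBuffer(doc, buf, 'ISO-8859-5');
           dbms_output.put_line(buf);

           xmlparser.freeParser(p);
         END;
         /

    If the output is still UTF8, comparing convert(buf, 'CL8ISO8859P5', 'UTF8') against the expected text is one way to confirm that only the declared charset, not the data itself, is wrong.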

  • Sales invoice with 100% discount

    Hi,
    Can you tell me if there is a way in SAP for the system to allow a zero value to be reflected in the customer account and the reconciliation account.
    The scenario is: material A is sold at a price of $100 to a customer. A 100% discount is given, and the journal entry that gets generated is
    Discount A/C Dr  $100
    Sales A/C     Cr   $100.
    The requirement is to reflect the zero value (i.e. after the 100% discount) in the customer A/C and the recon A/C, so that the accounting document entry is
    Discount A/C  Dr  $100
    Customer A/C Dr $0.00
    Sales A/C      Cr   $100.
    When a discount of 99.99% is given, the customer A/C is displayed with a $0.01 value, but the aim is not to bill the customer, only to reflect the invoice in the customer A/C and the subsidiary ledger.
    Can you please advise whether this is even achievable in the system.
    Thanks and Regards,
    Shilpa.

    Hello,
    In SD we have the free goods scenario. Please check with your SD consultant whether it would be possible to configure your scenario there.
    Normally it is not possible to configure a term of payment with a 100% cash discount.
    Best Regards,
    Raju

  • Hardware Test CD for an eMac with a 1.13 GHz CPU

    The Hardware Test CD that came with this eMac tells me that it is not supported.
    Where do I find the download from Apple for a Hardware Test CD that this eMac will support and that will work on it?
    Thanks
    George Barlow

    There were no 1.13 GHz model eMacs; there were 700 and 800 MHz and 1.0, 1.25, and 1.42 GHz models. It would appear you have a hacked, overclocked model. I have no information on whether AHT will work correctly with an overclocked Mac.
    There's no Apple source for downloading the AHT disc for any Mac, as far as I know. There are third-party resellers who offer system discs for various Macs, or you can call the Apple Store, wade through the voice mail to get a live operator, and ask about ordering an eMac disc as an out-of-production item. Note that if the eMac originally came with OS X on DVD (i.e., came with a Restore DVD rather than a set of CDs), AHT would be a separate bootable partition on the DVD that can be used via Startup Manager: Select a Startup Volume.

  • Test topic for duke dollars test case

    Have had issues reported regarding not being able to assign Duke Dollars. This is a test to see if there's an issue or not. -- DanG

    it should look like:
    Forum Home > New To Java Technology
    Topic: test to assign Duke dollars
           Duke Dollars
           If you'd like, assign Duke Dollars to this topic to encourage
           others to answer your question. [More Info]  at the top of this thread.

  • How to generate test data for all the tables in Oracle

    I am planning to use PL/SQL to generate test data for all the tables in a schema. The schema name is given as an input parameter, along with the minimum number of records for master tables and the minimum number of records for child tables. The data should be consistent in the columns that are used for constraints, i.e. using the same column values.
    I am planning to implement something like
    execute sp_schema_data_gen (schemaname, minrecinmstrtbl, minrecsforchildtable);
    schemaname = owner,
    minrecinmstrtbl = minimum records to insert into each parent table,
    minrecsforchildtable = minimum records to insert into each child table of each master table;
    driving the process from all_tables where owner = schemaname,
    and all_tab_columns and all_constraints where owner = schemaname,
    using the dbms_random package.
    Does anyone have a better idea of how to do this? Is this functionality already there in the Oracle database?
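    A minimal sketch of the kind of loop being described, reduced to a single table with only NUMBER, DATE and VARCHAR2 columns and with no constraint or parent/child handling (which is the hard part of the real requirement); the procedure name and parameters are illustrative only:

         -- Sketch only: random rows for one table, driven by all_tab_columns and dbms_random.
         CREATE OR REPLACE PROCEDURE sp_table_data_gen (
             p_owner   IN VARCHAR2,
             p_table   IN VARCHAR2,
             p_numrecs IN PLS_INTEGER
         ) AS
             v_cols VARCHAR2(4000);
             v_vals VARCHAR2(4000);
         BEGIN
             FOR c IN (SELECT column_name, data_type, data_length
                         FROM all_tab_columns
                        WHERE owner = p_owner AND table_name = p_table
                        ORDER BY column_id)
             LOOP
                 v_cols := v_cols || c.column_name || ',';
                 v_vals := v_vals ||
                     CASE c.data_type
                       WHEN 'NUMBER' THEN 'trunc(dbms_random.value(1, 10000))'
                       WHEN 'DATE'   THEN 'sysdate - trunc(dbms_random.value(0, 365))'
                       ELSE 'dbms_random.string(''U'', ' || LEAST(c.data_length, 30) || ')'
                     END || ',';
             END LOOP;
             v_cols := RTRIM(v_cols, ',');
             v_vals := RTRIM(v_vals, ',');

             FOR i IN 1 .. p_numrecs LOOP
                 EXECUTE IMMEDIATE 'INSERT INTO ' || p_owner || '.' || p_table ||
                                   ' (' || v_cols || ') VALUES (' || v_vals || ')';
             END LOOP;
             COMMIT;
         END sp_table_data_gen;
         /

    A schema-level wrapper would then loop over all_tables (parents before children, ordered via all_constraints) and reuse generated key values so that foreign keys stay consistent.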

    Ah, damorgan, data, test data, metadata and table-driven processes. Love the stuff!
    There are two approaches you can take with this. I'll mention both and then ask which one you think you would find most useful for your requirements.
    One approach I would call the generic bottom-up approach, which is the one I think you are referring to.
    This system is a generic test data generator. It isn't designed to generate data for any particular existing table or application but is the general-case solution.
    Building on damorgan's advice, define the basic hierarchy: table collection, tables, data; then start at the data level.
    1. Identify/document the data types that you need to support. Start small (NUMBER, VARCHAR2, DATE) and add as you go along
    2. For each data type identify the functionality and attributes that you need. For instance for VARCHAR2
    a. min length - the minimum length to generate
    b. max length - the maximum length
    c. prefix - a prefix for the generated data; e.g. for an address field you might want a 'add1' prefix
    d. suffix - a suffix for the generated data; see prefix
    e. whether to generate NULLs
    3. For NUMBER you will probably want at least precision and scale but might want minimum and maximum values or even min/max precision,
    min/max scale.
    4. Store the attribute combinations in Oracle tables.
    5. Build functionality for each data type that can create the range and type of data that you need. These functions should take parameters that can be used to control the attributes and the amount of data generated (a sketch of one such function follows after these steps).
    6. At the table level you will need business rules that control how the different columns of the table relate to each other. For example, for ADDRESS information your business rule might be that ADDRESS1, CITY, STATE, ZIP are required and ADDRESS2 is optional.
    7. Add table-level processes, driven by the saved metadata, that can generate data at the record level by leveraging the data type functionality you have built previously.
    8. Then add the metadata, business rules and functionality to control the table-to-table relationships, that is, the data model. You need the same DEPTNO values in the SCOTT.EMP table that exist in the SCOTT.DEPT table.
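    A minimal sketch of the data-type-level functionality from steps 2 and 5, for VARCHAR2 only; the attribute names mirror the list above, and the NULL handling via a simple percentage is an assumption:

         -- Sketch: random VARCHAR2 generator driven by the attributes listed above.
         CREATE OR REPLACE FUNCTION gen_varchar2 (
             p_min_len  IN PLS_INTEGER,
             p_max_len  IN PLS_INTEGER,
             p_prefix   IN VARCHAR2 DEFAULT NULL,
             p_suffix   IN VARCHAR2 DEFAULT NULL,
             p_null_pct IN NUMBER   DEFAULT 0  -- 0..100: chance of returning NULL
         ) RETURN VARCHAR2 AS
             v_len PLS_INTEGER;
         BEGIN
             IF dbms_random.value(0, 100) < p_null_pct THEN
                 RETURN NULL;
             END IF;
             v_len := trunc(dbms_random.value(p_min_len, p_max_len + 1));
             RETURN p_prefix || dbms_random.string('l', v_len) || p_suffix;
         END gen_varchar2;
         /

    For example, gen_varchar2(5, 30, 'add1', NULL, 10) would give address-like values that are NULL roughly 10% of the time; the attribute combinations themselves would live in the metadata tables from step 4.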
    The second approach I have used more often. I call it the top-down approach, and I use it when test data is needed for an existing system. The main use case here is to avoid having to copy production data to QA, TEST or DEV environments.
    QA people want to test with data that they are familiar with: names, companies, code values. I've found they often aren't fond of random character strings for the names of things.
    I use this second approach for mature systems where there is already plenty of data to choose from. It involves selecting subsets of data from each of the existing tables and saving that data in a set of test tables. This data can then be used for regression testing and for automated unit testing of existing functionality and of functionality that is being developed.
    QA can use data they are already familiar with and can test the application (GUI?) interface on that data to see if they get the expected changes.
    For each table to be tested (e.g. DEPT) I create two test system tables: a BEFORE table and an EXPECTED table.
    1. DEPT_TEST_BEFORE
         This table has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look BEFORE the
         test for that test case is performed.
         CREATE TABLE DEPT_TEST_BEFORE (
             TESTCASE NUMBER,
             DEPTNO   NUMBER(2),
             DNAME    VARCHAR2(14 BYTE),
             LOC      VARCHAR2(13 BYTE)
         );
    2. DEPT_TEST_EXPECTED
         This table also has all DEPT table columns plus a TESTCASE column.
         It holds DEPT-image rows for each test case that show the row as it should look AFTER the
         test for that test case is performed.
    Each of these tables is a mirror image of the actual application table with one new column
    added that contains the test case number (TESTCASE).
    To create test case #3, identify or create the DEPT records you want to use for test case #3.
    Insert these records into DEPT_TEST_BEFORE:
         INSERT INTO DEPT_TEST_BEFORE
         SELECT 3, D.* FROM DEPT D WHERE DEPTNO = 20;
    Insert records for test case #3 into DEPT_TEST_EXPECTED that show the rows as they should
    look after test #3 is run. For example, if test #3 creates one new record, add all the
    records from the BEFORE data set and then add one more row for the new record.
    When you want to run a test case (say test case #3), the process is basically (ignoring for this
    illustration that there is a foreign key between DEPT and EMP):
    1. Delete the records from SCOTT.DEPT that correspond to the test case #3 DEPT records.
              DELETE FROM DEPT
              WHERE DEPTNO IN (SELECT DEPTNO FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3);
    2. Insert the test data set records into SCOTT.DEPT for test case #3.
              INSERT INTO DEPT
              SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_BEFORE WHERE TESTCASE = 3;
    3. Perform the test.
    4. Compare the actual results with the expected results.
         This is done by a function or query that compares the records in DEPT with the records
         in DEPT_TEST_EXPECTED for test #3 (a sketch follows below).
         I usually store these results in yet another table or just report them out.
    5. Report out the differences.
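    A minimal sketch of the comparison in step 4, assuming the DEPT_TEST_EXPECTED layout above; any rows returned by either query are differences to report:

         -- Rows the test produced but were not expected
         SELECT DEPTNO, DNAME, LOC FROM DEPT
         MINUS
         SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3;

         -- Expected rows that are missing after the test
         SELECT DEPTNO, DNAME, LOC FROM DEPT_TEST_EXPECTED WHERE TESTCASE = 3
         MINUS
         SELECT DEPTNO, DNAME, LOC FROM DEPT;

    An empty result from both queries means test case #3 passed.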
    This second approach uses data the users (QA) are already familiar with, is scalable, and makes it easy to add new data that meets business requirements.
    It is also easy to automatically generate the necessary tables and the test setup/breakdown using a table-driven metadata approach. Adding a new test table is as easy as calling a stored procedure; the procedure can generate the DDL or create the actual tables needed for the BEFORE and AFTER snapshots.
    The main disadvantage is that existing data will almost never cover the corner cases, but you can add data for these. By corner cases I mean data that defines the limits of a data type: a VARCHAR2(30) name field should have at least one test record with a name that is 30 characters long.
    Which of these approaches makes the most sense for you?

  • How to execute an MTS (Master Test Script) in an SAP eCATT Test Configuration for multiple variants

    I have an MTS (Master Test Script) which references 4 test scripts. As of now, I am able to run this MTS directly by opening it as a test script and running it with the default test data.
    Now I want to run this MTS in an SAP eCATT Test Configuration for multiple variants/test data sets. I am not able to see any parameters if I enter the MTS in the Test Configuration.
    The thread below is similar to my requirement, but I am not able to understand the solution properly.
    eCATT - how to run multiple test scripts with input variables in different test scripts?
    Any help in this case would be highly appreciated.
    Thanks & Regards,
    Vaibhav Gupta


  • Oracle Database Gateway for ODBC with Oracle XE

    Dear Colleague,
    Is it possible to use the Oracle Database Gateway for ODBC in conjunction with Oracle Express Edition, or does one have to use the Standard or Enterprise Edition? If yes, are there any restrictions when using the Oracle Database Gateway for ODBC in the context of Oracle XE?
    Best regards,
    Randy

    Hi,
    As it says, as long as you have an RDBMS license you do not need a separate license for DG4ODBC. If the confusion is where it says "Oracle Database Gateway for ODBC can be installed and used on a machine different", that is just to clarify that you can run the gateway on a completely separate machine from where the licensed RDBMS is running; you can also run DG4ODBC on the same machine where the RDBMS is installed.
    If you install 11g DG4ODBC on the same machine where you have a 10g RDBMS, then it must be installed into a separate ORACLE_HOME.
    If this is still not clear, please get back to us and let us know exactly what needs clarifying.
    Regards,
    Mike

  • Good strategies for building ATG EAR with Maven?

    It's been a while since I've looked at this. Is there a clean set of choices for building an ATG-based EAR using Maven? Is the best we can do to use the antrun plugin, or is there a cleaner integration with Maven? Is there reasonable documentation someone can point me to that details the strategies?

    Yes, while using Maven, chances are that we end up using an Ant task inside the POM to invoke ATG's runAssembler to build the EAR (a rough sketch of this is at the end of this post). Using the Ant task within Maven helps us accomplish the task, but we tend to lose the power of Maven that way. It would be better to either use Ant for the whole build or just Maven, without mixing them.
    There is the ATG DUST (Dynamo Unit and System Tests) framework for building JUnit tests for ATG applications, which uses Maven.
    http://atgdust.sourceforge.net/first-test.html
    But I think DUST uses Maven just to build the JARs and does not use ATG's assembler.
    I have never tried it, but you may want to take a look at this Maven plugin, which seems to be capable of generating an ATG-based EAR.
    http://jira.codehaus.org/browse/MOJO-1116
    Although it does not seem to have been updated of late. The plugin page mentions ATG 7.1 while the latest ATG offering is 10.1.1, but I guess it might work with a recent ATG version as well (maybe with some tweaks), as I don't think there have been any major changes in the way ATG's runAssembler works and generates the ATG EAR.
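    A minimal sketch of the antrun approach described above; the ATG installation path, module name, output EAR name and runAssembler arguments are hypothetical placeholders and vary by ATG version:

         <!-- Sketch only: bind ATG's runAssembler to the Maven package phase. -->
         <plugin>
           <groupId>org.apache.maven.plugins</groupId>
           <artifactId>maven-antrun-plugin</artifactId>
           <executions>
             <execution>
               <phase>package</phase>
               <goals>
                 <goal>run</goal>
               </goals>
               <configuration>
                 <target>
                   <!-- ATG_HOME, the module list and the EAR name are placeholders -->
                   <exec executable="${env.ATG_HOME}/home/bin/runAssembler" failonerror="true">
                     <arg value="${project.build.directory}/MyApp.ear"/>
                     <arg value="-m"/>
                     <arg value="MyModule"/>
                   </exec>
                 </target>
               </configuration>
             </execution>
           </executions>
         </plugin>

    As noted above, this just shells out to runAssembler, so Maven's dependency management never sees the ATG modules; it is a pragmatic bridge rather than a real Maven integration.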

  • Repackaging the Oracle 9i client for distribution

    Can anyone provide assistance? I have made a few MSIs of this and they do not work very well. I am currently trying to follow Oracle's instructions for their Software Packager with no luck at all. I keep banging my head looking for osb.bat (which is nowhere to be found). I am following the instructions located at
    http://otn.oracle.com/software/products/osp/htdocs/readmeotn.htm
    Can anyone tell me where I can get hold of an MSI, or any other way to use Oracle's Software Packager?
    Thanks
    Chris
    [email protected]

    AFAIK, nothing. Download it again. :((
    The whole file was not downloaded, so you will have to start again.

  • Jobs for OCP

    Are there entry-level jobs for an OCP Forms developer with no commercial experience in the UK, preferably London? Is there any other way to find out, such as a forum, a placement site or any other means?

    user7553741 wrote:
    Are there entry-level jobs for an OCP Forms developer with no commercial experience in the UK, preferably London? Is there any other way to find out, such as a forum, a placement site or any other means?
    There is always a chance, but I suspect it is relatively thin. You should be able to find some job agencies to see what they say ... and ignore the most optimistic ... You may get lucky, but in my opinion be prepared to be disappointed.
