Schema refresh in the test DB

Dear All/Aman,
I have one database with two schemas; every day I take a schema-level export and import it into my test server.
The import takes a long time every day even though there are only small changes in the schemas.
Can I use INCTYPE for an incremental import to reduce the time?
The intention is to refresh the test schemas every day.
Could you please suggest how to minimize the schema refresh time?
Is incremental import available in 10g at the schema level, or do I need to go for RMAN?
Please provide your input at the earliest.

The script which I posted is just an example of the approach I was thinking of going through.
The setup is like this: every day, two schemas are exported from the production DB and imported into the test DB.
The test DB is a UAT environment where users may run any DML they like; however, the business requirement is that these two schemas get refreshed every day at a certain time.
So my requirement is: whatever data production populates in a particular table of the schema for the day should be reflected in test after the import, no matter what the users did in the test DB.
But I don't want to import the full schemas every time, as it takes too much time, so I am looking for something like an incremental import that imports a table only if it finds that something has changed in it, and otherwise skips it.
I hope you can understand my requirement.
Let me know if you need any further details to help me.
Regards,
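For what it's worth, the classic exp/imp incremental exports (INCTYPE) were made obsolete back in 9i as far as I know, and Data Pump has no incremental schema mode, so one common pattern for this kind of daily overwrite is a schema-mode Data Pump import over a database link with TABLE_EXISTS_ACTION=REPLACE. A minimal sketch, assuming a hypothetical PROD_LINK database link from test to production, placeholder schema names, and a suitably privileged account running the job:

DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  -- network-mode import: pulls the schemas straight from production, no dump file
  h := DBMS_DATAPUMP.open(operation   => 'IMPORT',
                          job_mode    => 'SCHEMA',
                          remote_link => 'PROD_LINK',
                          job_name    => 'NIGHTLY_SCHEMA_REFRESH');
  DBMS_DATAPUMP.metadata_filter(h, 'SCHEMA_EXPR', q'[IN ('SCHEMA1','SCHEMA2')]');
  -- drop and re-create each table from the production copy, discarding test-side changes
  DBMS_DATAPUMP.set_parameter(h, 'TABLE_EXISTS_ACTION', 'REPLACE');
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.wait_for_job(h, state);
  DBMS_OUTPUT.put_line('Refresh finished with status: ' || state);
END;
/

Scheduled from DBMS_SCHEDULER (or cron), this would give the daily refresh at a fixed time; note that it still copies whole tables, so it saves the dump-file step rather than being truly incremental.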

Similar Messages

  • Scheduled refresh - Test connections settings validation failed: one or more connections did not pass the test

    I copied a workbook that has a working scheduled refresh,
    modified the data model, added some new reports,
    added a new source (PQ from the Azure Marketplace),
    and uploaded the workbook to a new Power BI location.
    I get the following error when I try to turn on scheduled refresh - even if I select just one data source that is still identical to the original (working) workbook:
    Test connections settings validation failed: one or more connections did not pass the test
    Technical Details:
    Correlation ID: ddcab6b8-ff2c-4881-9d8e-eeabbd23dff2
    Date and Time: 12/12/2014 06:13:06 AM (UTC)
    what's the problem?
    Jakub @ Adelaide, Australia Blog

    Still a problem and not just for me but other users going by the threads on here...
    https://social.technet.microsoft.com/Forums/en-US/eb8682f4-9b40-456a-a7f1-45627a0f4ff0/cannot-schedule-refresh?forum=powerbiforoffice365#eb8682f4-9b40-456a-a7f1-45627a0f4ff0
    Applies to all worksheets that utilise a PQ connection registered in the gateway. (my gateway is on an azure VM)
    Existing worksheets receive this warning. New workbooks that utilise these connections also receive this warning
    Adding a new PQ to the gateway and a new workbook that uses the new connection also receives this warning.
    Every PQ connection I have receives this warning. My PQ connections are all to web resources. Most are to the workflowmax.com API, and one is to the Azure Marketplace to retrieve data for a date dimension.
    Note: this does not prevent me from scheduling or triggering a refresh, and the refresh itself completes successfully. It's just a weird warning that caused confusion as it sounds like the actual refresh will fail as well seeing as the connection test failed.
    Jakub @ Adelaide, Australia Blog

  • How to synchronize test schema objects with the prod schema objects.

    Hi,
    I have a requirement to synchronize the test schema objects with the production schema objects. Please let me know:
    1. Is there a standardized method for this activity?
    2. Are there Oracle utilities for this task?
    3. If I had to do this job manually, could you give me a checklist, if any exists?
    Thanks
    Purushotham M

    http://www.oracle.com/technetwork/issue-archive/2012/12-sep/o52sqldev-1735911.html
    You could try the database diff tool in SQL Developer (but there are some licence restrictions).
    I don't know your database version; you could also try the DBMS_COMPARISON package.
    Look at this link http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_comparison.htm
    Another solution is to create a db link between the test and production databases, and then you can try different types of queries like
    select table_name from user_tables
    minus
    select table_name from user_tables@db_link_to_other_database
    And you can do the same for columns, indexes and so on (see the column-level example after this reply).
    But you must have proper DDL scripts for this, to generate a sync script.
    There is also a question about the work process: you are doing the sync in reverse order (from production to test). The test DB is for testing; after testing you go to the production DB with proper DDL and DML scripts, so these schemas shouldn't be different in the first place (talking about the schemas here, not the data).
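    For instance, a column-level comparison over the same (hypothetical) db link might look like this; running the minus in the other direction shows what production has that test is missing:
    select table_name, column_name, data_type, data_length
    from   user_tab_columns
    minus
    select table_name, column_name, data_type, data_length
    from   user_tab_columns@db_link_to_other_database;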

  • Steps for Refresh the Test Database with cold-backup

    Hello everybody !
    Can anybody write out the steps to refresh the test DB (T1, which already exists) from a cold backup of the production database (P1, on another machine)? All the datafiles are available in the production backup; the controlfile and redo log files of the production database are not. Both databases are 9.2.0.8 and are on different machines with the same OS (AIX).
    Thanks in advance !

    Steps
    1.) Before shutting down the P1 DB for the cold backup, execute:
    alter database backup controlfile to trace;
    2.) Copy the created trace file from udump to the test server.
    3.) Shut down the P1 DB and take a cold backup containing all datafiles.
    4.) Copy the datafiles to the test server.
    5.) Edit the trace file copied in step 2.
    6.) As you are changing the name of the DB, use the SET option (in place of REUSE).
    7.) Also remove the unwanted portion.
    8.) Rename the trace file to <something>.sql.
    9.) Start the test DB in NOMOUNT stage.
    10.) Run the SQL script created above (see the sketch after this reply).
    11.) This will create the controlfiles and place the DB in MOUNT stage.
    12.) Issue ALTER DATABASE OPEN RESETLOGS.
    13.) Add tempfiles to the temp tablespace.
    regards
    Pravin
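    For reference, after steps 5-8 the edited trace file usually boils down to a CREATE CONTROLFILE script roughly like the sketch below; the file paths, sizes, character set and MAX* limits here are placeholders, and the real values should be taken from your own trace:
    CREATE CONTROLFILE SET DATABASE "T1" RESETLOGS NOARCHIVELOG
        MAXLOGFILES 16
        MAXLOGMEMBERS 3
        MAXDATAFILES 100
        MAXINSTANCES 1
    LOGFILE
        GROUP 1 '/u01/oradata/T1/redo01.log' SIZE 100M,
        GROUP 2 '/u01/oradata/T1/redo02.log' SIZE 100M
    DATAFILE
        '/u01/oradata/T1/system01.dbf',
        '/u01/oradata/T1/undotbs01.dbf',
        '/u01/oradata/T1/users01.dbf'
    CHARACTER SET WE8ISO8859P1
    ;
    -- then, per steps 12 and 13:
    ALTER DATABASE OPEN RESETLOGS;
    ALTER TABLESPACE temp ADD TEMPFILE '/u01/oradata/T1/temp01.dbf' SIZE 500M REUSE;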

  • Schema refresh.

    I have a database server that supports a number of separate applications, each application having its own schema and tablespace(s). To refresh the test database from live (or dev from test) we currently use transportable tablespaces to copy the data, and a Data Pump export/import to take care of objects held in the data dictionary (e.g. packages/procedures).
    Recently we hit a problem: the developers have started to use Oracle Text, and when retesting our refresh method we found that the users' stop lists do not get recreated in the database being refreshed. This appears to be true of any similar pieces of information held in the CTXSYS schema. As the current usage of this feature is quite limited, this isn't a problem at the moment, since the stop lists can be recreated manually. What I would like to know is whether anyone has come across similar features where a Data Pump export/import will not transfer the objects/information.
    Chris
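    As an illustration of the manual recreation mentioned above, a user stop list can be rebuilt in the refreshed database with CTX_DDL; the names and words below are placeholders, and in practice the add_stopword calls would be generated from the stop-list views in the source database (e.g. CTX_STOPWORDS):
    exec ctx_ddl.create_stoplist('my_stoplist', 'BASIC_STOPLIST');
    exec ctx_ddl.add_stopword('my_stoplist', 'the');
    exec ctx_ddl.add_stopword('my_stoplist', 'and');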

    Hi ,
    The main theme of the initial load is to synchronize the source and target data.
    Before executing an initial load, disable DDL extraction and replication. DDL processing is controlled by the DDL parameter in the Extract and Replicat parameter files.
    Please refer to the link below; the prerequisites are clearly mentioned there.
    http://docs.oracle.com/goldengate/1212/gg-winux/GWUAD/wu_initsync.htm
    In it, check the subtopic 16.1.2, "Prerequisites for Initial Load".
    Regards,
    Veera

  • Schema Refresh using exp/imp.

    Hello All,
    I want to perform a schema refresh of the SAMPLE user from the production to the testing environment using export/import.
    Could you please tell me the command to perform it?
    Also, could anyone please tell me whether the same user (SAMPLE) in the test environment gets dropped before the import is done?
    Can I perform the exp/imp using sys/system or the user SAMPLE?

    tvenkatesh07 wrote:
    Hello All,
    I want to perform a schema refresh of the SAMPLE user from the production to the testing environment using export/import.
    Could you please tell me the command to perform it?
    Also, could anyone please tell me whether the same user (SAMPLE) in the test environment gets dropped before the import is done?
    Can I perform the exp/imp using sys/system or the user SAMPLE?
    If you're running 10g, then use Data Pump and read the documentation:
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#i1007466
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#i1007653
    My Oracle Video Tutorials - http://kamranagayev.wordpress.com/oracle-video-tutorials/
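    On the question of whether the SAMPLE user gets dropped: neither imp nor impdp drops an existing schema for you, so for a clean refresh the target schema is usually dropped and recreated first, along the lines of the sketch below (password and tablespaces are placeholders; the import itself is then typically run as SYSTEM or another DBA account):
    drop user sample cascade;
    create user sample identified by "change_me"
      default tablespace users temporary tablespace temp;
    grant connect, resource to sample;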

  • Schema Refresh activity

    Hi Gurus,
    I am doing a schema refresh activity from the production DB to the test DB.
    In PRODUCTION, I have a schema called LMS whose default tablespace is LMS and temporary tablespace is TEMP.
    In TEST, the schema is LMSTEST and its default tablespace and temporary tablespace are LMSTEST and TEMP respectively.
    But there is also a tablespace called LMS in the TEST database.
    I have dropped the user from TEST,
    created a user with the same name,
    given it LMSTEST and TEMP as the default and temporary tablespaces,
    given default roles to that user,
    and given system privileges as well.
    My problem is that while importing, my data is getting inserted into LMS, whereas it should go into LMSTEST since I made that the default tablespace. Why has this happened?
    I have also set the quota on LMS to zero in the TEST database, but the data is still getting inserted into the LMS tablespace and not into the LMSTEST tablespace.
    Edited by: user13299764 on Aug 4, 2010 2:51 AM

    With the export/import utility, during the import the data goes to the same tablespace in the target database as it used in the source database, in your case the LMS tablespace. So follow one of the options below.
    1. Either assign LMS as the default tablespace of the LMSTEST user,
    or
    2. After the import is over, move all tables to the LMSTEST tablespace and rebuild the indexes in the LMSTEST tablespace (see the generated statements after this reply),
    or
    3. Take the export using Data Pump and use the REMAP_TABLESPACE option during the import.
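    For option 2, the move/rebuild statements can be generated from the dictionary; a sketch to run as the LMSTEST user (spool the output, review it, then execute it):
    select 'alter table '||table_name||' move tablespace LMSTEST;'
    from   user_tables
    where  tablespace_name = 'LMS';
    -- indexes on moved tables become UNUSABLE and must be rebuilt anyway
    select 'alter index '||index_name||' rebuild tablespace LMSTEST;'
    from   user_indexes
    where  tablespace_name = 'LMS';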

  • The build directory of the test run either does not exist or access permission is required.

    I am trying to run automated tests.
    I followed the steps described on the web, but I have two issues that I cannot solve.
    1. I created a console project, added a build definition without a drop folder, and queued builds that succeeded.
    2. I created a C# unit test, added a build definition, and ran it from VS; it worked and did what I wanted it to do.
    3. I built a lab with one machine and defined the controller on this PC (the machine in the lab is another PC).
    4. I created a test suite with test cases and, from VS 2013, associated the automation method that I created in step 2.
    5. When I try to associate the builds with the test plan I have PROBLEM (1):
    This happens with every new build that I try to assign - all the builds appear after a refresh, but when I press the "assign to plan" button
    a pop-up is shown with the error "The build that you selected for this plan no longer exists". The build does show up as the build in use, but there are no work items under it - the list is empty even though there are two tests under this test suite.
    And then when I try to run the automated tests I get PROBLEM (2):
    "The build directory of the test run either does not exist or access permission is required", and the test fails.
    Please Help
    Roey

    Hi Roey,
    Thank you for posting in MSDN forum.
    According to the error message:
    (1) A pop-up is shown with the error: "The build that you selected for this plan no longer exists"
    I tried to create a build definition without a drop folder location for a unit test solution from the VS IDE, then selected the Build in use option to add the available builds and clicked Assign to plan. I found that I get the same error message as yours, as in the following screenshot.
    However, when we create a build definition with a drop folder location for this same unit test solution, build it successfully in the VS IDE,
    and then add the available builds and click Assign to plan in MTM, it works fine.
    Therefore, I assume that the issue is that you did not specify a build directory for the automated test case. MTM needs to know the drop location of the build containing your tests.
    http://stackoverflow.com/questions/20033217/couldnt-run-my-test-using-microsoft-test-manager
    In addition, I did some research about problem (2): "The build directory of the test run either does not exist or access permission is required."
    I know that this error message occurs in either of the following conditions:
    1. The account under which the test controller is running does not have read permission on the build directory. (The build directory is the same as the drop location of the build associated with this test run.)
    2. The build directory itself does not exist.
    So please refer to the following blog to check this issue:
    http://blogs.msdn.com/b/aseemb/archive/2009/11/25/error-starting-the-test-run-build-directory-of-the-test-run-is-not-specified-or-does-not-exist.aspx
    Therefore, I suggest you specify a drop location for the build when you run the automated test from MTM, and then check the issue again.
    Best Regards,

  • WCF Published Orch - Could not find a base address that matches scheme http for the endpoint with binding MetadataExchangeHttpBinding

    I have an orchestration published as a web service. It was working fine in our test environment; we deployed to production and are now getting this error. The website wouldn't go into the BizTalk MSI, so we copied the files. We reset the authentication
    to match the test system (we are using Basic Auth).
    When we try to browse the web service in the browser, it prompts for userid/password; we enter it, and then it gives the following error.
    I'm not even sure what it means by "base address". If the URL is https://prod.mydomain.com/myapp/myservice.svc, would https://prod.mydomain.com be the base address? In the test environment, the URL is https://test.mydomain.com/myapp/myservice.svc.
    In both environments, we have a customer calling this web service.
    Also, I don't know what it means by "scheme http". We are using https:... in the URL.
    I'm thinking this is either security related, something to do with the app pool being different, or maybe something to do with bindings.
    Thanks,
    Neal Walter
    http://MyLifeIsMyMessage.net
    Web.config:
        <services>
          <!-- Note: the service name must match the configuration name for the service implementation. -->
          <service name="Microsoft.BizTalk.Adapter.Wcf.Runtime.BizTalkServiceInstance" behaviorConfiguration="ServiceBehaviorConfiguration">
            <endpoint name="HttpMexEndpoint" address="mex" binding="mexHttpBinding" bindingConfiguration="" contract="IMetadataExchange" />
            <!--<endpoint name="HttpsMexEndpoint" address="mex" binding="mexHttpsBinding" bindingConfiguration="" contract="IMetadataExchange" />-->
          </service>
        </services>
      </system.serviceModel>
        <system.webServer>
            <security>
                <authorization>
                    <remove users="*" roles="" verbs="" />
                    <add accessType="Allow" users="myCustomer" />
                </authorization>
            </security>
        </system.webServer>
    Server Error in '/eSecuritelIn' Application.
    Could not find a base address that matches scheme http for the endpoint with binding MetadataExchangeHttpBinding. Registered
    base address schemes are [https].
    Description:
    An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the
    code. Exception Details: System.InvalidOperationException: Could not find a base address that matches scheme http for the endpoint with binding MetadataExchangeHttpBinding. Registered base address schemes are [https].
    Source Error:
    An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using
    the exception stack trace below.
    Stack Trace:
    [InvalidOperationException: Could not find a base address that matches scheme http for the endpoint with binding MetadataExchangeHttpBinding. Registered base address schemes
    are [https].]
       System.ServiceModel.ServiceHostBase.MakeAbsoluteUri(Uri relativeOrAbsoluteUri, Binding binding, UriSchemeKeyedCollection baseAddresses) +16582113
       System.ServiceModel.Description.ConfigLoader.LoadServiceDescription(ServiceHostBase host, ServiceDescription description, ServiceElement serviceElement, Action`1
    addBaseAddress) +1082
       System.ServiceModel.ServiceHostBase.ApplyConfiguration() +156
       System.ServiceModel.ServiceHostBase.InitializeDescription(UriSchemeKeyedCollection baseAddresses) +215
       System.ServiceModel.ServiceHost..ctor(Object singletonInstance, Uri[] baseAddresses) +400
       Microsoft.BizTalk.Adapter.Wcf.Runtime.WebServiceHost`3..ctor(IsolatedReceiverType isolatedReceiver, BizTalkServiceInstance serviceInstance, Uri[] baseAddresses)
    +36
       Microsoft.BizTalk.Adapter.Wcf.Runtime.WebServiceHostFactory`3.CreateServiceHost(String constructorString, Uri[] baseAddresses) +533
       System.ServiceModel.HostingManager.CreateService(String normalizedVirtualPath) +1413
       System.ServiceModel.HostingManager.ActivateService(String normalizedVirtualPath) +50
       System.ServiceModel.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath) +1172
    [ServiceActivationException: The service '/eSecuritelIn/eSecuritelIn_OrchPublished_RepairEquipmentService.svc' cannot be activated due to an exception during compilation. 
    The exception message is: Could not find a base address that matches scheme http for the endpoint with binding MetadataExchangeHttpBinding. Registered base address schemes are [https]..]
       System.Runtime.AsyncResult.End(IAsyncResult result) +901424
       System.ServiceModel.Activation.HostedHttpRequestAsyncResult.End(IAsyncResult result) +178702
       System.Web.AsyncEventExecutionStep.OnAsyncEventCompletion(IAsyncResult ar) +107
    Version Information: Microsoft .NET Framework Version:4.0.30319; ASP.NET
    Version:4.0.30319.272

    When you want to migrate web applications from one environment to another, there are multiple ways to achieve it.
    1) Migrate in the MSI: export the MSI and select the option for your web application; however, I don't recommend this option.
    2) Browse the folder for the web application (right-click on the web app and browse). Copy this folder (normally within the inetpub folder) to the inetpub folder of the other environment. Later, create the application from IIS Manager.
    Are you not using the same binding file?
    Check your web.config file and see if the endpoint is configured for mexHttpBinding. Since the error says the registered base address schemes are [https], the mex endpoint has to use the https variant - in the web.config shown above that means commenting out the HttpMexEndpoint line and uncommenting the HttpsMexEndpoint line.
    Old: binding="mexHttpBinding"
    New: binding="mexHttpsBinding"
    web.config snippet:
    <services>
      <service behaviorConfiguration="ServiceBehavior" name="LIMS.UI.Web.WCFServices.Accessioning.QuickDataEntryService">
        <endpoint behaviorConfiguration="AspNetAjaxBehavior" binding="webHttpBinding" bindingConfiguration="webBinding"
                  contract="LIMS.UI.Web.WCFServices.Accessioning.QuickDataEntryService" />
        <endpoint address="mex" binding="mexHttpsBinding" contract="IMetadataExchange" />
      </service>
    </services>
    Also look into these articles: 'How to fix: "Could not find a base address that matches scheme http for the endpoint with binding WebHttpBinding" errors' and 'Moving to https = Could not find a base address that matches scheme'.
    Thanks,
    Prashant
    Please mark this post accordingly if it answers your query or is helpful.

  • Best practice for the test environment & DBA plan activities documents

    Dears,
    In our company, we made a hardware sizing.
    We have three environments (Test/Development, Training, Production).
    But the test environment servers are smaller than the production environment servers.
    My question is:
    What is the best practice for the test environment?
    (Are there any recommendations from Oracle related to this, or any PDF files that could help me?)
    Also, can I please have a detailed document regarding the DBA plan activities?
    I appreciate your help and advice.
    Thanks
    Edited by: user4520487 on Mar 3, 2009 11:08 PM

    Follow your build document, using the same steps you used to build production.
    You should know where all your code is. You can use the deployment manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
    It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
    -Kevin

  • Schema refresh issue

    Hi everyone,
    Can anybody correct this? I have the following query and there's some error in it. After executing a schema refresh using export & import, a count comparison of database objects needs to be done:
    -- SELECT 'TRUNCATE TABLE '||OWNER||'.'||TABLE_NAME||' ;' FROM DBA_TABLES WHERE OWNER='PRICING' order by  TABLE_NAME;
    SQL> SELECT 'select count(*) from  '||OWNER||'.'||TABLE_NAME||' ;' FROM DBA_TABLES WHERE OWNER='PRICING' order by  TABLE_NAME;
    The expected output was each table name in the schema followed by its corresponding number of records, but it wasn't showing correctly.

    Hi,
    What error do you get? Can you please share the error log here?
    This is a dynamic query, so it will generate SQL statements like
    select count(*) from  SCOTT.BONUS ;
    select count(*) from  SCOTT.DEPT ;
    select count(*) from  SCOTT.EMP ;
    You can try something like this:
    spool count.sql
    SELECT 'select count(*) from  '||OWNER||'.'||TABLE_NAME||' ;' FROM DBA_TABLES WHERE OWNER='PRICING' order by  TABLE_NAME;
    spool off
    @count.sql
    Hope this helps.
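    A variant that also prints each table name next to its count, so the two schemas are easier to compare side by side (a sketch; the SET commands just keep the generated script clean):
    set heading off feedback off pagesize 0 linesize 200 trimspool on
    spool count_tables.sql
    SELECT 'select '''||owner||'.'||table_name||''' tab, count(*) from '||owner||'.'||table_name||';'
    FROM dba_tables WHERE owner = 'PRICING' ORDER BY table_name;
    spool off
    @count_tables.sql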

  • Compare 2 schemas and get the difference with .sql file.

    Hi,
    I am using ORACLE DATABASE 11g R2 and ORACLE Linux 5.
    I want to automate a very lengthy process.
    We have a tool named SVN, in which all the developers keep their updated scripts.
    We have two schemas: one is used for development, and when all the development looks good we implement the scripts on the final schema, which is also used by the testing people.
    Every day we need to check the modifications in the development schema, which we do by observing the updated scripts in SVN. So we get the scripts that were modified.
    We will run these scripts against a sample schema and then compare the objects in this schema with the same objects in our final schema. If an object in the sample schema is different from the final schema, it should give me the
    script to create the same kind of object in the final schema.
    Below is a proper example to explain :-
    Developement Schema :- 'DEV'
    SVN Script Schema :- 'SVNscript'
    Main Schema :- 'MAIN'
    1) On Monday the developers modified/added 4 tables in the development schema 'DEV'; after working for a day they found that the changes were necessary and so checked them in to SVN.
    2) On Tuesday morning we found 4 scripts which had been modified/added in SVN.
    The Scripts were as follows :-
    1St table :- An extra column was added.
    2nd table :- An index was created on it.
    3rd table :- 2 Columns were dropped from it.
    4th table :- A new table is added in the schema.
    Now, taking these 4 scripts, I am going to execute them in a sample schema named SVNscript.
    The SVNscript schema will now have 4 tables with the proper properties (columns, indexes, ...). This is the final table structure we want in our 'MAIN' schema.
    Coming to the MAIN schema: it is our most important schema, and all the development work is finally posted here. Consider that the MAIN schema has around 1000 tables, 500 functions/procedures/packages and many more DB objects.
    I want to compare (one-way compare) from the SVNscript schema --> MAIN schema: whether the objects present in the SVNscript schema are the same in the MAIN schema or not. If they are not the same, then this code should generate a .sql script for me which I should be able to run on the MAIN schema.
    The output .sql scripts should be something like this :-
    1st Table :- Alter table add...
    2nd Table :- Create index on table...
    3rd Table :- Alter table drop...
    4th table :- Create table tablename...
    I found a link while trying this, but it is not a perfect fit for my requirement:
    http://www.dbspecialists.com/files/scripts/compare_schemas.sql
    Please let me know the best way to compare 2 schemas and get a .sql file of the difference as output.
    Thanks in advance.

    Yes, I followed the tutorial properly this time. Still, I have a few queries:
    1) This is returning me the 'ALTER TABLE...' statement, but how can I make it also return all the DDL dependent on the table, like indexes, triggers, views...?
    2) I want to compare objects in a different schema on a different DB. I have created a DBLINK, but how do I use it? Can I use it like
    DBMS_METADATA.OPEN('TABLE'," Network_link_name " );? Can you please give me some clarity about it.
    3) I created a table 'TAB1' which I want to compare with a 'TAB1' in another schema. I could not work out how to compare over the remote dblink, so I created the same table in the same schema with the following code:
    CREATE TABLE TAB1_OLD as select * from TAB1@DBLINK where 1=2;
    Now when I fire the below query to get the ALTER difference I get the following error:
    SQL> SELECT get_table_alterddl('TAB1','TAB1_old') FROM dual;
    SELECT get_table_alterddl('TAB1','TAB1_old') FROM dual
    ORA-31603: object "TAB1_old" of type TABLE not found in schema "SVNCHECK"
    ORA-06512: at "SYS.DBMS_METADATA", line 5225
    ORA-06512: at "SYS.DBMS_METADATA", line 5189
    ORA-06512: at "SVNCHECK.GET_TABLE_SXML", line 17
    ORA-06512: at "SVNCHECK.COMPARE_TABLE_SXML", line 12
    ORA-06512: at "SVNCHECK.GET_TABLE_ALTERXML", line 11
    ORA-06512: at "SVNCHECK.GET_TABLE_ALTERDDL", line 11
    Can you please guide me on why I am getting this error?
    I have made sure that the TAB1_OLD table has been created and that the entries are also present in the data dictionary tables.
    SQL> select * from tab1_old;
    EMPNO ENAME                  MGR DEPTNO
    ----- -------------------- ----- ------
    The only difference is that this table was created with a CREATE TABLE AS SELECT statement.
    Thanks.
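    Regarding the ORA-31603 above: DBMS_METADATA treats object names as case-sensitive strings, and an unquoted CREATE TABLE TAB1_OLD stores the name in uppercase in the dictionary, so passing 'TAB1_old' finds nothing. Passing the name in uppercase should avoid that particular error, assuming the helper hands the names straight to DBMS_METADATA:
    -- the dictionary stores the unquoted name as TAB1_OLD, so pass it that way
    SELECT get_table_alterddl('TAB1','TAB1_OLD') FROM dual;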

  • Hi, we need to create the test environment from our production for Oracle AP Imaging. We have SOA, IPM, UCM and Capture managed servers in our WebLogic. Can anyone tell me the best way to clone the environment? Can I just tar the WebLogic file system?

    Hi, we need to create the test environment from our production for Oracle AP Imaging. We have SOA, IPM, UCM and Capture managed servers in our WebLogic domain.
    Can anyone tell me the best way to clone the application to a different environment? The test and production are on different physical servers.
    Can I just tar the WebLogic file system, untar it on the new server and make the necessary changes?
    Can anyone share their experience and a how-to with me?
    Thanks in advance.
    Katherine

    Hi Katherine,
    Yes and no. You need the WebLogic + SOA files as well as the database schemas (soa_infra, mds, ...).
    Please refer to the AMIS blog: https://technology.amis.nl/2011/08/11/clone-your-oracle-fmw-soa-suite-11g/
    HTH
    Borys

  • Error occured during the "test of the interface"

    Hi everybody,
    When I do the "Test Configuration" in the Integration Directory (Tools -> Test Configuration), an error occurs:
    Internal Error
                HTTP connection to ABAP Runtime failed.
                Error: 403 Forbidden  URL: http://NEUESDNWXI01:8001/sap/xi/simulation?sap-client=500
                User: PIDIRUSER
    What can I do about the error?
    Thanks

    Hi,
    Check if the service for testing in XI (in transaction SICF) is turned on.
    Note: Forbidden is generally an authorization or password error.
    Also check this:
    4) Error: HTTP 403 Forbidden
    Description: The server understood the request, but is refusing to fulfill it.
    Possible tips:
    Path sap/xi/engine not active
    • HTTP 403 during cache refresh of the adapter framework - refer to SAP Note 751856
    • Because of inactive services in ICF - go to transaction SICF and activate the services. Refer to SAP Note 517484
    • Error in RWB/Message Monitoring because of J2EE roles - refer to SAP Note 796726
    • Error in SOAP Adapter - "403 Forbidden" from the adapter's servlet - because the URL is incorrect or the adapter is not correctly deployed.
    Regards
    krishna

  • When the WSDL for a UFT test is updated the test becomes Unusable

    Using UFT 11.5,    When I have to update a WSDL that a test is using then the test becomes unusable.  It will allow me to update the WSDL, change and run the test but once I SAVE it and then try to OPEN it later I get the message described below.
    The only way around it is to delete the current WSDL and import a new one.  This is unacceptable because before you can delete a WSDL you have to delete all of your test steps that use it.  Thus, the test must be totally recreated.
    The ERROR when I try to open the test is:  'One or more tests in the solution were not loaded correctly.  For details see the error pane' and the error pane says: '{testname} did not load correctly.  Cannot change 'Shared' schema property after it was already set or the instance.  Please create a new instance'
    And I cannot open a new instance because UFT does not allow more than one instance.
    Help would be greatly appreciated.
    CDance

    That program is probably only reading the 1 in the number 10 - IOW, it isn't capable of reading a 2 digit number in front of the decimal point.
    Contact the developer of that program and let them know that they have some fixin' to do.
    See the PM I am sending to you. See your Inbox at the top of the forum pages.
