Failover Data Service

I've registered a service in the cluster using the following command:
hareg -r sigService -m START_NET="/export/home/pdx/sig/bin/startc" -m STOP_NET="/export/home/pdx/sig/bin/stopc" -d "oracle" -h "pdxsig" -a 1
But during cluster startup the service starts up on both servers, active and standby.
How can I make it a failover service instead of a scalable one?
Thanks

The START_NET and STOP_NET methods are run on both the primary and backup nodes for a service. It is up to the method scripts to check whether the application needs to be started or not.
There is an example in the SC 2.2 API manual on http://docs.sun.com/; try this long URL:
http://docs.sun.com/ab2/coll.83.7/CLUSTAPIPG/@Ab2TocView?Ab2Lang=C&Ab2Enc=iso-8859-1&DwebQuery=cluster&oqt=cluster
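For reference, here is a minimal START_NET sketch that starts the application only on the node that currently masters the logical host. This is an assumption-laden stand-in, not the manual's reference example; the documented route is the SC 2.2 API covered in that chapter, and the host name and paths below simply reuse the ones from the question.

#!/bin/sh
# START_NET sketch: launch the app only if this node masters the logical host.
# The address check below is a simple stand-in for the SC 2.2 API query;
# verify command availability and adjust names/paths for your environment.
LHOST=pdxsig
LHOST_IP=`getent hosts $LHOST | awk '{print $1}'`
if [ -n "$LHOST_IP" ] && /usr/sbin/ifconfig -a | grep "$LHOST_IP" >/dev/null 2>&1
then
    /export/home/pdx/sig/bin/startc &
fi
exit 0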

Similar Messages

  • Data service ONLY to retry, NOT to failover, when the service is down

    Hi All,
    I have around 10 services to be clustered under 3.2 with network-aware agents; normally I use scdsbuilder to create the agents.
    There are a few services that I want to stay down if they fail on the running node.
    Let's say I have service1 running on port 2000. If port 2000 is down, I want the cluster to restart it; if the restart fails, I want it to go to START_FAILED rather than switching to the other node and trying to restart there.
    Is it possible to achieve this through Sun Cluster 3.2?
    Thanks for the time,
    Aruna

    Hi Detlef,
    Many thanks. I was reading about the failover mode property RESTART_ONLY; I guess it should work because my data services are built with scdsbuilder, so they should use the DSDL. I will test it.
    Thanks again,
    Aruna
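    For anyone landing on this thread later, the combination being discussed looks roughly like this with the Sun Cluster 3.2 CLI; the resource name is a placeholder and the retry values are only examples, so check r_properties(5) for the exact semantics:
    # Restart the resource locally on failure and never relocate its group;
    # once the retry budget is used up the resource is left in a failed state.
    clresource set -p Failover_mode=RESTART_ONLY \
                   -p Retry_count=2 \
                   -p Retry_interval=300 \
                   service1-rs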

  • Setup Cluster using Solaris Container data service

    We have a two-node cluster that we would like to use to create either a zone cluster or, using the Solaris Container data service, a scalable (or multiple-master) data service of two zones, one on each node. We have an app running in the zone, CiscoWorks, that has a local database of jobs that are scheduled to run to configure Cisco switches. I was curious how we set up the storage. If each zone is running on local disks, how do the zones stay in sync and the database get updated with job information? Would I set up a device group of the disks where the zones will reside on each node? Can I use SAN as the local disk so the zones can be replicated to a disaster recovery location?
    Thanks for any help,
    Chuck

    Chuck,
    Sadly, I think I'm going to make your implementation decisions a lot more complicated because there are three ways you can use zones within Solaris Cluster.
    1. Create a failover zone using the HA Solaris Container Data Service. Here the zone root moves between the cluster nodes as the zone fails over.
    2. Create static zones between which resource groups can migrate. Each zone root is local to the physical node. However, the configuration of the zones can be subtly different.
    3. Create a virtual cluster using static zones within which resource groups can migrate. Each zone root is local to the physical node. However, the configuration of the zones is forced to be the same.
    Note also that a ZFS zpool can only be mounted on one node or zone at any one time, although it can be mounted read/write in one zone and read-only in other zones on the same node (IIRC).
    I would be inclined to put your database into an HA configuration, i.e. one that runs on one node at any one time. I would then constrain that in a zone cluster that is bound to a project with restricted resources, i.e. CPUs and memory. Any other tiers of the application could then be placed either in the global zone (main cluster) or placed in another zone cluster and equally constrained.
    I don't know if that's any help. I can recommend a good book on the subject <shameless plug "Oracle Solaris Cluster Essentials"/>. The example chapters should be of help.
    Regards,
    Tim
    ---
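    To illustrate the zpool point above: moving a pool between nodes is an explicit export/import handover rather than a shared mount (the pool name here is hypothetical):
    zpool export apppool    # run on the node currently holding the pool
    zpool import apppool    # then run on the node taking it over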

  • Tunneling failed in Data Service 2.5.1

    We created a set of applications which all use Data Services over RTMP on port 2038. In services-config.xml we set up the my-rtmp channel definition to use rtmpt as the protocol. All applications are compiled against this services-config.xml.
    For most users it is working fine; however, for some of our corporate users behind restrictive firewalls, a connection to the data services cannot be made. The application throws a generic "send failed" message.
    Since we are using the rtmpt protocol, what can we do to make
    sure our corporate users are able to use our applications?
    Thanks,
    -Rogier

    It seemed the problem was fixed. However, as we set rtmpt as
    the only channel, our other users were experiencing a big drop off
    in performance due to the tunneling.
    I read a very good blog entry:
    http://www.infoaccelerator.net/blog/post.cfm/setting-up-rtmpt-failover-on-lcds
    where Andrew talks about using rtmpt as failover for those
    users that cannot use rtmp.
    We implemented the structure he described:
    - channel definition for rtmp in services-config.xml
    - channel definition for rtmpt in services-config.xml
    - channel definition for polling-amf in services-config.xml
    - define both channels as default channels in
    data-management-config.xml in the order rtmp, rtmpt, polling-amf
    Everybody was able to use the data services.

  • Cluster of Data Service servers on multiple operating systems

    Hi All,
    Does Data Services support deployment on different operating systems within one cluster?
    E.g.
    One server on Windows and one server on Unix, both connecting to the same repository to form a cluster?
    Thanks,
    Bobby

    YES, job servers from different OSs can connect to the same repository.
    YES, these job servers from different OSs can be in the same server group (I guess that is what you mean by "cluster") for load balancing & failover.

  • How to pass the data from an input table to an RFC data service?

    Hi,
    I am doing a prototype with VC, and I'm wondering how VC passes the data from a table view to a backend data service. For example, I have one RFC in the backend system with a table-type importing parameter, and I want to pass all the data from an input table view to the RFC. I guess it's possible, but I don't know how to do it.
    I tried to create some events between the input table and the data service, but it seems there is no system event that can export the whole table to the backend data service.
    Thanks for your answer.

    Thanks for your answer. I tried solution 2: I created a "Submit" button and set the mapping scope to "All data rows". It only works when I select at least one row; otherwise the data is not passed.
    Another question: I have several importing table parameters, and for each table I have one "Submit" event. I want these tables to be submitted at the same time, but if I click the Submit button in one table's toolbar, only the data of the table whose Submit button was clicked is passed; for the other tables the data is not passed. How can I achieve this?
    Thanks.

  • Error while running a job in Data services

    We are using the Data Services BO XI R3 tool.
    We have created a datastore for an Oracle 8i database using a Microsoft ODBC DSN. This is the source datastore.
    When the job pulls the data from the source to the 10g target, we get the error below in the log file:
    25219           1        CON-120902     6/26/2009      Data services ODBC Driver Manager unable to find data source <...> in $LINK_DIR/bin/odbc.ini file.
    Is this a UNIX/Windows error?
    Please let us know how this can be resolved.
    Thanks in advance!
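    The $LINK_DIR/bin/odbc.ini path in the message points at the UNIX side of the job server. As a quick check (a sketch, assuming the default install layout), you can list which DSNs are actually defined in that file and compare them with the name your datastore uses:
    # List the DSN stanzas the Data Services ODBC driver manager will see.
    grep -n '^\[' "$LINK_DIR/bin/odbc.ini"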

    I restarted the job server and it worked, for a moment :P, but now it says more...
    1. SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION, SECURITYKEY, GUID from AL_VERSION where NAME = 'Repository Version' >.
    2. Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION, SECURITYKEY, GUID from AL_VERSION where NAME = 'Repository Version' >.>.
    3. SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION from AL_VERSION>.
    4. Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select VERSION from AL_VERSION>.>.
    I checked that the MySQL DB has those tables (they were created when I set up DS), but I don't know why this error says "No database selected"; the odbc file looks correct...
    Thank you,
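    One thing worth ruling out here (only a guess at the cause): "No database selected" can occur when the repository DSN stanza in odbc.ini does not name a database. Dumping the stanza makes that easy to spot:
    # Print the <Prova7> stanza and check it contains a Database entry.
    sed -n '/^\[Prova7\]/,/^\[/p' "$LINK_DIR/bin/odbc.ini"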

  • Error while registering a new repository on Data Services 4.0

    Hi all,
    I have a BO Enterprise Platform 4.0 + Data Services 4.0 installation, and I'm trying to register a new local repository in addition to the default one from the installation. I followed these steps:
    1) Created a new database schema for the repository
    2) Created the local repository with the Repository Manager
    3) Created a new job server associated with the new repository through the Server Manager
    4) Tried to register the new repository in the CMC
    I wasn't able to complete step 4 since I got the following error: "unable to connect to profile server".
    Any clue?
    Thank you very much
    Pietro

    What is the complete version of DS 4.0? Also check the REPO_TYPE column in the AL_VERSION table; for a local repo this will be NULL.

  • Data Services job rolling back Inserts but not Deletes or Updates

    I have a fairly simple CDC job that I'm trying to put together. My source table has a record type code of "I" for Inserts, "D" for deletes, "UB" for Update Before and "UP" for Update After. I use a Map_CDC_Operation transform to update the destination table based on those codes.
    I am not using the Transaction Control feature (because it just throws an error when I use it)
    My issue is as follows.
    Let's say I have a set of 10,000 Insert records in my source table. Record number 4000 happens to be a duplicate of record number 1. The job will process the records in order starting with record 1 and begin happily inserting records into the destination table. Once it gets to record 4000 however it runs into a duplicate key issue and then my try/catch block catches the error and the dataflow will exit. All records that were inserted prior to the error will be rolled back in the destination.
    But the same is not true for updates or deletes. If I have 10000 deletes and 1 insert in the middle that happens to be an insert of a duplicate key, any deletes processed before the insert will not be rolled back. This is also the case for updates.
    And again, I am not using Transaction Control, so I'm not sure why the Inserts are being rolled back, but more curiously Updates and Deletes are not being rolled back. I'm not sure why there isn't a consistent result regardless of type of operation. Does anyone know what's going on here or  what I'm doing wrong/what my misconception may be?
    Environment information: both source and destination are SQL Server 2008 databases and the Data Services version we use is 14.1.1.460.
    If you require more information, please let me know.

    Hi Michael,
    Thanks for your reply. Here are all the options on my source table:
    My Rows per commit on the table is 10,000.
    Delete data table before loading is not checked.
    Column comparison - Compare by name
    Number of loaders - 1
    Use overflow file - No
    Use input keys - Yes
    Update key columns - No
    Auto correct load - No
    Include in transaction - No
    The rest were set to Not Applicable.
    How can I see the size of the commits for each opcode? If they are in fact different from my Rows per commit (10,000) that may solve my issue.
    I'm new to Data Services so I'm not sure how I would implement my own transaction control logic using a control column and script. Is there a guide somewhere I can follow?
    I can also try using the Auto correct load feature.  I'm guessing "upsert" was a typo for insert? Where is that option?
    Thank you very much!
    Riley

  • RFC connection from SAP Data services to SAP ECC

    We have Data Services set up on a Linux machine. I am trying to set up an RFC connection between Data Services and SAP ECC.
    From the Data Services Management Console, after creating the RFC connection I am getting an "RFC_bad_connection" error message. The username, password, system number, hostname and client work from SAP GUI and have all the authorizations. The SAP gateway and service name are correct.
    Did anyone have a similar issue, and what was the solution? Do we have to start the RFC connection from the Linux Data Services machine? I didn't see any sh executable to do so.
    Thanks in advance for help solving the RFC connection issue.

    I am new to SAP and we have just started work on Data Services.
    I wanted to know the complete procedure for establishing a connection between Data Services and ECC.
    Thanks for your help and time.

  • MS Access 2003 and Data Services 3.0

    I want to access MS Access 2003 from AquaLogic Data Services 3.0. The documentation tells me that AquaLogic Data Services 3.0 supports MS Access 2003.
    I read the "Extending Database Support" section of the Administration Guide to deploy the XML file... I did it successfully!
    But I can't configure the data source in the console to point to Access 2003. How can I do it?

    Thank you Mike for the answer.
    AquaLogic Data Services release 3 supports MS Access 2003, as you can verify in "Relational Providers Included With ALDSP" in this manual:
    http://edocs.bea.com/aldsp/docs30/admin/aldsp-wrapper-ext.html#wp1134932
    In the http://localhost:7001/console, when I go to "Create a New JDBC Data Source", only these databases appear under Database Type: "adabas for z/OS, CICS/TS for z/OS, Cache, Cloudscape, DB2, DB2 for z/OS, Derby, Enterprise DB, FirstSQL, IMS/DB for z/OS, IMS/TM for z/OS, Informix, Ingres, MS SQL Server, MaxDB, MySQL, Oracle, PointBase, PostgreSQL, Progress, Sybase, VSAM for z/OS , Other".
    But "MS Access 2003" does not appear in this menu. The MS-Access.jar file was copied into the ALDSP\provider path, and the other items were created successfully as shown in "Deploying the relational provider" in http://edocs.bea.com/aldsp/docs30/admin/aldsp-wrapper-ext.html#wp1136406
    I want to create the data source for MS Access 2003. How can I do it?

  • Unable to create records in database using PHP Data Service

    Hello, I've been stuck on this for a few days and searched up and down for this on the net; no response I've found has worked, so I come to you...
    Here are the steps I've taken, I think it's pretty standard
    1. I have a macbook pro running osx 10.7.3
    2. I installed MAMP all default (I've actually reinstalled this because someone suggested it might fix it)
    3. Thru phpMyAdmin I created a database called my_test
    4. In that database I created a table, this is the export of that table: (I've also tried this with InnoDB which is the default)
    CREATE TABLE `customer` (
      `id` int(10) unsigned NOT NULL AUTO_INCREMENT,
      `name` varchar(50) NOT NULL,
      `email` varchar(150) NOT NULL,
      PRIMARY KEY (`id`),
      UNIQUE KEY `id` (`id`)
    ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=1 ;
    5. I created a new Flex Project (Running Flash Builder 4.5.1 Premium)
         Project Name: PHPTest
         Application Type: Desktop (Although I've done the same thing with Web and got the same results)
         >> Next
         Application Server Type: PHP
         Web Root: /Applications/MAMP/htdocs/
         URL Root: http://localhost:8888/
         Clicked 'Validate Configuration' and that worked
         Output Folder: /Applications/MAMP/htdocs/PHPService (default by Flash Builder)
         >> Finished
    6. On the Data/Services tab on the bottom I click 'Connect to Data/Service...'
         Select 'PHP'
         >> Next
         Select 'Click here to generate a sample'
         Select 'Generate from database' radio button
         Username: root
         Password: root (default for MAMP)
         Host name: localhost
         Server port: 8889 (default for MAMP MySQL port, the HTTP port default is 8888 which in both cases seem to work but every video I've seen that uses MAMP on youtube uses 8889)
         Database: my_test
         Click 'Test Connection' (works)
         Table: customer
         Primary Key: id (this field is greyed out and Flash Builder selects 'id' which it figures out from the SQL table)
         >> Click OK
    7. Then if I don't have the Zend Framework folder in my /Applications/MAMP/htdocs/ folder it tells me it's going to create it and I say alright. Then Flash Builder says stuff about how this is really only for testing and not production-server ready and I say alright.
    8. Then it takes me back to the form from step 6 where I get the chance to select 'Click here to generate a sample'
         These fields are now filled with this data automatically:
         PHP Class: /Applications/MAMP/htdocs/PHPTest/services/CustomerService.php
         Service name: CustomerService
         Service package: services.customerservice
         Data type package: valueObjects
         >> Click Next (shows all the functions that will now be available)
         >> Click Finished (end of the forms; it opens Dreamweaver to the PHP file it created, CustomerService.php, which I have no need to edit so I close that down)
    9. Back in Flash Builder I switch to design view and drag a datagrid onto the big white area in the middle, whatever that is called.
    10. Then below in Data/Services tab on the bottom I click drag the function 'GetAllCustomer' on top of the datagrid.
         I say yes to rebind and click OK, and the view of the datagrid is updated with the columns from the 'customer' table in MySQL.
    (Now let me say that when I hit save and compile this, if I actually have records in this table that I inserted through phpMyAdmin, they do show in the datagrid. So for the whole CRUD thing I am able to get the R, which is Read.)
    11. Now going back to the design view in Flash Builder I will create a form to create records in the table... (I guess this isn't really a step)
    12. In the Data/Service tab on the bottom I select 'createCustomer' function and then there is an icon called 'Generate Form' that looks like a white piece of paper with a gear on the bottom of it.
    13. This opens up a new form and since I don't have a crazy bunch of fields in my table I just click Finished (If you click next you can specify which fields you want to exclude from the form but this time I don't need to)
    14. This actually creates 2 forms if you look at the code, the second just shows the return type from when you click 'CreateCustomer' button on the first form. Because they overlap in design view I drag the form out of the way so you can see the input form, the return form, and the datagrid
    15. Then I save and compile...
    (Also, if you're actually reading this: I didn't remove the id form field; I get the same result either way. But if you do remove the form field in the code/design you also have to update the button function to not deal with the id before it gets sent off to the PHP page, since in this case the MySQL table is set to auto_increment the id. Sorry, this doesn't make much sense, but this little area doesn't matter much either way.)
    16. Now fill in whatever data you want for name and email, try different numbers in the id field like 0, nothing, 1, 1000 and click 'CreateCustomer'
    (For me nothing happens: no return is put in the return field, no error pops up, and the datagrid is not updated with the new record. Going over to phpMyAdmin and checking the table browse doesn't show any changes either. I know the button is calling the function because if I add a state change in that function it changes; it would seem that the line
    createCustomerResult.token = customerService.createCustomer(customer2); doesn't do anything.)
    So any idea what is wrong here? I'm convinced it's something stupidly simple I just can't see.

    In case you need more info, here is a code dump of an MXML project that has this problem:
    <?xml version="1.0" encoding="utf-8"?>
    <s:WindowedApplication xmlns:fx="http://ns.adobe.com/mxml/2009"
                           xmlns:s="library://ns.adobe.com/flex/spark"
                           xmlns:mx="library://ns.adobe.com/flex/mx"
                           xmlns:customerservice="services.customerservice.*"
                           xmlns:valueObjects="valueObjects.*"
                           currentState="State1">
        <fx:Script>
            <![CDATA[
                import mx.controls.Alert;
                import mx.events.FlexEvent;
                import valueObjects.Customer;
                protected function dataGrid_creationCompleteHandler(event:FlexEvent):void
                {
                    getAllCustomerResult.token = customerService.getAllCustomer();
                }
                protected function button_clickHandler(event:MouseEvent):void
                {
                    var customer2:Customer = new Customer();
                    customer2.id = parseInt(idTextInput.text);
                    customer2.name = nameTextInput.text;
                    customer2.email = emailTextInput.text;
                    currentState = "Test";
                    createCustomerResult.token = customerService.createCustomer(customer2);
                }
            ]]>
        </fx:Script>
        <s:states>
            <s:State name="State1"/>
            <s:State name="Test"/>
        </s:states>
        <fx:Declarations>
            <s:CallResponder id="getAllCustomerResult"/>
            <customerservice:CustomerService id="customerService"
                                             fault="Alert.show(event.fault.faultString + '\n' + event.fault.faultDetail)"
                                             showBusyCursor="true"/>
            <valueObjects:Customer id="customer"/>
            <s:CallResponder id="createCustomerResult"/>
            <!-- Place non-visual elements (e.g., services, value objects) here -->
        </fx:Declarations>
        <s:DataGrid id="dataGrid" includeIn="State1" x="330" y="10" width="392"
                    creationComplete="dataGrid_creationCompleteHandler(event)" requestedRowCount="4">
            <s:columns>
                <s:ArrayList>
                    <s:GridColumn dataField="id" headerText="id"></s:GridColumn>
                    <s:GridColumn dataField="name" headerText="name"></s:GridColumn>
                    <s:GridColumn dataField="email" headerText="email"></s:GridColumn>
                </s:ArrayList>
            </s:columns>
            <s:typicalItem>
                <fx:Object id="id1" email="email1" name="name1"></fx:Object>
            </s:typicalItem>
            <s:AsyncListView list="{getAllCustomerResult.lastResult}"/>
        </s:DataGrid>
        <s:Form includeIn="State1" defaultButton="{button}">
            <s:FormItem label="Id">
                <s:TextInput id="idTextInput" text="{customer.id}"/>
            </s:FormItem>
            <s:FormItem label="Name">
                <s:TextInput id="nameTextInput" text="{customer.name}"/>
            </s:FormItem>
            <s:FormItem label="Email">
                <s:TextInput id="emailTextInput" text="{customer.email}"/>
            </s:FormItem>
            <s:Button id="button" label="CreateCustomer" click="button_clickHandler(event)"/>
        </s:Form>
        <s:Form includeIn="State1" x="0" y="204">
            <s:FormItem label="CreateCustomer">
                <s:TextInput id="createCustomerTextInput" text="{createCustomerResult.lastResult as int}"/>
            </s:FormItem>
        </s:Form>
    </s:WindowedApplication>
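    (A side check that may help anyone debugging this setup: confirm inserts reach MySQL at all when made outside Flex. Assuming the default MAMP layout and the credentials from the steps above, the same port and login can be exercised from a terminal:
    /Applications/MAMP/Library/bin/mysql -h 127.0.0.1 -P 8889 -u root -proot my_test \
      -e "INSERT INTO customer (name, email) VALUES ('test', 'test@example.com'); SELECT * FROM customer;"
    If that row appears but the Flex call still does nothing, the problem is in the generated PHP service or the call to it rather than in MySQL.)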

  • Data Services and Data Quality Recommended Install process

    Hi Experts,
    I have a few questions. We have some groups that have requested Data Quality be implemented, along with another request for Data Services to be implemented. I've seen the request for Data Services to be installed on the desktop, but from what I've read it appears to be best to install this on the server side to allow for more of a central benefit to all.
    My questions are:
    1. Can the Data Services (server) XI 3.2 install be installed on the same server as XI 3.1 SP3 Enterprise?
    2. Is the Data Services (client) version dependent on whether the Data Services (server) install is completed? Basically, can the "Data Services Designer" be used without the server install?
    3. Do we require a new license key for this, or can I use the Enterprise server license key?
    4. At this time we are not using this to move data in and out of SAP, just using it to read data that is coming from SAP.
    From what I read, Data Services comes with the SAP BusinessObjects Data Integrator or SAP BusinessObjects Data Quality Management solutions. Right now it seems we don't have a need for the SAP connection supplement, but it is definitely something we would implement in the near future. What would be the recommended architecture? A new server with Tomcat and CMC (separate from our current BOBJ Enterprise servers)? Or can Data Services be installed on the same?
    Thank you,
    Teresa

    Hi Teresa,
    I hope you are referring to a BOE 3.1 (Business Objects Enterprise) and BODS (Business Objects Data Services) installation on the same server machine.
    I am not an expert on BODS installation, but this is my observation:
    We had recently tested a BOXI 3.1 SP3 (full build) installation on a test machine before the upgrade of our BOE system. We also have BODS in our environment, and we wanted to check whether we could keep both on the same server.
    On this test machine, which already had the BOXI 3.1 SP3 build, when I installed the BODS server installation, what we observed was that all the menus of BOE went away and only the menus of BODS were seen.
    Maybe the BODS installation overwrites or uninstalls BOE if it already exists? I don't know. I could not find any documentation saying that we cannot have BODS and BOE on the same server machine, but this is what we observed.
    So we have kept BODS and BOE on two different machines running independently, and we do not see any problem.
    Cheers
    indu

  • Unable to Find SBOP DATA SERVICES 4

    Hello ,
    I am unable to find the SBOP DATA SERVICES 4.0 software on SMP. Does this mean that my S-user ID or my company doesn't have a valid license for this version, or that the product is in ramp-up? (I have checked the ramp-up releases too but was unable to find the 4.0 version there.) However, I am able to download XI 3.2.
    Regards
    Saurabh Mishra

    Yes, it is in Ramp-up and you need to be authorized to be able to download the software.

  • Unable to Connect Oracle 7 with Data Services 3.2

    Hi All,
    I am facing an issue connecting Oracle 7 with Data Services 3.2 using the native client drivers (TNS names).
    The client's legacy application is on Oracle 7, and they are moving the data from the legacy system to SAP ECC. My role is to transfer the data from Oracle 7 to SAP ECC, and we are using Data Services 3.2 for the conversion. Data Services 3.2 is not connecting to Oracle 7 using the Oracle native drivers (TNS names).
    I am able to connect with the Oracle 8 client using SQL*Plus, but Data Services does not connect. I am able to create the datastore, but when I try to import a table the error message I get is "ORA-24316: Illegal handle Type".
    Is there any other way to connect, or will Data Services 3.2 not connect to Oracle 7 since it is an older version?
    Please reply with your thoughts, or with some solution.
    Appreciate your prompt reply. Many thanks.

    Hi Paul,
    Currently I am using an ODBC connection to read the data from the legacy Oracle 7, but ODBC is very, very slow; queries are taking hours to fetch the data. Is there any other solution by chance? I read that DataDirect can connect to any version of Oracle and is a bit faster; is that true? Please clarify. If that is the solution, where can I get the DataDirect drivers for Oracle 7? Please advise.
    Thank  You,
    Ashok
