Connector Best Practice?

My organization currently runs a DataSync Mobility system, and we are looking at options to synchronize calendar information between our GroupWise system and a SharePoint environment we will be building. I'm looking for best-practice guidance on whether I should build an additional DataSync server to handle the GroupWise/SharePoint communication, or whether I can simply add the SharePoint connector to the Mobility server and expect everything to behave. Aside from getting things set up, my main concern is updates: would updating Mobility break the SharePoint connection, or would a DataSync (non-Mobility) update break Mobility features? Thanks!

marklar23,
It appears that in the past few days you have not received a response to your
posting. That concerns us, and has triggered this automated reply.
Has your problem been resolved? If not, you might try one of the following options:
- Visit http://support.novell.com and search the knowledgebase and/or check all
the other self support options and support programs available.
- You could also try posting your message again. Make sure it is posted in the
correct newsgroup. (http://forums.novell.com)
Be sure to read the forum FAQ about what to expect in the way of responses:
http://forums.novell.com/faq.php
If this is a reply to a duplicate posting, please ignore and accept our apologies
and rest assured we will issue a stern reprimand to our posting bot.
Good luck!
Your Novell Product Support Forums Team
http://forums.novell.com/

Similar Messages

  • Best Practice for mail connectors

    I am setting up connectors for Exchange and GroupWise and am testing auto-provisioning. I am running into an issue where the GroupWise connector attempts to provision before the eDirectory account is established. Is there a best practice for avoiding this? Should I add a hidden OIM field that is populated when eDirectory is provisioned correctly, and put a rule in place to catch it and then provision GroupWise, or is there an easier way?
    rkimbal45

    You can put a task on the Create User of the email process that validates the directory account exists, and set the retry counter so that if your email connector runs first, it will not continue. When the retry happens, it will see the account exists, and then your next tasks can trigger. Just create it as an unconditional task and set it as preceding your Create User task. This will keep the Create User task in a pending state until your check completes.
    -Kevin
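    For anyone curious what this check could look like in code, below is a minimal sketch of such a validation task in Java, using plain JNDI to look the user up in eDirectory. The class name, method signature, connection details, and response codes are illustrative assumptions, not the actual connector API; the key idea is that returning a non-success code lets the retry counter re-run the task until the account appears.

        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.NamingEnumeration;
        import javax.naming.directory.DirContext;
        import javax.naming.directory.InitialDirContext;
        import javax.naming.directory.SearchControls;
        import javax.naming.directory.SearchResult;

        // Hypothetical OIM process-task adapter (hedged sketch, not the real
        // GroupWise/eDirectory connector API): returns a response code that a
        // task-to-response mapping can translate into "retry".
        public class DirectoryCheckAdapter {

            public String checkUserExists(String ldapUrl, String searchBase,
                                          String bindDn, String bindPassword,
                                          String userLogin) {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
                env.put(Context.PROVIDER_URL, ldapUrl); // e.g. ldap://edir.example.com:389
                env.put(Context.SECURITY_PRINCIPAL, bindDn);
                env.put(Context.SECURITY_CREDENTIALS, bindPassword);
                try {
                    DirContext ctx = new InitialDirContext(env);
                    SearchControls sc = new SearchControls();
                    sc.setSearchScope(SearchControls.SUBTREE_SCOPE);
                    NamingEnumeration<SearchResult> results =
                            ctx.search(searchBase, "(cn=" + userLogin + ")", sc);
                    boolean found = results.hasMore();
                    ctx.close();
                    return found ? "SUCCESS" : "USER_NOT_FOUND";
                } catch (Exception e) {
                    return "DIRECTORY_UNAVAILABLE";
                }
            }
        }

    You would then map USER_NOT_FOUND to a Retry response on the unconditional task, so the Create User task stays pending until the check succeeds.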

  • Exchange 2007 Multiple Send Connector Cost Best Practice

    Hi,
    I am running Exchange 2007 SP2 and have two send connectors set up with an equal default cost of 1.
    Connector 1 is set to the * address space and should forward all emails to two email security/management servers.
    Connector 2 is set to an internal subdomain server whose purpose is to file emails on a CMS. Emails are marked for "filing" when the Outlook plugin adds connector 2's subdomain address to the bcc field.
    What I want to clarify is: if I were to change the cost of Connector 1 to 0, would all mail then be routed only via that connector? I presume it would ignore the second connector because the first connector's address space is *, and thus no emails would route through connector 2 and therefore be filed to the CMS?
    What would the best practice be for the costs of each connector?
    Thanks
    Mat

    I don't believe that 0 is a valid cost for a connector (I get an error if I try to set a connector with that). If you have two connectors, Exchange will send to the one whose address space most specifically matches the destination. If you had no second connector, connector 1 would attempt delivery to your CMS. Since you have that second connector, those messages will be delivered by it.
    Connector costs are normally used when you have multiple connectors with the SAME address space; they allow Exchange to fail over from one to another when the preferred connector is not operational for some reason. For example, if you have an ISDN line as your backup Internet connection, you would give connectors that use it a higher cost than your MPLS connections.
    I'll add that in most environments I've worked in over the past several years, this redirection is handled at the network layer rather than the application layer.
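    To make that selection rule concrete, here is a small toy model (plain Java, not Exchange code; all names invented for illustration) of the behavior described above: the most specific address-space match wins, and cost only breaks ties between equally specific connectors.

        import java.util.Comparator;
        import java.util.List;
        import java.util.Optional;

        // Toy model of send-connector selection: specificity first, cost as
        // tie-breaker. Illustrative only.
        public class ConnectorSelection {

            record Connector(String name, String addressSpace, int cost) {}

            // "*" matches everything; a domain address space matches the domain
            // and its subdomains. Longer matching address spaces are more specific.
            static boolean matches(String addressSpace, String domain) {
                if (addressSpace.equals("*")) return true;
                return domain.equals(addressSpace) || domain.endsWith("." + addressSpace);
            }

            static Optional<Connector> pick(List<Connector> connectors, String domain) {
                return connectors.stream()
                        .filter(c -> matches(c.addressSpace(), domain))
                        .max(Comparator
                                .comparingInt((Connector c) -> c.addressSpace().length())
                                .thenComparing(Comparator.comparingInt(Connector::cost).reversed()));
            }

            public static void main(String[] args) {
                List<Connector> connectors = List.of(
                        new Connector("Connector1", "*", 1),
                        new Connector("Connector2", "cms.internal.example.com", 1));
                // Mail to the CMS subdomain picks Connector2 regardless of Connector1's cost:
                System.out.println(pick(connectors, "cms.internal.example.com").orElseThrow().name());
                // Everything else falls through to the * connector:
                System.out.println(pick(connectors, "gmail.com").orElseThrow().name());
            }
        }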

  • OIM Database Connector - ID PK from DB Sequence Best Practice?

    Figured I'd ask around before I hacked something together for what is a very common scenario. When provisioning, my PK (ID) is not set by OIM but is generated from an Oracle sequence. I'm currently using the Database Application Tables GTC connector. A couple of approaches, based on my limited OIM knowledge:
    1. An on-insert DB trigger to handle it. This requires nothing from OIM, but I really feel like it is too invasive and doesn't fit the "passive" nature of identity management.
    2. Deviate from the GTC and get the sequence value from a custom adapter task.
    3. Go have a beer.
    If there is any documentation or a post about this, please let me know. I checked the best-practices and connector docs and found nothing about it.

    I think you have a good grasp of the problem. Option 3 is of course superior to the others :)
    If you start using pre-pops on GTC-generated forms, you can't regenerate the GTC or you risk losing the pre-pop adapters.
    There has been some discussion about what Oracle supports when it comes to the GTC, but it seems you can now add pre-pop and entity adapters and still stay in support.
    Best regards
    /M
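    For what it's worth, option 2 usually boils down to a few lines of JDBC in the adapter task. A minimal sketch, assuming a hypothetical sequence name and connection details (a real adapter would reuse the connector's pooled connection rather than opening its own, and needs the Oracle JDBC driver on the classpath):

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        // Sketch of option 2: fetch the next sequence value so it can be set
        // as the PK on the provisioning form. All names are placeholders.
        public class SequencePkAdapter {

            public long nextId(String jdbcUrl, String user, String password,
                               String sequenceName) throws Exception {
                // e.g. jdbcUrl = "jdbc:oracle:thin:@dbhost:1521:ORCL"
                try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
                     Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(
                             "SELECT " + sequenceName + ".NEXTVAL FROM dual")) {
                    rs.next();
                    return rs.getLong(1);
                }
            }
        }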

  • Error while connecting a Best Practices v1.31 report to SAP

    Hello experts,
    I'm facing an issue when trying to connect some of my reports from Best Practices for BI to SAP.
    It only happens with InfoSets; the reports that use SAP tables go smoothly, without a problem.
    Most interestingly, I already have one of the InfoSet-based reports connected to SAP successfully.
    I have already verified the document describing the steps for creating the additional database connection that comes with the BP pack. They seem OK.
    Here is what Crystal Reports throws at me after changing the data source to SAP:
    For the report "GL Statement" (one of the Financial Analysis reports), which uses InfoSet /KYK/IS_FIGL_I3:
    - Failed to retrieve data from the database; - click OK, then...
    - Database connector error: no variant was specified for the fiscal year (roughly translated) - click OK, then
    - Database connector error: RFC_INVALID_HANDLE
    For the report "Cost Analysis: Planned vs. Actual Order Costs" (another Financial Analysis report), which uses InfoSets ZBPBI131_INFO_ODVR and ZBPBI131_INFO_COAS, and also the query CO_OM_OP_20_Q1:
    - Failed to retrieve data from the database; - click OK, then...
    - Database connector error: check class for selections raised errors - click OK, then
    - Database connector error: RFC_INVALID_HANDLE
    Note: those "Z" InfoSets are already created in the SAP environment.
    The one that works fine is one of the Purchasing Analysis reports:
    - Purchasing Group Analysis -> InfoSet /KYK/IS_MCE1
    I'm at a loss to solve this, because I'm not sure whether the problem is in the SAP JCo or in some parameter that was set up wrongly in SAP, and I have already checked possible solutions for both.
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Brazil.

    I re-checked step 3.2.3 - Uploading Crystal User Roles (transaction PFCG) - of the manual, which covers the CRYSTAL_ENTITLEMENT and CRYSTAL_DESIGNER roles. On the Authorizations tab I noticed the status said the profile had not been generated (a yellow sign), so I generated it, as the manual says.
    Both statuses now say "Authorization profile is generated" and the sign on the tab is green.
    I had another issue on the User tab (it was yellow, like the Authorizations tab before generating); all I needed to do to turn it green was run a user comparison (User Comparison button).
    After all that, I tried once more to refresh the Crystal report, and the same error messages are still thrown.
    There is one more issue on one of the tabs of the PFCG transaction: the Menu tab has a red sign, but the manual says nothing about it. I just have a folder called "Role menu" with nothing in it.
    Could that be the reason I'm getting errors when connecting the report to SAP InfoSets? (Remember, one of my reports that is connected to an InfoSet works fine.)
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Brazil.

  • [XI 3.1] BEST PRACTICE method of Oracle connection for RPTs on Linux

    Business Objects XI (3.1) - SP3.
    Running on Red Hat Enterprise Linux OS.
    7,000+ Crystal Reports 2008 *.rpt objects ONLY (No Universe / No WebI).
    All reports connecting to Oracle 10g databases.
    ==================
    In the past, all of this infrastructure ran on Windows Server and provided database access via a named ODBC connection (e.g. "APP_DATA").
    This made it easy to manage, as all the report developers had a standard System DSN called "APP_DATA", which was the same as the System DSN name on all of our DEV, TEST/UAT, and PROD Business Objects servers.
    When we wanted to move/promote a *.rpt file from DEV to PROD, we did not have to change any "Database Connection" info; it was all taken care of by pointing the System DSN called "APP_DATA" at a different physical Oracle server at the ODBC level.
    Now that hardware is moving from Windows to Red Hat Linux, and we are trying to determine the best practices (and pros/cons) of using one of the three methods below to access the Oracle database for our *.rpts...
    1.) Oracle native connection
    2.) ODBC connection
    3.) JDBC connection
    Here's what we have determined so far -
    1a.) An Oracle native connection should be the most efficient method of passing a SQL query to the DB, with the fewest issues and the best speed. [PRO]
    1b.) An Oracle native connection may not be supported on Linux - http://www.forumtopics.com/busobj/viewtopic.php?t=118770&view=previous&sid=9cca754b468fc67888ab2553c0fbe448 [CON]
    1c.) Using an Oracle native connection would require special handling of the *.rpts at either the source-file or the CMC level to change them from DEV -> TEST -> PROD. This would mean much more developer/admin overhead than they are currently used to. [CON]
    2a.) A third-party Linux ODBC option may be available from EasySoft - http://www.easysoft.com/products/data_access/odbc_oracle_driver/index.html - which would allow a developer/admin overhead similar to what we are used to. [PRO]
    2b.) Adding a third-party vendor into the mix may lead to support issues if we have problems with the results or speed of our queries. [CON]
    3a.) JDBC appears to be the de facto standard for running Oracle SQL queries from Linux. [PRO]
    3b.) There may be issues with the results or speed of our queries when using JDBC. [CON]
    3c.) Using JDBC requires the explicit IP of the Oracle server to be defined for each connection. This would require special handling of the *.rpts at the source-file level (and NOT the CMC level) to change them from DEV -> TEST -> PROD, again meaning much more developer/admin overhead. [CON]
    ==================
    We would appreciate some advice from anyone who has been down this road before.
    What were your Best Practices?
    What can you add to the Pros and Cons listed above?
    How do we find the "sweet spot" between quality/performance/speed of the reports and low overhead for the admins and developers?
    As always, thanks in advance for your comments.

    Hi,
    I just saw this thread and would like to add some information.
    First, you can quite easily reproduce the same way of working as the ODBC entries by playing with Oracle name resolution on the server. By changing some files (sqlnet.ora, tnsnames.ora, ...) you can define a different Oracle server for a specific alias, with the same alias used across all environments.
    The database name will then resolve differently depending on the environment, and therefore access a different database.
    A second option is to change the connection in the .rpt files in an automated way, for example with the Schedule Manager. This tool is an additional web application to deploy that can change the connection settings of thousands of rpt reports in a few clicks. You can find it here:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/80af7965-8bdf-2b10-fa94-bb21833f3db8
    The last option is to do it with a small SDK script; for this purpose, a few lines of code can change all the reports in a row.
    After several implementations on Linux against Oracle databases, I would also prefer the native connection. ODBC and JDBC are deprecated ways to connect to the database. You can use DataDirect connectors, which are quite good, but at volume you will see the difference.
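    As an illustration of the first suggestion, a tnsnames.ora entry like the one below (alias, host, and service names are hypothetical) lets you reuse the old DSN name as the Oracle alias; each environment's file points the same alias at its own database, so the *.rpt files never need to change:

        # tnsnames.ora on the DEV server; the TEST and PROD servers define the
        # same APP_DATA alias pointing at their own databases.
        APP_DATA =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCP)(HOST = dev-db.example.com)(PORT = 1521))
            (CONNECT_DATA = (SERVICE_NAME = appdata.example.com))
          )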

  • Best Practice on Not Exposing your internal FQDN to the outside world

    Our Exchange 2010 server sits in the DMZ and is Internet-facing. It is currently using the default receive connector, which exposes the internal FQDN to the outside world (in the EHLO response). Since you should not (and cannot) change the FQDN on the default receive connector, what is the best practice here?
    The only solution I can see is the following:
    1. Change the network settings on the default receive connector to only internal IP addresses.
    2. Create a new Internet receive connector on port 25 for external IP addresses (not sure what to put in the Network tab?) and use my external FQDN for EHLO responses (e.g. mail.domain.com).
    3. What do I pick for authentication and permissions - TLS and Anonymous only?
    Michael Maxwell
    Michael Maxwell

    Yes, it fails PCI testing/compliance. I shouldn't be able to see my internal server and domain names. I understand that is the recommendation, but my client doesn't want to host in the cloud or go with Trend IMHS (trust me, I like that better, but it's not my choice). I have to work with the deck of cards dealt to me. Thanks; I just want a solution with what I have now.
    Michael Maxwell

    Understood. I won't go into the value of those tests :)
    If the customer is really concerned about exposing the internal name, then create a new receive connector with a different FQDN (and a corresponding certificate) for anonymous connections, as you mention above. Know that this also means internal clients can connect to the server on port 25 as well, unless you have the ability to scope it to a set of IP addresses (i.e. via an SMTP gateway).
    The internal names of the servers will also appear in the Internet headers of messages sent out:
    http://exchangepedia.com/2008/05/removing-internal-host-names-and-ip-addresses-from-message-headers.html
    http://www.msexchange.org/kbase/ExchangeServerTips/ExchangeServer2007/SecurityMessageHygiene/HowtoremoveinternalservernamesandIPaddressesfromSMTPheaders.html

  • Best Practice Regarding Maintaining Business Views/List of Values

    Hello all,
    I'm still learning how to use BOXI to run our Crystal Reports. I was never familiar with the BO environment before, but I have recently learned that for every dynamic parameter we create for a report, the Business View/Data Connector/LOV objects are created in the Enterprise Repository the moment the Crystal Report is uploaded.
    All of our reports are authored from a SQL command statement, and often various reports will use the same field name from the database. For example, we have several reports that use the field name "LOCATION", which exists in a good number of tables in the database.
    Looking at the repository, I've noticed there are several variations of LOCATION, each of which I assume belongs to one specific report. Having said that, I can see it becoming a nightmare to figure out which variation of LOCATION belongs to which report. Sooner or later the repository will need to be kept a bit cleaner, and at the rate we author reports, I foresee a huge headache down the road.
    With that being said, what is the best practice, in a nutshell, for maintaining these repository items? Is it handled indirectly on the Crystal Report authoring side, by naming your parameter field so it is identifiable to a specific report? Or is it done directly on the repository side?
    Thank you.

    Eric, you'll get a faster, qualified response if you post to the Business Objects Enterprise Administration forum, as that forum is monitored by qualified support for BOE.

  • Best practice for retail store cycle count

    Hi experts
    Do you have any experience/best practices in designing a cycle count process for your clients?
    By cycle count, I mean frequent (daily) counting for specific item categories or exceptional count requests.
    My client wants to implement a cycle count process to minimize the effort of full stock takes, which happen around 3 times per year.
    They are using SAP as the core system to keep track of inventory, together with an external store system, so some considerations would be, for example:
    - whether the count variance or the counted quantity should be interfaced back to SAP
    - whether SAP or the POS should be the system that initiates a stock count
    It would be very nice if you could share some best practices for the process flow, system information flow, etc.
    Best regards
    Dominic

    Hello
    Both can be done, but since, as you say, SAP is the core system (at least for tracking inventories), it is better to create and integrate the inventories in SAP as well.
    The main advice I would give is to avoid custom tables for inventory integration, because standard and custom tables will sometimes not be updated consistently and support would be needed; stick to standard transactions.
    If the inventory is centralized in SAP, you can for example count with a PDA and send the counts to SAP through a connector, where they are integrated from an XML file transformed into an IDoc for SAP.
    One important parameter to consider is the inventory freeze. You have to decide whether or not to take into account stock movements that occur after your inventory document is created. If you do, the variance after the counting will depend on the initial stock value and also the movements.
    If you don't, the variance will only take into account your initial stock value, without considering the stock movements.
    Hope it helps you,
    Génia.
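    As a rough sketch of the PDA -> XML -> IDoc step, something like the Java below could parse a count file and emit one flat segment line per counted item. The XML element names and the segment layout are made-up placeholders; a real integration would map to the physical-inventory IDoc type actually used in the project.

        import java.io.File;
        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.w3c.dom.Element;
        import org.w3c.dom.NodeList;

        // Placeholder transformation: count XML in, flat IDoc-style lines out.
        public class CountFileToIdoc {

            public static void main(String[] args) throws Exception {
                // Hypothetical input: <counts store="R100"><item sku="4711" qty="12"/>...</counts>
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder().parse(new File(args[0]));
                Element root = doc.getDocumentElement();
                String store = root.getAttribute("store");

                NodeList items = root.getElementsByTagName("item");
                for (int i = 0; i < items.getLength(); i++) {
                    Element item = (Element) items.item(i);
                    // One flat "segment" line per counted item (invented layout).
                    System.out.printf("E1COUNT %-10s %-18s %10s%n",
                            store, item.getAttribute("sku"), item.getAttribute("qty"));
                }
            }
        }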

  • LDAP configuration for HR Portal in dual stack EHP4 - Best Practice

    Hi Experts,
    We are trying to use the Java stack of the ECC server for the HR Portal (i.e. dual stack) and have applied the EHP4 package for ESS/MSS appraisals. When we try to configure the LDAP ADS datasource through the portal, we are unable to, since the ABAP datasource file is there by default. We are doing this for the HR (ESS/MSS) Portal, to gain access to the object data stored in Active Directory.
    We have already checked note 718383.
    Also, for the scenario LDAP <-> ABAP <-> J2EE, we have already checked the SAP help documentation here:
    http://help.sap.com/erp2005_ehp_04/helpdata/EN/e6/0bfa3823e5d841e10000000a11402f/frameset.htm
    What would now be the best practice for the configuration? Should we go for a separate portal server, or is it possible to use the Java stack of the ECC server?
    Also, please advise whether the LDAP <-> ABAP <-> J2EE scenario is a best practice we can follow. What are the limitations, risks, and issues? Please let us know if this has been implemented and is running well in any live project.
    Do the suggestions apply to load-balanced production servers as well?
    Thanks,
    Rakesh

    Hi,
    the UME datasource must remain ABAP, but you can sync the users between ABAP and LDAP using the LDAP connector:
    http://help.sap.com/saphelp_nw70ehp2/helpdata/en/48/74040175bb501ae10000000a42189b/frameset.htm
    Regards,
    Jozsef

  • Best Practices v3.31 - SAP InfoSet Query connection

    Hi,
    I have a problem adapting a Crystal Report from Best Practices for Business Intelligence v3.31 to our SAP system. The report "Cost Analysis Planned vs. Actual Order Costs.rpt" uses the SAP InfoSet Query CO_OM_OP_20_Q1. This InfoSet Query works fine in SAP. My SAP user has access to the following user groups:
    - /SREP/IS_UG
    - /KYK/IS_UG
    - ZBPBI131_USR
    The Controlling Area in SAP is '1000'. Crystal Reports generates this error message:
    - Failed to retrieve data from the database.
    - Database Connector Error: "Controlling area does not exist"
    - Database Connector Error: 'RFC_CLOSED'
    But the InfoSet Query CO_OM_CA_20_Q1 has no problem with controlling area '1000' in Crystal Reports, and it works fine in SAP!
    Can somebody help?
    Thanks in advance.
    Peter

    Hello Peter,
    I'm using Best Practices for BI v1.31, which also has the report you are talking about.
    I face the same issue when trying to adapt it to my ERP, but if I run it in the SAP GUI it goes smoothly, without any issue.
    Please advise,
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Porto Alegre/Brazil.

  • Problem with Crystal Reports from the All-in-One Best Practice package

    Hi,
    I tried to use the Financial Statements Crystal Report downloaded from the All-in-One Best Practice package, but I encounter the errors below. I followed the guide "Manual Steps for Additional Datasource Creation" to set up the additional data source. When I try to preview the report, even after setting all the parameter values to null, the error messages still appear. I believe this has something to do with the InfoSet, but I am not sure how to resolve it. The reports that connect to InfoSets don't seem to work for me.
    Errors:
    Failed to retrieve data from database. (then I press OK)
    Database connector error: 'no item structure data' (then I press OK)
    Database connector error: 'RFC_Closed'
    Please advise.
    Thank you

    Hi Afzal,
    I think you misunderstood. The Crystal Reports I am talking about are the 23 report templates I got from the All-in-One Best Practice package. All I need to do is follow the "Quick Guide to Implement SAP Best Practices for Business Intelligence V4.31" to make those templates work. The templates that connect to database type SAP Table, Cluster, or Function work for me. The problem I am facing now is with the templates (for example, Financial Statements) that connect to database type SAP InfoSets. The errors I receive are stated in my first post.
    Please advise.

  • Best Practices for creating reports/Dashboards from BW systems

    Hi Gurus,
    What are the best practices for creating BO dashboards / Xcelsius dashboards from BW systems?
    Prasad

    You can use the BICS connector, which leverages BW queries directly. It is listed in the Connection Manager as "SAP NetWeaver BW Connection". You will need both the ABAP and Java stacks, with SSO configured between the two. You will also need SAP GUI and BEx installed on the machine you are doing development on. Note that dashboards using this connection can only be hosted in the NetWeaver Portal for the time being, until the next release of the BI 4.x platform.
    Here are some links on getting started with the BICS connector:
    [Building Fast and Efficient Dashboards with BW and Xcelsius|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d0ab8cce-1851-2d10-d5be-b5147a651c58]
    [Requirements for BICS|http://wiki.sdn.sap.com/wiki/display/BOBJ/prerequisitestoXcelsiusandSAPNetWeaverBW+Connection]

  • Best practice for interface

    I would assume that, with all the different interfaces available in SAP for the various modules, SAP would have published a white paper on how to determine the best interface format for a system and the best practices around it. Can someone share a process or methodology for determining the best interface practice?

    Hueter, I'm not sure I'm following you. Maybe you could post an example. I don't see how to get access to the front panel controls of an FPGA subVI from the host VI - even when configured through the connector pane - without creating complementary controls/indicators on the main FPGA VI block diagram (which seems to defeat the purpose of the exercise). It seemed like an intriguing idea, but it doesn't seem possible to configure a reference to an FPGA subVI while the main FPGA VI is running (only one reference to an FPGA is allowed, at least on cRIOs).
    Jarrod, I am not talking about accessing various bitfiles (only one can run at a time, right?). I am talking about creating standard FPGA interface code that can be used on multiple projects without having to reconstruct the VIs based on changes to the FPGA reference. For example, I have interface code that is used to talk to various sensors through the C Series serial module. Those interface VIs depend on a specific FPGA reference to work. When I create a new project for a cRIO that has other modules besides the 9870, I will inevitably have a new FPGA VI. A subset of the new FPGA VI is identical to the purely serial one (one loop/state machine), but other code will be added to handle the other modules. Now the FPGA reference is different, and all my interface VIs are broken when I try to use the new FPGA reference. I have to change the FPGA reference in my interface VIs and then go through and reselect all the I/O manually. It's not conducive to reusable code.
    I guess one possibility is to create one main FPGA VI that has all the I/O I could ever need, and then just use whatever subVIs I want in it. I believe this is essentially what happens in the Scan Engine: a microcontroller is created on the FPGA with standard I/O, and everything is accessed through the abstracted I/O.
    Chris

  • Server 2008 R2 RDS HA Licensing configuration best practices

    Hello
    What is the best practice for setting up an HA licensing environment for RDS? I'm using a mixture of RDS CALs for my internal/AD users and an External Connector license for my external/Internet users.
    Daddio

    Hi,
    To ensure high availability, you want a fallback license server in your environment. The recommended method for configuring Terminal Services licensing servers for high availability is to install at least two Terminal Services licensing servers in Enterprise mode with available Terminal Services CALs. Each server then advertises itself in Active Directory as an enterprise license server under the following Lightweight Directory Access Protocol (LDAP) path: //CN=TS-Enterprise-License-Server,CN=site name,CN=sites,CN=configuration-container.
    For more details on how to set up your license server environment for redundancy and fallback, see the "Configuring License Servers for High Availability" section in the Windows Server 2003 Terminal Server Licensing whitepaper.
    Regards,
    Dollar Wang
    Forum Support
    Please remember to mark the replies as answers if they help, and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact [email protected]
    Technology changes life…
