Problems integrating Lotus6 with LotusNotesPortlet

Hi,
I've downloaded and installed the latest LotusNotes portlet from the IntegrationSolutions (version_prod_9.0.4.0.0_05062003). Installation and configuration went fine.
Accessing Lotus from the externalApplicationLink works too, as does the TestApp (faq-question#9).
I've reviewed the hints from message=517957 and checked:
* MailServer name does not contain a "-"
* Under Server/Ports/Internet Ports/Web the anonymous value is set to Yes
* SSO is patched to 9.0.2.5.0 (2995671) to use External-IDs in the long form (this includes Bugfix 2425832: External Application always uses GET type of authentication)
* Portal is patched with 2665607 to use External-IDs in the long form
Using the Mail portlet on my page I get this error:
... 4063 Database ckuehnelap/Christian Kuehne!!mail\ckuehne.nsf has not been opened yet NotesException: Database ckuehnelap/Christian Kuehne!!mail\ckuehne.nsf has not been opened yet at lotus.domino.NotesExceptionHelper.read(Unknown Source) ...
This doesn't seem to be the real error, because I still get the same message after opening mail\ckuehne.nsf from the Lotus client.
The only thing that differs from the solution described in message=517957 is the Lotus server release, which in my case is 6.0.3. Not sure, but this might be the problem as well?
Any Ideas?
Thanks in advance
Christian

Changing my plans, I've installed Domino 5 on a separate machine, and everything runs as described in the manuals.
I haven't figured out the real reason yet, but it seems to be one of these:
* Portlets can't run with Domino 6 (they do with Domino 5)
* Portlets can't run with a Domino installation on the same machine (they do with Domino 5 on another computer)
I'll test both during the next days and post the results here.
Regards
Christian

Similar Messages

  • Problem integrating BPEL with OID

    Hey,
    We are setting BPEL up to work with collabsuite mid-tier. When applying the configuration steps in the ContentServices_CustomWorkflows.html provided in the devkit we run into the following problem:
    Change to perform:
    Create the Service-to-Service (S2S) Application Entity for BPEL, as follows:
    Set the CLASSPATH variable:
    CLASSPATH=$ORACLE_HOME/integration/orabpel/system/services/config:
    $ORACLE_HOME/integration/orabpel/system/services/lib/bpm-services.jar:
    $ORACLE_HOME/integration/orabpel/lib/orabpel.jar:$ORACLE_HOME/jlib/repository.jar:
    $ORACLE_HOME/jlib/ldap.jar:$ORACLE_HOME/jlib/ldapjclnt10.jar:
    $ORACLE_HOME/integration/orabpel/lib/bpm-infra.jar:
    $ORACLE_HOME/integration/orabpel/lib/orabpel-common.jar:$CLASSPATH
    Run the following command to create an application entity in Oracle Internet Directory:
    ORACLE_HOME/jdk/bin/java oracle.tip.pc.services.identity.oid.OIDApplicationEntry AppEntity AppSubentity
Trying to run the command results in the following error:
    Exception in thread "main" java.lang.NoClassDefFoundError: oracle.tip.pc.services.identity.oid.OIDApplicationEntry
    at gnu.gcj.runtime.FirstThread.run() (/usr/lib/libgcj.so.5.0.0)
    at JvThreadRun(java.lang.Thread) (/usr/lib/libgcj.so.5.0.0)
    at JvRunMain(java.lang.Class, byte const, int, byte const, boolean) (/usr/lib/libgcj.so.5.0.0)
    at __gcj_personality_v0 (/home/oracle/product/J2EE_101200/jdk/bin/java.version=1.4.2)
    at __libc_start_main (/lib/tls/libc-2.3.4.so)
    at JvRegisterClasses (/home/oracle/product/J2EE_101200/jdk/bin/java.version=1.4.2)
    Anybody any ideas on how to solve the problem?
    Kind regards and thanks in advance,
    Kristof

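One detail worth noting in the stack trace above: the frames mention gnu.gcj and libgcj.so, which means the command was run by the system's GNU gcj runtime rather than the Oracle JDK at $ORACLE_HOME/jdk/bin/java. It can also help to verify that every jar the documented CLASSPATH references actually exists under ORACLE_HOME before running the command. A hedged sketch of such a pre-flight check (the /opt/oracle fallback path is an illustrative assumption, not part of the original instructions):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Hypothetical pre-flight check before running OIDApplicationEntry:
// assemble the documented S2S CLASSPATH from ORACLE_HOME and report
// any jar that is missing.
public class ClasspathCheck {
    public static void main(String[] args) {
        // Assumed fallback; the real value comes from the environment.
        String oh = System.getenv().getOrDefault("ORACLE_HOME", "/opt/oracle");
        String[] entries = {
            "integration/orabpel/system/services/config",
            "integration/orabpel/system/services/lib/bpm-services.jar",
            "integration/orabpel/lib/orabpel.jar",
            "jlib/repository.jar",
            "jlib/ldap.jar",
            "jlib/ldapjclnt10.jar",
            "integration/orabpel/lib/bpm-infra.jar",
            "integration/orabpel/lib/orabpel-common.jar",
        };
        List<String> cp = new ArrayList<>();
        for (String e : entries) {
            File f = new File(oh, e);
            if (!f.exists()) System.err.println("missing: " + f);
            cp.add(f.getPath());
        }
        // Print the assembled classpath, for use with ORACLE_HOME/jdk/bin/java -cp ...
        System.out.println(String.join(File.pathSeparator, cp));
    }
}
```

Passing the printed value via `-cp` to the Oracle JDK's java binary (rather than relying on the shell environment and the default `java` on the PATH) sidesteps both failure modes.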

  • Problem integrating JSF with Spring

Here is faces-config.xml:
<faces-config>
<application>
<el-resolver>org.springframework.web.jsf.el.SpringBeanFacesELResolver</el-resolver>
</application>
<managed-bean>
<managed-bean-name>dateuser</managed-bean-name>
<managed-bean-class>com.datesite.user.DateUser</managed-bean-class>
<managed-bean-scope>session</managed-bean-scope>
<managed-property>
<property-name>entityManagerDateUser</property-name>
<property-class>com.datesite.user.EntityManagerDateUser</property-class>
<value>#{entityManagerDateUser}</value>
</managed-property>
</managed-bean>
</faces-config>
    Here is the deployment exception
javax.faces.FacesException: javax.faces.FacesException: Error performing conversion of value com.datesite.user.EntityManagerDateUser@380b4f9 of type class $Proxy35 to type class com.datesite.user.EntityManagerDateUser for managed bean dateuser.
at com.sun.faces.application.ApplicationAssociate.createAndMaybeStoreManagedBeans(ApplicationAssociate.java:537)
Seems like it can't figure out which class to instantiate?
    Please help.

    I've used the spring integration but I have not encountered this issue.
    It looks to me like Spring is giving out a proxy implementation that cannot be used in place of the actual class. Some things you might want to try:
    1) Configure Spring to create the bean upon initialization instead of waiting for it to be used.
    2) Remove any aspect-oriented configuration from the spring bean (including transaction support) to see if that makes a difference.
    3) Consider the possibility of a class loading conflict, i.e. was the spring container loaded by a different class loader than the one for the web application?
    Finally I would consider posting this on the forums at springframework.org.
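The error message itself ("of type class $Proxy35") points at what item 2 hints: Spring handed JSF a JDK dynamic proxy, which implements the bean's interfaces but is not an instance of the concrete class, so the conversion to com.datesite.user.EntityManagerDateUser fails. Declaring the managed property by an interface type avoids the cast. A minimal self-contained demo of that proxy behavior (the EntityManagerDateUser interface here is an illustrative stand-in, not the poster's real class):

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// A JDK dynamic proxy implements interfaces only: it can be assigned to
// an interface type, but it is never an instance of a concrete class,
// which is exactly why JSF's conversion to the concrete class fails.
public class ProxyDemo {
    interface EntityManagerDateUser { String find(String id); }

    public static void main(String[] args) {
        InvocationHandler h = (proxy, method, a) -> "user:" + a[0];
        EntityManagerDateUser em = (EntityManagerDateUser) Proxy.newProxyInstance(
                ProxyDemo.class.getClassLoader(),
                new Class<?>[] { EntityManagerDateUser.class }, h);
        System.out.println(em.find("42"));      // works through the interface
        // The runtime class is a generated $ProxyN, not a user class:
        System.out.println(em.getClass().getSimpleName().startsWith("$Proxy"));
    }
}
```

The analogous fix in faces-config would be giving `<property-class>` an interface type (or configuring Spring for CGLIB class proxies), though which is appropriate depends on the application.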

  • Problem in starting Integration service with DAC 11g

    Hi friends,
I'm at the step of registering the Integration Service and the Repository Service in DAC 11g. I can start the Repository Service fine in DAC, but I'm facing an issue starting the Integration Service; when trying to test the connection I get a message like:
Failure connecting to BIA_IS!
I'm not sure of the reason for this problem in DAC. I have also set the necessary environment variables like INFA_HOME and INFA_DOMAINS_FILE, referring to the domains.infa file:
INFA_DOMAINS_FILE = C:\Informatica\9.1.0\domains.infa
I also checked the dac_env file, which has the below contents:
    REM -----------------------------------------------------
    REM
    REM ENVIRONMENT VARIABLES THAT YOU MAY NEED TO SET FOR
    REM  PROPER INFORMATICA 8.x or 9.x HANDSHAKE.
    REM
    REM INFORMATICA_SERVER_LOCATION denotes installation of
    REM Informatica components. Example:
    REM C:\Informatica\PowerCenter9.1
    REM
    REM DOMAINS.INFA_FILE_LOCATION denotes the location
    REM (including name) of domains.infa file
    REM
    REM Please make sure to set correct values for variables
    REM mentioned above
    REM
    REM -----------------------------------------------------
    set INFORMATICA_SERVER_LOCATION="C:\Informatica\9.1.0"
    set DOMAINS_INFA_FILE_LOCATION=C:\Informatica\9.1.0\domains.infa
    set INFA_CMD_STYLE=8
    set PATH=C:\Informatica\9.1.0\server\bin;%PATH%
    set INFA_DOMAINS_FILE=%DOMAINS_INFA_FILE_LOCATION%
What could be the problem, and where can I check the log file related to the Integration Service failure in DAC?
    Thanks in advance.
    Regards,
    Saro

    Hi guys,
The issue is sorted out. Below are the two precautions to be considered:
*) Make sure INFA_HOME\server\bin exists at the end of the PATH variable.
*) For each and every change to the PATH variable, it is better to restart the services (both Infa and DAC) then and there for the changes to take effect.
    Regards,
    Saro
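The first precaution above can be checked mechanically. A hedged sketch (both the INFA_HOME and sample PATH values are illustrative, not taken from a real machine):

```java
// Hypothetical check for the first precaution: does PATH end with
// <INFA_HOME>\server\bin? The two values below are illustrative samples.
public class PathCheck {
    public static void main(String[] args) {
        String infaHome = "C:\\Informatica\\9.1.0";
        String path = "C:\\Windows\\system32;C:\\Informatica\\9.1.0\\server\\bin";
        String needed = infaHome + "\\server\\bin";
        System.out.println(path.endsWith(needed) ? "OK" : "missing " + needed);
    }
}
```

On a real box the same comparison would use System.getenv("PATH") and System.getenv("INFA_HOME").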

  • Problem while Integrating workflow with OID.

Hi, I have a problem integrating Workflow with OID. I must execute wf_ldap.synch_all(), but I couldn't find the WF_LDAP package body in the database; there is only the package specification. How can I restore the wf_ldap package?
    Thanks for help..

    The file WFLDAPB.pls should be used to recreate the package body for WF_LDAP (this file is in the wf/sql directory).

  • Problem in integrating xMII with VC

    hi Experts,
    My xMII server Version is <b>11.5</b>
    I followed this document:
    <b>https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40b92770-db81-2a10-8e91-f747188d8033</b> for integrating xMII with VC.
    Connection URL: http://<myXMIIServerName>/Lighthammer/ALL_TEMPLATES?version=<b>11.5</b>&trace=true
    Driver Class Name: <b>com.sap.xmii.custom.integration.jdbc.SAPxMIIJDBCDriver</b>
    System Alias: <b>XMII_JDBC</b>
The connection test is <b>successful</b> after configuring the system.
I am getting an error during <b>FindData Search</b> in VC.
    Inputs given are,
    System: XMII_JDBC (this is my system)
    Look for: Look for a table
The error is: <b>Error-2305 Cannot retrive catalogs and schemas; check the connection to the selected systems Cannot access JDBC metadata</b>
I think I made a mistake in the connection URL, but the connection test is successful.
Please guide me on where I made the mistake.
    Points will be Awarded.
    Thanks & Regards,
    Chinnadurai R

    Hi Sam,
Thanks for your reply... I also think that I made a mistake in the connection URL.
I have created some queries under Samples/Reports.
I gave it like this:
    <b>http://<myXMIIServerName>/Lighthammer/Templates/Samples/Reports?version=11.5&trace=true</b> and  <b>http://<myXMIIServerName>/Lighthammer/Templates/Samples/Reports?</b>
    and then like this
    <b>http://<myXMIIServerName>/Lighthammer/Illuminator/Templates/Samples/Reports?version=11.5&trace=true</b>
I tried using all the URLs shown above, but it is not listing the queries in VC.
Can you guide me on how to form that connection URL?
    Thanks & Regards,
    Chinnadurai R

  • Windows Integrated Security with SSRS, Sharepoint 2013 and SSAS over http

    I have the following setup and problem:
    Sharepoint 2013 with SSRS in Sharepoint integrated mode
    SSAS 2012 SP1 with http access (IIS + msmdpump) enabled on the same box as SSAS
    Every component I have tried works fine with this (PerformancePoint, .bism connections, SSIS packages etc.), connecting over http using Kerberos and windows integrated authentication.
    SSRS (.rsds) connections in Sharepoint fail a connection test when using the same http connection string + Windows integrated authentication which works for everything else. The error is: "Unsupported data format: -> Microsoft.ReportingServices.DataExtensions.AdomdTestConnectionException:
    Unsupported data format:"
SQL Server Profiler shows that the Windows username is reaching the SSAS server in all cases.
    Kerberos delegation is set up for SSAS and is working.
Switching the .rsds connection to saved credentials (same user as I tried with Windows integrated auth) works fine, and the SQL Server Profiler logs look the same as in the Windows integrated case.
So, everything seems to work with Kerberos + http apart from SSRS... any ideas welcome. I did read that SSPI is not supported for http connections, but then again, there are sites which give examples of exactly such connection strings. I can't find any mention of this case or exact problem anywhere.

    For information, this was fixed by applying the .NET 4.5.1 patch as advised by MS support. Now http connections from integrated mode SSRS work ok.

  • Problem in Connecting with the server

I have a problem while setting up e-mail. A message comes up saying that the device had a problem connecting with the server. My BB is a Torch 9860.
    Please help me

    Hi gaf4u
    Welcome to BlackBerry Support Forums
    Do you already have any email account integrated with your device ?
If not, do you have a specific BlackBerry data plan / BIS from your Carrier on your account?
Let us know.

  • Performance degradation in Data Integrator Designer with Multi-user mode

    Post Author: ccastrillon
    CA Forum: Data Integration
We are developing an information system based on a DataMart populated using ETL processes built with Data Integrator. Three of us are developing, and when we work at the same time we start to have performance problems with Designer: it seems to freeze sometimes, and the development process becomes painful.
Where is the problem? Accessing the repository? Is it a known bug?
The Job Server? But it happens even when we don't launch any job, only when building ETL processes and manipulating objects (dataflows, workflows, etc.).
We would appreciate any help. Thanks in advance.
    Carlos Castrilló

    Post Author: bhofmans
    CA Forum: Data Integration
What do you mean by 'working at the same time'? You need 3 different repositories if you want to work with 3 developers, so there should be no impact at all when working simultaneously...
    -Ben.

  • Integrating Apache with Tomcat?

    Anyone have a guide to integrating Apache with Tomcat using mod_jk?
I followed some guides online, but they all seem really dated, with some obscure references. This seems like something that should be at the top of the list....

    Hi Alan.
I've discovered the exact same problem... 0.0.0.0 instead of 127.0.0.1.
You've obviously not had any replies here, but did you end up working it out on your own?
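For the record, a minimal mod_jk setup has only two pieces: a workers.properties file describing where Tomcat's AJP connector listens, and JkMount directives in httpd.conf. A hedged sketch, assuming Tomcat's AJP connector is on 127.0.0.1:8009 (the stock server.xml default, and the place where the 0.0.0.0 vs 127.0.0.1 detail in the reply above matters):

```ini
# workers.properties -- assumes Tomcat's AJP connector listens on
# 127.0.0.1:8009 (adjust host/port to match server.xml)
worker.list=tomcat1
worker.tomcat1.type=ajp13
worker.tomcat1.host=127.0.0.1
worker.tomcat1.port=8009

# httpd.conf -- load the module and hand /examples/* to the worker;
# the module path and URL pattern are illustrative
# LoadModule jk_module modules/mod_jk.so
# JkWorkersFile conf/workers.properties
# JkMount /examples/* tomcat1
```

The worker name after JkMount must match an entry in worker.list, and the host/port must match the AJP `<Connector>` in Tomcat's server.xml.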

  • Integrating OID with BPEL

    I would like to know if anyone has integrated oid with BPEL (standalone version)? Is there any documentation available for the same?

The only files that I came across and made changes to are is_config and jazn.xml. It would be great if you could tell me whether there are any other files that need to be configured. I copied ldapclnt10.jar and ldap.jar into
...\BPEL\integration\orabpel\system\appserver\oc4j\j2ee\home\applib as suggested in another discussion forum, but that doesn't seem to solve the problem. I am now getting a NoClassDefFoundError when I try to access the worklist.

  • Any kind of integration experience with Weblogic JMS and Oracle AQ?

    Hi,
In my company I work with Java developers who believe in some kind of "holy" database independence I don't understand, and as a result my life as a database developer is hell on earth. Yesterday we again started to discuss, this time where to log; they believe the database is slow and prefer logging to the filesystem. After some hours I could finally convince them to use the database for some operational and reporting needs, done in an asynchronous way which won't slow them down. After all, I believe the reason for a database is data; this is the place where data lives, and with the correct design and implementation, logging to the database would perform better.
    I love Oracle features, and know that we paid a lot for this software, so today I started investigating this promised solution. And quickly I found AQ and JMS topics in the documentation :)
    After this introduction here is my problem; my company use BEA Weblogic as application server and the java guys want AQ to automatically (but of course with some delay) take their JMS log requests into database tables. Does any one have similar application experience, or any kind of integration experience with Weblogic JMS and Oracle AQ?
    Any comments, references, documentation, sample code, url will be most welcomed :)
    Thank you very much!
    Tonguc
    [email protected]
References I found up to now:
    Oracle® Streams Advanced Queuing Java API Reference 10g Release 2 (10.2) http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14291/toc.htm
    (Packages; javax.jms & oracle.jms)
    Oracle® Streams Advanced Queuing User's Guide and Reference 10g Release 2 (10.2) http://download-uk.oracle.com/docs/cd/B19306_01/server.102/b14257/toc.htm
Part IV, "Using Oracle JMS and Oracle Streams AQ", describes how to use Oracle JMS and Oracle Streams Advanced Queuing (AQ).
    Chapter 11, "Introducing Oracle JMS"
    Chapter 12, "Oracle JMS Basic Operations"
    Chapter 13, "Oracle JMS Point-to-Point"
    Chapter 14, "Oracle JMS Publish/Subscribe"
    Chapter 15, "Oracle JMS Shared Interfaces"
    Chapter 16, "Oracle JMS Types Examples"
Sample code from OTN:
    http://www.oracle.com/technology/sample_code/tech/java/web_services/jmsws/NewsQueueEJB.java.html

I wouldn't go as far as to say Oracle AQ is outdated today. However, it is indeed a proprietary technology that did not find much mainstream adoption in the earlier years after its introduction. The advent of JMS made it somewhat more useful (or should I say intriguing, because more people are trying to tie it together with other J2EE technologies), but Oracle's JMS wrapper classes in aqapi.jar were not feature-complete for a long while, so using it outside Oracle's application server was painful, if not impossible. I do agree that the info at the dev2dev JMS newsgroup or in this forum is highly fragmented, as neither Oracle nor BEA provides an official solution to integrate AQ with WebLogic, so people like us have to learn the technology through experimentation and in a piecemeal fashion.
    3 years ago I was literally "playing around" - we had a polling mechanism set up to use triggers to write Oracle data changes into an event table, and had a Java-based daemon to scan that table and publish events as JMS messages to the WebLogic JMS server. This continues to work reliably till today, but I was looking for a solution that has few parts - I wanted to hook up my WebLogic MDB directly to AQ as a foreign JMS provider. Although I was able to get it to work (including XA), there were a few hoops I had to jump through, such as decompiling the Oracle AQjms classes to make them bind to the WebLogic JNDI tree.
One year after that, Diptanshu Parui took what I did a giant step further: he extended the Oracle AQjms classes to allow them to be bound to the WebLogic JNDI tree naturally, and he figured out how to use the WebLogic JMS messaging bridge to re-send single-threaded AQ JMS messages to clustered JMS queues, which allows concurrent message consumption by multiple instances of MDBs. My impression is that he is using that setup in a production environment.
    I am sure you are aware of it but I would like to make it clear - in order to use AQ as a foreign JMS provider to WebLogic-hosted MDB's, you don't need to update your database to Oracle 10g - Oracle 8i is good enough (although I recommend at least 9i Release 2). It is not the database engine, but rather the aqapi.jar JMS wrapper on top of AQ that matters. I do recommend that you use aqapi.jar from Oracle Application Server 10.0.3 or up for better XA support, among other things. Again, you don't have to replace WebLogic with Oracle AS - you only need a single jar file from it and put it in your WebLogic's classpath. However, I don't know what this means from a licensing point of view if you ever go to production - do you have to pay the full price of OracleAS or OC4J just to use the aqapi.jar?
    In the coming days I will test the latest aqapi.jar to see how much progress Oracle has made in terms of making their J2EE products more spec-compliant :-).
    Hope the above gives you a different perspective.
    Eric
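The trigger/event-table/daemon pattern Eric describes can be sketched in a few lines. This is purely illustrative: an in-memory Deque stands in for the Oracle event table and a Consumer for the WebLogic JMS producer; a real daemon would poll the table over JDBC and publish via javax.jms.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Consumer;

// Sketch of the polling pattern: a trigger appends change events to an
// event table; a daemon scans the table and publishes each event as a
// JMS message. The Deque and Consumer are illustrative stand-ins.
public class EventDaemon {
    private final Deque<String> eventTable = new ArrayDeque<>();
    private final Consumer<String> jmsPublisher;

    EventDaemon(Consumer<String> jmsPublisher) { this.jmsPublisher = jmsPublisher; }

    // What the database trigger would do: record the change.
    void triggerFired(String change) { eventTable.add(change); }

    // One scan pass of the daemon: drain and publish every pending event.
    void pollOnce() {
        String ev;
        while ((ev = eventTable.poll()) != null) jmsPublisher.accept(ev);
    }

    public static void main(String[] args) {
        EventDaemon d = new EventDaemon(ev -> System.out.println("published: " + ev));
        d.triggerFired("ORDER 42 UPDATED");
        d.pollOnce();
    }
}
```

The appeal of the pattern is exactly what the post says: it is decoupled and reliable, at the cost of having more moving parts than hooking an MDB directly to AQ as a foreign JMS provider.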

  • Problem integrating EMC CX3-20 to Sun T5220 server - Wrong partition?

    Hi All,
    I have a problem integrating the EMC CX3-20 to Sun Sparc Enterprise T5220 server.
    In the EMC, two LUNs were created. 1.6TB and 10G were allocated respectively. They were represented by emcpower0a and emcpower1a in Solaris 10.
    I'm wondering why the format -> partition command shows different information for the two LUNs.
    The partition command on emcpower0a shows the "total disk sector available". But the command on emcpower1a shows "the total disk cylinder available"
I can't use the disk emcpower0a in Veritas because slice 2 (s2) is not displayed when the command "vxdisk list" is executed; it shows "emcpower0" instead of "emcpower0s2".
    Can someone let me know how to solve this problem? Appreciate your help. Thanks.
    Regards,
    Ryan
    bash-3.00# format
    Searching for disks...done
    AVAILABLE DISK SELECTIONS:
    0. c1t0d0 <SUN146G cyl 14087 alt 2 hd 24 sec 848>
    /pci@0/pci@0/pci@2/scsi@0/sd@0,0
    1. c1t1d0 <SUN146G cyl 14087 alt 2 hd 24 sec 848>
    /pci@0/pci@0/pci@2/scsi@0/sd@1,0
    2. c1t2d0 <SUN146G cyl 14087 alt 2 hd 24 sec 848>
    /pci@0/pci@0/pci@2/scsi@0/sd@2,0
    3. c1t3d0 <SUN146G cyl 14087 alt 2 hd 24 sec 848>
    /pci@0/pci@0/pci@2/scsi@0/sd@3,0
    4. c2t0d0 <DGC-RAID10-0324 cyl 32766 alt 2 hd 64 sec 10>
    /pci@0/pci@0/pci@8/pci@0/pci@1/SUNW,qlc@0/fp@0,0/ssd@w5006016941e0b4fe,0
    5. c2t0d1 <DGC-RAID 10-0324-1.60TB>
    /pci@0/pci@0/pci@8/pci@0/pci@1/SUNW,qlc@0/fp@0,0/ssd@w5006016941e0b4fe,1
    6. c3t0d0 <DGC-RAID10-0324 cyl 32766 alt 2 hd 64 sec 10>
    /pci@0/pci@0/pci@8/pci@0/pci@9/SUNW,qlc@0/fp@0,0/ssd@w5006016141e0b4fe,0
    7. c3t0d1 <DGC-RAID 10-0324-1.60TB>
    /pci@0/pci@0/pci@8/pci@0/pci@9/SUNW,qlc@0/fp@0,0/ssd@w5006016141e0b4fe,1
    *8. emcpower0a <DGC-RAID 10-0324-1.60TB>*
    */pseudo/emcp@0*
    9. emcpower1a <DGC-RAID10-0324 cyl 32766 alt 2 hd 64 sec 10>
    /pseudo/emcp@1
    *[emcpower0a]*
    partition> print
    Current partition table (original):
    Total disk sectors available: 3433021406 + 16384 (reserved sectors)
    Part Tag Flag First Sector Size Last Sector
    0 root wm 34 128.00MB 262177
    1 swap wu 262178 128.00MB 524321
    2 unassigned wm 0 0 0
    3 unassigned wm 0 0 0
    4 unassigned wm 0 0 0
    5 unassigned wm 0 0 0
    6 usr wm 524322 1.60TB 3433021405
    8 reserved wm 3433021406 8.00MB 3433037789
    *[emcpower1a]*
    partition> print
    Current partition table (original):
    Total disk cylinders available: 32766 + 2 (reserved cylinders)
    Part Tag Flag Cylinders Size Blocks
    0 unassigned wm 0 0 (0/0/0) 0
    1 unassigned wm 0 0 (0/0/0) 0
    2 backup wu 0 - 32765 10.00GB (32766/0/0) 20970240
    3 - wu 1 - 4 1.25MB (4/0/0) 2560
    4 - wu 5 - 32765 10.00GB (32761/0/0) 20967040
    5 unassigned wm 0 0 (0/0/0) 0
    6 unassigned wm 0 0 (0/0/0) 0
    7 unassigned wm 0 0 (0/0/0) 0
    bash-3.00# vxdisk list
    DEVICE TYPE DISK GROUP STATUS
    c1t0d0s2 auto:sliced rootdisk rootdg online
    c1t1d0s2 auto:sliced rootmirror rootdg online
    c1t2d0s2 auto:none - - online invalid
    c1t3d0s2 auto:none - - online invalid
    emcpower0    auto:none       -            -            online invalid
    emcpower1s2 auto:sliced ora1dgemcpower1 ora1dg online

    This is related to the size of the 1.6 TB device. The different disk label you are getting is the newer EFI disk label, which Solaris requires on all devices larger than 1 TB. SunSolve doc 216611 explains the differences between the older VTOC label and the EFI label, and links to the VxVM 4.1 Release Notes for Solaris:
    http://seer.entsupport.symantec.com/docs/275690.htm
    Veritas began support for EFI-labelled devices with VxVM 4.1, for Solaris 10 only, but you must be on Sol 10 since you're on a 5220.
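The arithmetic behind the two labels checks out against the listing above: a VTOC (SMI) label can address at most 2^31 - 1 512-byte sectors, and the 1.6 TB LUN's sector count from the emcpower0a partition print exceeds that, which is why Solaris put an EFI label on it while the 10 GB LUN keeps the cylinder-based VTOC. A quick sketch:

```java
// Why emcpower0a carries an EFI label: its sector count (from the
// partition print above) exceeds what a VTOC label can address.
public class LabelLimit {
    public static void main(String[] args) {
        long sectors = 3433021406L;          // emcpower0a: total disk sectors
        long vtocMax = (1L << 31) - 1;       // VTOC limit: 2^31 - 1 sectors
        System.out.println(sectors > vtocMax);               // true -> EFI label required
        double tib = sectors * 512.0 / (1L << 40);           // bytes -> binary TB
        System.out.println(Math.round(tib * 100.0) / 100.0); // matches format's "1.60TB"
    }
}
```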

  • Integrating cFolders with Portal-KM using WebDAV manager

    Hello All,
    Requirement: Integrating cFolders with Portal-KM using WebDAV manager
    Settings done :
1) The cProject WebDAV URL is working through a WebDAV client (software name: AnyClient).
The URL is http://<host name>:<port>/cfol450/dpr
The cFolders URL is not working through the WebDAV client; Basis is investigating that.
    2) Created HTTP system for "cProject " with URL. http://<host name>:<port>
    3) Created WebDAV repository with system path /cfol450/dpr
    4) Created system in portal for user mapping and mapping is done.
    Problem Statement:
The integrated repository is not visible in KM content. When I go to Monitoring -> KM -> Component Monitoring -> Repository Name,
the server shows "not started" as its status.
Troubleshooting attempt: I tried changing the log level of slimrequest to debug, but there is no entry in the log file to point to the error.
    Help appreciated.
    Regards,
    Amol Ghodekar.


  • Problems using iCloud with Mountain Lion on iMac8,1

Problems using iCloud with Mountain Lion on an iMac8,1, about 5 years old: the iMac gets slower and slower, the total free memory is used up within minutes, and then there is no more reaction to input.

Download http://codykrieger.com/gfxCardStatus, open it, and select Integrated Only. It's a bug with NVIDIA graphics cards.
