RenewTGT using native cache with KRB5: Message stream modified

I have implemented the JAAS tutorial JaasAcn.java and it works fine when a username and password are entered.
However, when I try to use the ticket from the native cache I get the following error:
     Ticket could not be renewed : Message stream modified (41)     
and then I am prompted to enter the username followed by the password. If I enter the username
and password, the authentication is successful.
I am using a Windows XP SP2 client, JDK 1.5, and a Win2000 server.
The registry key "allowtgtsessionkey" on the client has been set to 0x01 as recommended by
C:\jdk15help\docs\guide\security\jgss\tutorials\Troubleshooting.html
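For reference, the setting that troubleshooting guide describes is a DWORD under the LSA Kerberos parameters key; a sketch of creating it from a command prompt (key path as documented for Windows XP/2003 — verify it for your OS before relying on it):

```
reg add HKLM\SYSTEM\CurrentControlSet\Control\Lsa\Kerberos\Parameters /v allowtgtsessionkey /t REG_DWORD /d 0x01
```

A reboot or at least a fresh logon may be needed before the LSA starts exporting the TGT session key.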
The "Message stream modified" error seems to imply that the checksum used to verify the data packet
didn't match what was expected, or that some packets are being corrupted.
I tried different default_checksum settings, but to no avail.
Has anyone encountered this before?
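In case it is useful to others hitting the same error: since "Message stream modified (41)" often boils down to an enctype/checksum mismatch between the JRE and the KDC, one thing to experiment with is pinning the encryption types explicitly in krb5.ini. A sketch only — the realm name is a placeholder, and which types to list depends on what your KDC supports:

```
[libdefaults]
    default_realm = YOURDOMAIN.COM
    default_tkt_enctypes = rc4-hmac des-cbc-md5 des-cbc-crc
    default_tgs_enctypes = rc4-hmac des-cbc-md5 des-cbc-crc
```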
My config file is as follows:
JaasSample {
    com.sun.security.auth.module.Krb5LoginModule required
        useTicketCache=true
        renewTGT=true
        debug=true;
};
and the debug log:
Debug is true storeKey false useTicketCache true useKeyTab false doNotPrompt false ticketCache is null KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
Acquire TGT from Cache
Ticket could not be renewed : Message stream modified (41)
Principal is null
null credentials from Ticket Cache
Kerberos username [myname]:
Kerberos password for myname: mypassword
          [Krb5LoginModule] user entered username: myname
principal is [email protected]
Acquire TGT using AS Exchange
EncryptionKey: keyType=3 keyBytes (hex dump)=0000: 38 1A 02 C8 EA 5B EA 67
EncryptionKey: keyType=1 keyBytes (hex dump)=0000: 38 1A 02 C8 EA 5B EA 67
EncryptionKey: keyType=16 keyBytes (hex dump)=0000: D5 F1 D5 DF AD 67 37 5D 3B AB A2 AE 89 1F 13 F1 .....g7];.......
0010: FB EA AB AB D3 52 40 D6
Commit Succeeded


Similar Messages

  • Manually refreshing TGT leads to "Message stream modified" error

    We wish to use Kerberos to implement application authentication without needing username/password. We have code which gets the TGT and can get other tickets from that, and those tickets can successfully be used with LDAP to make queries. However, the TGT will expire after 10 hours, unless (for example) the lockscreen is used to supply a username/password, at which point the TGT is renewed.
    We understand that it should be possible to renew the TGT from the application, so that long-running processes do not require another login. The ticket's metadata says it is renewable, and we have set options to say we want to do this. We get hold of the ticket and do refresh(), and this attempts to renew the Credentials. However, the attempt fails with the following exception:
    javax.security.auth.RefreshFailedException: Failed to renew Kerberos Ticket for client [email protected] and server krbtgt/[email protected] - Message stream modified (41)
    Having traced the executing code down through the debugger: it goes down through KerberosTicket.refresh() and Credentials.renew(), constructs a new EncryptedData, and calls Cipher.getInstance() with transformation "DES/CBC/NoPadding". This gets down to a Provider, which raises the exception.
    We have checked that we only have one matched pair for serviceprincipalname.
    So the questions are: (a) should it be possible to refresh the TGT in this way, or is it simply impossible to refresh it without a username and password?; (b) if it should be possible, what is the likely cause of the exception?
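    The renewal rule being exercised above can be sketched in isolation. This is an illustrative helper, not JDK API (the real renewal is javax.security.auth.kerberos.KerberosTicket.refresh()); the class and method names here are made up for the example: a renewable ticket can only be refreshed while the current time is still before its renewTill limit, and past that point only a fresh AS exchange (username/password or keytab) can obtain a new TGT.

    ```java
    import java.util.Date;

    // Sketch: decide whether a renewable Kerberos ticket can still be
    // renewed, or whether a fresh AS exchange is needed instead.
    // (Illustrative helper only; with a real ticket in hand the actual
    // renewal call is kerberosTicket.refresh(), which may throw
    // RefreshFailedException.)
    public class RenewCheck {
        // A renewable ticket can be refreshed only while "now" is before
        // its renewTill limit; a ticket with no renewTill is not renewable.
        public static boolean canStillRenew(Date renewTill, Date now) {
            return renewTill != null && now.before(renewTill);
        }

        public static void main(String[] args) {
            Date oneHourAhead = new Date(System.currentTimeMillis() + 3600_000L);
            Date now = new Date();
            System.out.println(canStillRenew(oneHourAhead, now)); // before the limit: refresh() may succeed
            System.out.println(canStillRenew(now, oneHourAhead)); // past the limit: need a new AS exchange
        }
    }
    ```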
    A debug trace of an attempt looks like:
    H:\support\users\paulw\Workspace\kerberos\bin>java -Dsun.security.krb5.debug=true -Djava.security.krb5.realm=NMS.DEV.PS.GE.COM -Djava.security.krb5.kdc=UKCBGDC01DFPS -Djava.security.auth.login.config=jaas.conf JaasAcn
    KinitOptions cache name is C:\Documents and Settings\paulw\krb5cc_paulw
    Acquire default native Credentials
    Obtained TGT from LSA: Credentials:
    [email protected]
    server=krbtgt/[email protected]
    authTime=20090127152629Z
    startTime=20090127152629Z
    endTime=20090128012629Z
    renewTill=20090203152629Z
    flags: FORWARDABLE;RENEWABLE;INITIAL;PRE-AUTHENT
    EType (int): 23
    Authentication succeeded!
    [email protected]
    Start time = Tue Jan 27 15:26:29 GMT 2009, Expires = Wed Jan 28 01:26:29 GMT 2009, isCurrent = true, isInitial=true, [email protected], ServerPrincipal=krbtgt/[email protected]
    Using builtin default etypes for default_tgs_enctypes
    default etypes for default_tgs_enctypes: 3 1 23 16 17.
    CksumType: sun.security.krb5.internal.crypto.RsaMd5CksumType
    EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
    KrbKdcReq send: kdc=UKCBGDC01DFPS UDP:88, timeout=30000, number of retries =3, #bytes=1799
    KDCCommunication: kdc=UKCBGDC01DFPS UDP:88, timeout=30000,Attempt =1, #bytes=1799
    KrbKdcReq send: #bytes read=106
    KrbKdcReq send: #bytes read=106
    KDCRep: init() encoding tag is 126 req type is 13
    KRBError:sTime is Tue Jan 27 16:00:42 GMT 2009 1233072042000
    suSec is 717480
    error code is 52
    error Message is Response too big for UDP, retry with TCP
    realm is NMS.DEV.PS.GE.COM
    sname is krbtgt/NMS.DEV.PS.GE.COM
    msgType is 30
    KrbKdcReq send: kdc=UKCBGDC01DFPS TCP:88, timeout=30000, number of retries =3, #bytes=1799
    DEBUG: TCPClient reading 1783 bytes
    KrbKdcReq send: #bytes read=1783
    KrbKdcReq send: #bytes read=1783
    EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
    err: javax.security.auth.RefreshFailedException: Failed to renew Kerberos Ticket for client [email protected] and server krbtgt/[email protected] - Message stream modified (41)
    The problem is reproducible with a simple Java client and has been reproduced on two machines within the corporate network using different KDCs. It is also reproducible on a customer's site where the ticket expiry is 1 hour.
    thanks
    paul

    If it helps, here are the client code and config file.
    If someone has a working system and could confirm whether the general principle here is correct, that would be much appreciated.
    thanks
    paul
    java -Dsun.security.krb5.debug=true -Djava.security.krb5.realm=<realm-name> -Djava.security.krb5.kdc=<kdc-name> -Djava.security.auth.login.config=jaas.conf JaasAcn
    /*
     * @(#)JaasAcn.java
     * Copyright 2001-2002 Sun Microsystems, Inc. All Rights Reserved.
     * Redistribution and use in source and binary forms, with or
     ...snip
     * intended for use in the design, construction, operation or
     * maintenance of any nuclear facility.
     */
    import java.util.*;
    import javax.security.auth.kerberos.*;
    import javax.security.auth.*;
    import javax.security.auth.login.*;
    import com.sun.security.auth.callback.TextCallbackHandler;
    import java.security.*;
    import javax.security.auth.Subject;
    /**
     * This JaasAcn application attempts to authenticate a user
     * and reports whether or not the authentication was successful.
     */
    public class JaasAcn {
        public static void main(String[] args) {
            // Obtain a LoginContext, needed for authentication. Tell it
            // to use the LoginModule implementation specified by the
            // entry named "JaasSample" in the JAAS login configuration
            // file and to also use the specified CallbackHandler.
            LoginContext lc = null;
            try {
                lc = new LoginContext("JaasSample", new TextCallbackHandler());
            } catch (LoginException le) {
                System.err.println("Cannot create LoginContext. " + le.getMessage());
                System.exit(-1);
            } catch (SecurityException se) {
                System.err.println("Cannot create LoginContext. " + se.getMessage());
                System.exit(-1);
            }
            System.out.println("Time is now " + new Date());
            try {
                // attempt authentication
                lc.login();
            } catch (LoginException le) {
                System.err.println("Authentication failed:");
                System.err.println("  " + le.getMessage());
                System.exit(-1);
            }
            System.out.println("Authentication succeeded!");
            Subject mSubject = lc.getSubject();
            Iterator li = mSubject.getPrincipals().iterator();
            // Should only have one Principal
            if (li.hasNext()) {
                Principal lPrincipal = (Principal) li.next();
                System.out.println(lPrincipal.toString());
            }
            li = mSubject.getPrivateCredentials().iterator();
            if (li.hasNext()) {
                Object lObject = li.next();
                if (lObject instanceof KerberosTicket) {
                    KerberosTicket lKerberosTicket = (KerberosTicket) lObject;
                    System.out.println(
                        "Start time=" + lKerberosTicket.getStartTime() +
                        ", Expires=" + lKerberosTicket.getEndTime() +
                        ", RenewUntil=" + lKerberosTicket.getRenewTill() +
                        ", isCurrent=" + lKerberosTicket.isCurrent() +
                        ", isRenewable=" + lKerberosTicket.isRenewable() +
                        ", isInitial=" + lKerberosTicket.isInitial() +
                        ", ClientPrincipal=" + lKerberosTicket.getClient().toString() +
                        ", ServerPrincipal=" + lKerberosTicket.getServer().toString());
                    try {
                        lKerberosTicket.refresh();
                    } catch (RefreshFailedException e) {
                        System.err.println("err: " + e);
                    }
                }
            }
        }
    }
    /*
     * Login Configuration for the JaasAcn and
     * JaasAzn Applications
     */
    JaasSample {
        com.sun.security.auth.module.Krb5LoginModule required
        [email protected]
        doNotPrompt=true
        useKeyTab=true
        useTicketCache=true
        renewTGT=true
        debug=true;
    };

  • Error from sample JAAS client: Message stream modified (41)

    I am trying to follow the tutorial for JAAS Authentication located here:
    http://java.sun.com/j2se/1.4.2/docs/guide/security/jgss/tutorials/AcnOnly.html
    I am trying to run the sample client JaasAcn.java but am getting a strange error when I try to log on to my Active Directory.
    I am using Java version: jre1.6.0_03
    I can login to Active Directory fine with the credentials I am providing, just not with this client, so I know the credentials are valid.
    Here is the error I get that I don't understand. Any suggestions would be very helpful.
    The Error message is: [Krb5LoginModule] authentication failed
    Message stream modified (41)
    Here is the full output:
    C:\Progra~1\Java\jre1.6.0_03\bin\java -Dsun.security.krb5.debug=true -Djava.security.krb5.realm=PRSDev.local -Djava.security.krb5.kdc=192.168.40.72 -Djava.security.auth.login.config=jaas.conf JaasAcn
    Debug is true storeKey false useTicketCache false useKeyTab false doNotPrompt false ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
    Kerberos username [ILea]: sra
    Kerberos password for sra:
    [Krb5LoginModule] user entered username: sra
    Using builtin default etypes for default_tkt_enctypes
    default etypes for default_tkt_enctypes: 3 1 23 16 17.
    Acquire TGT using AS Exchange
    Using builtin default etypes for default_tkt_enctypes
    default etypes for default_tkt_enctypes: 3 1 23 16 17.
    KrbAsReq calling createMessage
    KrbAsReq in createMessage
    KrbKdcReq send: kdc=192.168.40.72 UDP:88, timeout=30000, number of retries =3, #bytes=144
    KDCCommunication: kdc=192.168.40.72 UDP:88, timeout=30000,Attempt =1, #bytes=144
    KrbKdcReq send: #bytes read=202
    KrbKdcReq send: #bytes read=202
    KDCRep: init() encoding tag is 126 req type is 11
    KRBError:sTime is Mon Dec 31 11:56:40 PST 2007 1199131000000
    suSec is 884978
    error code is 25
    error Message is Additional pre-authentication required
    realm is PRSDev.local
    sname is krbtgt/PRSDev.local
    eData provided.
    msgType is 30
    Pre-Authentication Data:PA-DATA type = 11
    PA-ETYPE-INFO etype = 23
    Pre-Authentication Data:PA-DATA type = 2
    PA-ENC-TIMESTAMP
    Pre-Authentication Data:PA-DATA type = 15
    AcquireTGT: PREAUTH FAILED/REQUIRED, re-send AS-REQ
    Using builtin default etypes for default_tkt_enctypes
    default etypes for default_tkt_enctypes: 3 1 23 16 17.
    Pre-Authentication: Set preferred etype = 23
    KrbAsReq salt is PRSDev.localsra
    Pre-Authenticaton: find key for etype = 23
    AS-REQ: Add PA_ENC_TIMESTAMP now
    EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
    KrbAsReq calling createMessage
    KrbAsReq in createMessage
    KrbKdcReq send: kdc=192.168.40.72 UDP:88, timeout=30000, number of retries =3, #bytes=210
    KDCCommunication: kdc=192.168.40.72 UDP:88, timeout=30000,Attempt =1, #bytes=210
    KrbKdcReq send: #bytes read=1182
    KrbKdcReq send: #bytes read=1182
    EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
    [Krb5LoginModule] authentication failed
    Message stream modified (41)
    Authentication failed:
    Message stream modified (41)

    FYI, I have fixed this problem (and moved on to the next error).
    I disabled the pre-authentication requirement on the Active Directory account according to this article:
    http://technet2.microsoft.com/windowsserver/en/library/a0bd7520-ef2d-4de4-b487-e105a9de9e4f1033.mspx?mfr=true

  • How to use NativeProcess with writeUTF()?

    Please provide an example of using NativeProcess with the writeUTF() method.
    I want to call my .js file:
    "#import /D/Sunil/Flex_WorkSpace/YSIPrototype/bin-debug/data/PSAction/FinishLayoutToPhotoshop.js'; ";

    Hi Sunil,
    You'll probably get better answers to this over in the Using Flash Builder forum:
    http://forums.adobe.com/community/flash_builder/using_flash_builder?view=discussions
    -Chris

  • Using system cache with AIR...

    Hi,
    I have been monitoring calls to download media from an AIR iOS app via Charles sniffer. What I am seeing is repeated downloads of the identical files, each and every time they are requested. Obviously, with large data lists, this represents a huge amount of data that shouldn't need to be redownloaded a second time. In this case, I have checked that the cache headers are "no-cache", so I will take that up with the server developers, but I am curious: do image downloads via AIR cache downloaded data using the system's browser cache or is it separate somehow?
    G


  • Use native plugins with Flash Builder 4.5?

    Hello,
    Is it possible to extend Flash Builder 4.5 to have native plugins?
    My use case is: we have developed SDKs for both iOS and Android where we have extensive signal processing implemented in C. On Android we use JNI to access and run the native code. That's done for performance reasons since real time is a requirement.
    Some other tools like Appcelerator Titanium support native modules (e.g. http://wiki.appcelerator.org/display/guides/Module+Developer+Guide+for+iOS)
    Is it possible to do the same with FlashBuilder? Or the only way is to develop in ActionScript? What are the benchmarks for typical video, audio signal processing (e.g. through classical domain example (filtering))?
    Thanks!
    Greg

    Hi,
    A Flash Builder 4.5 serial number will not work directly in Flash Builder 4.7. You can use an FB 4.5 key in an upgrade scenario (upgrading from FB 4.5 to FB 4.7).
    Please use the Flash Builder 4.7 serial number provided at http://www.adobe.com/cfusion/entitlement/index.cfm?e=labs_flashbuilder4-7  for serializing Flash Builder 4.7.
    -Mugdha

  • Using OWB mappings with Oracle CDC/Streams and LCRs

    Hi,
    Has anyone worked with Oracle Streams and OWB? We're looking to leverage Streams to update our data warehouse using Streams to apply changes from the transactional/source DB. At some point we seem to remember hearing that OWB could leverage Streams, perhaps even using the Logical Change Records (LCRs) from Streams as input to mappings?
    Any thoughts much appreciated.
    Thanks,
    Jim Carter

    Hi Jim,
    We've built a fairly complex solution based on streams. We wanted to break up the various components into separate entities so that any network failure or individual component failure wouldn't cause issues for the other components. So, here goes:
    1) The OLTP source database is streaming LCR's to our Datawarehouse where we keep an operational copy of production, updated daily from those streams. This allows for various operational reports to be run/rerun in a given day with the end-of-yesterday picture without impacting the performance on the source system.
    2) Our apply process on the datamart side actually updates TWO copies of data. It does a default apply to our operational copy of production, and each of those tables have triggers that put a second copy of the data into daily partitioned tables. So, yesterday's partitions has only the data that was actually changed yesterday. After the default apply, we walk the Oracle dependency tree to fill in all of the supporting information so that yesterday's partition includes all the data needed to run our ETL queries for that day.
    Example: Suppose yesterday an address for a customer was updated. Streams only knows about the change to the address record, so the automated process would only put that address record into the daily partition. The dependency walk fills in the associated customer, date of birth, etc. data into that partition so that the partition holds all of the related data to that address record for updates without having to query against the complete tables. By the same token, a change to some other customer info will backfill in the address record for this customer too.
    Now, our ETL queries run against views created against these partitioned tables so that they are only looking at the data for that day (the view s_address joins from our control tables to the partitioned address table so that we are only seeing one day's address records). This means that the ETL is running against the minimal subset of data required to update dimensions and create facts. It also means that, for example, if there is a problem with the ETL we can suspend the ETL while we fix the problem, and the streaming process will just go on filling partitions until we are ready to re-launch ETL and catch up - one day at a time. We also back up the data mart after each load so that, if we discover an error in ETL logic and need to rebuild, we can restore the datamart to a given day and then reprocess the daily partitions in order very simply.
    We have added control fields in those partitioned tables that show which record was inserted, updated, or deleted in production, and which was added by the dependency walk so, if necessary, our ETL can determine which data elements were the ones that changed. As we do daily updates to the data mart as our finest grain, this process may update a given record in a given partition multiple times so that the status of this record at the end of the day in that daily partition shows the final version of that record for the day. So, for example, if you add an address record and then update it on the same day, the partition for that day will show the final updated version of the record, and the control field will show this to be a new inserted record for the day.
    This satisfies our business requirements. Yours may be different.
    We have a set of control tables which manage what partition is being loaded from streams, and which have been loaded via ETL to the datamart. The only limitation is that, of course, the ETL load can only go as far as the last partition completely loaded and closed from streams. And we manage the sizing of this staging system by pruning partitions.
    Now, this process IS complex, and requires a fair chunk of storage, but it provides us with the local daily static copy of the OLTP system for running operational reports against without impacting production, and a guaranteed minimal subset of the OLTP system for speedy ETL runs.
    As for referencing LCRs themselves, we did not go that route due to the dependency issues (one single LCR will almost never include all of the dependent data from which to update a dimension record or build a fact record, so we would have had to constantly link each one with the full data set to get all of that other info).
    Anyway - just thought our approach might give you some ideas as you work out your own approach.
    Cheers,
    Mike

  • Using Parallelism, Cache with Oracle Applications Core Tables

    Hi all,
    I want to know if I can put some tables in parallel and some tables in cache in an Oracle Applications environment without any problems. Is the procedure to change these tables the same as on a standalone database? Just use a command like "ALTER TABLE table_name PARALLEL ..."?
    Do I need to change anything at the Applications level after the changes I will make to these tables?
    Tks,
    Paulo

    You can cache read-only PL/SQL stored procedures in the DB Cache. I'm not sure about db built-in packages, but if they are read-only, should be ok.
    All DB Cache management functionality is available from DBA Studio. You can also use the supplied dbms_icache PL/SQL package to manage the cache. Refer to the DB Cache Concepts & Admin Guide for details.
    DB Cache is strictly a cache for read-only queries. All updates are passed to the origin db.

  • I'm trying to use Kerberos V5 with Active Directory but get an error

    I'm trying to use Kerberos V5 with Active Directory. I'm using simple code from previous posts, but
    when I try with a correct username/password I get:
    Authentication attempt failedjavax.security.auth.login.LoginException: Message stream modified (41)
    when I try an incorrect username/password I get:
    Pre-authentication information was invalid (24)
    Debug info is :
    Debug is  true storeKey false useTicketCache false useKeyTab false doNotPrompt false ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
    Kerberos username [naiden]: naiden
    Kerberos password for naiden:      naiden
              [Krb5LoginModule] user entered username: naiden
    Acquire TGT using AS Exchange
              [Krb5LoginModule] authentication failed
    Pre-authentication information was invalid (24)
    Authentication attempt failedjavax.security.auth.login.LoginException:
    Java code is:
    import javax.naming.*;
    import javax.naming.directory.*;
    import javax.security.auth.login.*;
    import javax.security.auth.Subject;
    import com.sun.security.auth.callback.TextCallbackHandler;
    import java.util.Hashtable;
    /*
     * Demonstrates how to create an initial context to an LDAP server
     * using "GSSAPI" SASL authentication (Kerberos v5).
     * Requires J2SE 1.4, or JNDI 1.2 with ldapbp.jar, JAAS, JCE, an RFC 2853
     * compliant implementation of J-GSS and a Kerberos v5 implementation.
     *
     * Jaas.conf:
     * racfldap.GssExample {com.sun.security.auth.module.Krb5LoginModule required client=TRUE useTicketCache=true doNotPrompt=true; };
     *
     * 'qop' is a comma separated list of tokens, each of which is one of
     * auth, auth-int, or auth-conf. If none is supplied, the default is 'auth'.
     */
    class KerberosExample {
        public static void main(String[] args) {
            java.util.Properties p = new java.util.Properties(System.getProperties());
            p.setProperty("java.security.krb5.realm", "ISY");
            p.setProperty("java.security.krb5.kdc", "192.168.0.101");
            p.setProperty("java.security.auth.login.config", "C:\\jaas.conf");
            System.setProperties(p);
            // 1. Log in (to Kerberos)
            LoginContext lc = null;
            try {
                lc = new LoginContext("ISY", new TextCallbackHandler());
                // Attempt authentication
                lc.login();
            } catch (LoginException le) {
                System.err.println("Authentication attempt failed" + le);
                System.exit(-1);
            }
            // 2. Perform JNDI work as logged in subject
            Subject.doAs(lc.getSubject(), new LDAPAction(args));
        }
    }
    // 3. Perform LDAP Action
    /*
     * The application must supply a PrivilegedAction that is to be run
     * inside a Subject.doAs() or Subject.doAsPrivileged().
     */
    class LDAPAction implements java.security.PrivilegedAction {
        private String[] args;
        private static String[] sAttrIDs;
        private static String sUserAccount = new String("Administrator");
        public LDAPAction(String[] origArgs) {
            this.args = (String[]) origArgs.clone();
        }
        public Object run() {
            performLDAPOperation(args);
            return null;
        }
        private static void performLDAPOperation(String[] args) {
            // Set up environment for creating initial context
            Hashtable env = new Hashtable(11);
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            // Must use fully qualified hostname
            env.put(Context.PROVIDER_URL, "ldap://192.168.0.101:389/DC=isy,DC=local");
            // Request the use of the "GSSAPI" SASL mechanism
            // Authenticate by using already established Kerberos credentials
            env.put(Context.SECURITY_AUTHENTICATION, "GSSAPI");
            env.put("javax.security.sasl.server.authentication", "true");
            try {
                /* Create initial context */
                DirContext ctx = new InitialDirContext(env);
                /* Get the attributes requested */
                Attributes aAnswer = ctx.getAttributes("CN=" + sUserAccount + ",CN=Users,DC=isy,DC=local");
                NamingEnumeration enumUserInfo = aAnswer.getAll();
                while (enumUserInfo.hasMoreElements()) {
                    System.out.println(enumUserInfo.nextElement().toString());
                }
                // Close the context when we're done
                ctx.close();
            } catch (NamingException e) {
                e.printStackTrace();
            }
        }
    }
    JAAS conf file is:
    ISY {
         com.sun.security.auth.module.Krb5LoginModule required
    debug=true;
    };
    krb5.ini file is:
    # Kerberos 5 Configuration File
    # All available options are specified in the Kerberos System Administrator's Guide.  Very
    # few are used here.
    # Determines which Kerberos realm a machine should be in, given its domain name.  This is
    # especially important when obtaining AFS tokens - in afsdcell.ini in the Windows directory
    # there should be an entry for your AFS cell name, followed by a list of IP addresses, and,
    # after a # symbol, the name of the server corresponding to each IP address.
    [libdefaults]
         default_realm = ISY
    [domain_realm]
         .isy.local = ISY
         isy.local = ISY
    # Specifies all the server information for each realm.
    [realms]
         ISY = {
              kdc = 192.168.0.101
              admin_server = 192.168.0.101
              default_domain = ISY
         }

    Now it works.
    I will try to explain how I did this:
    step 1)
    follow this guide http://www.cit.cornell.edu/computer/system/win2000/kerberos/
    and configure AD to use Kerberos and to have a Kerberos REALM
    step 2) try a Windows login to the new realm to be sure that it works. Add a trusted realm if needed.
    step 3) create a jaas.conf file, for example in c:\
    it looks like this :
    ISY {
         com.sun.security.auth.module.Krb5LoginModule required
    debug=true;
    };
    step 4)
    (don't forget to make the mappings which are explained in step 1) Go to Active Directory Users, make sure Advanced Features is checked in the View menu, right-click on the user, go to the Kerberos mapping in the second tab, and add [email protected], for example [email protected]
    step 5)
    copy+paste this code and HIT RUN :)
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.NamingException;
import javax.naming.directory.Attributes;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;
import javax.security.auth.Subject;
import javax.security.auth.login.LoginContext;
import javax.security.auth.login.LoginException;
import com.sun.security.auth.callback.TextCallbackHandler;

public class Main {
    public static void main(String[] args) {
        java.util.Properties p = new java.util.Properties(System.getProperties());
        p.setProperty("java.security.krb5.realm", "ISY.LOCAL");
        p.setProperty("java.security.krb5.kdc", "192.168.0.101");
        p.setProperty("java.security.auth.login.config", "C:\\jaas.conf");
        System.setProperties(p);

        // 1. Log in (to Kerberos)
        LoginContext lc = null;
        try {
            lc = new LoginContext("ISY", new TextCallbackHandler());
            // Attempt authentication
            lc.login();
        } catch (LoginException le) {
            System.err.println("Authentication attempt failed: " + le);
            System.exit(-1);
        }

        // 2. Perform JNDI work as the logged-in subject
        Subject.doAs(lc.getSubject(), new LDAPAction(args));
    }
}

/*
 * 3. Perform the LDAP action.
 * The application must supply a PrivilegedAction that is to be run
 * inside a Subject.doAs() or Subject.doAsPrivileged().
 */
class LDAPAction implements java.security.PrivilegedAction {
    private String[] args;

    public LDAPAction(String[] origArgs) {
        this.args = origArgs.clone();
    }

    public Object run() {
        performLDAPOperation(args);
        return null;
    }

    private static void performLDAPOperation(String[] args) {
        // Set up environment for creating the initial context
        Hashtable env = new Hashtable(11);
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        // Must use fully qualified hostname
        env.put(Context.PROVIDER_URL, "ldap://192.168.0.101:389");
        // Request the "GSSAPI" SASL mechanism:
        // authenticate by using the already established Kerberos credentials
        env.put(Context.SECURITY_AUTHENTICATION, "GSSAPI");
        // env.put("javax.security.sasl.server.authentication", "true");
        try {
            // Create the initial context
            DirContext ctx = new InitialDirContext(env);
            // Create the search controls
            SearchControls searchCtls = new SearchControls();
            // Specify the attributes to return
            String[] returnedAtts = {"sn", "givenName", "mail"};
            searchCtls.setReturningAttributes(returnedAtts);
            // Specify the search scope
            searchCtls.setSearchScope(SearchControls.SUBTREE_SCOPE);
            // Specify the LDAP search filter
            String searchFilter = "(&(objectClass=user)(mail=*))";
            // Specify the base for the search
            String searchBase = "DC=isy,DC=local";
            // Counter to total the results
            int totalResults = 0;
            // Search for objects using the filter
            NamingEnumeration answer = ctx.search(searchBase, searchFilter, searchCtls);
            // Loop through the search results
            while (answer.hasMoreElements()) {
                SearchResult sr = (SearchResult) answer.next();
                totalResults++;
                System.out.println(">>>" + sr.getName());
                // Print out some of the attributes; catch the exception if an attribute has no value
                Attributes attrs = sr.getAttributes();
                if (attrs != null) {
                    try {
                        System.out.println("   surname: " + attrs.get("sn").get());
                        System.out.println("   firstname: " + attrs.get("givenName").get());
                        System.out.println("   mail: " + attrs.get("mail").get());
                    } catch (NullPointerException e) {
                        System.err.println("Error listing attributes: " + e);
                    }
                }
            }
            System.out.println("RABOTIII");
            System.out.println("Total results: " + totalResults);
            ctx.close();
        } catch (NamingException e) {
            e.printStackTrace();
        }
    }
}
It will ask for a username and password.
    Type, for example, [email protected] for the username
    and TheSecretPassword for the password,
    where ISY.LOCAL is the name of the Kerberos realm.
    P.S. It is not a good idea to use Administrator as the login :)
    Edited by: JOKe on Sep 14, 2007 2:23 PM

  • LSMW with IDOC Message type COND_A and Basic type COND_A01

    Hi Sap All.
    In my project we are using LSMW with IDOC message type COND_A and basic type COND_A01, and the requirement now is to know the list of tables that will be updated when I do the LSMW migration with this IDOC basic type.
    I have tried to find the list of updated tables by going into transaction WE30 and looking at the segments E1KOMG, E1KONH, E1KONP, E1KONM and E1KONW, and I found that the following tables get updated when I populate data into IDOC message type COND_A with basic type COND_A01:
    KOMG, KONH, KONP, KONM, KONW.
    Please correct me if I am wrong.
    regards.
    Varma

    Hi Varma,
    The tables mentioned by you definitely get updated; I guess you can add KONV to the list too. But to be 100% sure, enable an SQL trace and process an IDOC. Then you can look for Insert/Modify/Update statements to get a list of the tables that get updated.
    Regards,
    Chen

  • Using native sql for update

    Hello ,
    I have a request to update a DB table declared "outside" our R/3 DB.
    I managed to select the data using native SQL with a connection to the DB.
    Now I need to modify the data on the DB.
    Is there a command similar to "fetch next", for update?
    Maybe I need to build a procedure in the "host" DB and use its own commands to update?
    Thanks,
    koby

    Hello Kobi,
    Which release of SAP are you working on?
    If you're on ECC 6.0, instead of using Native SQL to call the stored procs of external DBs you can use the [ADBC APIs |http://help.sap.com/abapdocu_702/en/abenadbc_procedure.htm] (CL_SQL* classes).
    BR,
    Suhas

  • Performance issues when using AQ notification with one consumer

    We have developed a system to load data from a reservation database into a reporting database.
    At a certain point in the process, a message with the identifier of the reservation is enqueued to a (multi-consumer) queue on the same DB and then propagated to a similar queue on the REP database.
    This queue (multi-consumer) has AQ notification enabled (with one consumer) which calls the queue_callback procedure which
    - dequeues the message
    - calls a procedure to load the Resv data into the Reporting schema (through DB link)
    We need each message to be processed ONLY ONCE, hence the use of one single subscriber (consumer).
    But when load testing our application with multiple threads, the number of records created in the Reservation Database becomes quite large, meaning a large number of messages going through the first queue and propagating to the second queue (very quickly).
    But messages are not processed fast enough by the 2nd queue (notification) which falls behind.
    I would like to keep using notification as processing is automatic (no need to set up dbms_jobs to dequeue etc..) or something similar
    So having read articles, I feel I need to use:
    - multiple subscribers on the 2nd queue, where each message is processed by only one subscriber (using a rule: say 10 subscribers S0 to S9, with Si processing messages whose identifier ends in the digit i).
    The problem with this is that there is an attempt to process the message for each subscriber, isn't there?
    - a different dequeuing method where many processes are used in parallel, with each message processed by only one subscriber.
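    Something like this is what I have in mind for the rule-based option (just a sketch; the queue and subscriber names are made up, and I am assuming the identifier in the RESV_DETAIL payload is numeric — rule conditions reference the payload through the predefined alias tab.user_data):

    ```sql
    -- Hypothetical: 10 subscribers S0..S9, each taking messages whose
    -- identifier ends in a given digit.
    BEGIN
       FOR i IN 0 .. 9 LOOP
          DBMS_AQADM.ADD_SUBSCRIBER(
             queue_name => 'DLEX_AQ_ADMIN.REPORT_QUEUE',
             subscriber => SYS.AQ$_AGENT('S' || i, NULL, NULL),
             rule       => 'MOD(tab.user_data.resv_code, 10) = ' || i);
       END LOOP;
    END;
    /
    ```

    As far as I understand, the rule is evaluated once per subscriber when the message arrives, but each message would then match exactly one subscriber, so it would only be dequeued once.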
    Does anyone have experience and recommendations to make on how to improve throughput of messages?
    Rgds
    Philippe

    Hi, thanks for your interest.
    I am working with 10.2.0.4.
    My objective is to load a subset of the reservation data from the tables in the first DB (Reservation - OLTP - 150 tables)
    to the tables in the second DB (Reporting - about 15 tables at the moment), without affecting performance on the Reservation DB.
    Thus the choice of Advanced Queuing (asynchronous).
    - I have 2 similar queues in 2 separate databases (Reservation and Reporting).
    The message payload is the same on both (the identifier of the reservation)
    When a certain event happens on the RESERVATION database, I enqueue a message on the first database
    Propagation moves the same message data to the second queue.
    And there I have notification sending the message to a single consumer, which:
    - calls dequeue
    - and the data load procedure, which load this reservation
    My performance difficulties start at the notification but I will post all the relevant code before notification, in case it has an impact.
    - The 2nd queue was created with a script containing the following (similar script for the first queue):
    dbms_aqadm.create_queue_table( queue_table        => '&&CQT_QUEUE_TABLE_NAME',
                                   queue_payload_type => 'RESV_DETAIL',
                                   comment            => 'Report queue table',
                                   multiple_consumers => TRUE,
                                   message_grouping   => DBMS_AQADM.NONE,
                                   compatible         => '10.0.0',
                                   sort_list          => 'ENQ_TIME',
                                   primary_instance   => 0,
                                   secondary_instance => 0);
    dbms_aqadm.create_queue( queue_name  => '&&CRQ_QUEUE_NAME',
                             queue_table => '&&CRQ_QUEUE_TABLE_NAME',
                             max_retries => 5);
    - ENQUEUING on the first queue (snippet of code)
    o_resv_detail DLEX_AQ_ADMIN.RESV_DETAIL;
    o_resv_detail:= DLEX_AQ_ADMIN.RESV_DETAIL(resvcode, resvhistorysequence);
    DLEX_RESVEVENT_AQ.enqueue_one_message (o_resv_detail);
    where DLEX_RESVEVENT_AQ.enqueue_one_message is :
    PROCEDURE enqueue_one_message (msg IN RESV_DETAIL)
    IS
       enqopt    DBMS_AQ.enqueue_options_t;
       mprop     DBMS_AQ.message_properties_t;
       enq_msgid dlex_resvevent_aq_admin.msgid_t;
    BEGIN
       DBMS_AQ.enqueue (queue_name         => dlex_resvevent_aq_admin.c_resvevent_queue,
                        enqueue_options    => enqopt,
                        message_properties => mprop,
                        payload            => msg,
                        msgid              => enq_msgid);
    END;
    - PROPAGATION: The message is dequeued from 1st queue and enqueued automatically by AQ propagation into this 2nd queue.
    (using a call to the following 'wrapper' procedure)
    PROCEDURE schedule_propagate (
       src_queue_name IN VARCHAR2,
       destination    IN VARCHAR2 DEFAULT NULL)
    IS
       sprocname dlex_types.procname_t := 'dlex_resvevent_aq_admin.schedule_propagate';
    BEGIN
       DBMS_AQADM.SCHEDULE_PROPAGATION(queue_name  => src_queue_name,
                                       destination => destination,
                                       latency     => 10);
    EXCEPTION
       WHEN OTHERS
       THEN
          DBMS_OUTPUT.put_line (SQLERRM || ' occurred in ' || sprocname);
    END schedule_propagate;
    - For 'NOTIFICATION': ONE subscriber was created using:
    EXECUTE DLEX_REPORT_AQ_ADMIN.add_subscriber('&&STQ_QUEUE_NAME','&&STQ_SUBSCRIBER',NULL,NULL, NULL);
    this is a wrapper procedure that uses:
    DBMS_AQADM.add_subscriber (queue_name => p_queue_name, subscriber => subscriber_agent );
    Then notification is registered with:
    EXECUTE dlex_report_aq_admin.register_notification_action ('&&AQ_SCHEMA','&&REPORT_QUEUE_NAME','&&REPORT_QUEUE_SUBSCRIBER');
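    The register wrapper essentially boils down to something like this (a sketch with placeholder names; the callback is addressed with the plsql:// scheme):

    ```sql
    -- Sketch: register the PL/SQL callback for one queue/subscriber pair
    DECLARE
       r_info SYS.AQ$_REG_INFO;
    BEGIN
       r_info := SYS.AQ$_REG_INFO(
          'DLEX_AQ_ADMIN.REPORT_QUEUE:REPORT_SUBSCRIBER',  -- "queue:consumer"
          DBMS_AQ.NAMESPACE_AQ,
          'plsql://DLEX_AQ_ADMIN.QUEUE_CALLBACK',          -- callback procedure
          HEXTORAW('FF'));                                 -- context passed to the callback
       DBMS_AQ.REGISTER(SYS.AQ$_REG_INFO_LIST(r_info), 1);
    END;
    /
    ```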
    - job_queue_processes is set to 10
    - The callback procedure is as follows
    CREATE OR REPLACE PROCEDURE DLEX_AQ_ADMIN.queue_callback (
       context  RAW,
       reginfo  SYS.AQ$_REG_INFO,
       descr    SYS.AQ$_DESCRIPTOR,
       payload  RAW,
       payloadl NUMBER)
    IS
       s_procname CONSTANT VARCHAR2 (40) := UPPER ('queue_callback');
       r_dequeue_options    DBMS_AQ.DEQUEUE_OPTIONS_T;
       r_message_properties DBMS_AQ.MESSAGE_PROPERTIES_T;
       v_message_handle     RAW(16);
       o_payload            RESV_DETAIL;
    BEGIN
       r_dequeue_options.msgid := descr.msg_id;
       r_dequeue_options.consumer_name := descr.consumer_name;
       DBMS_AQ.DEQUEUE(
          queue_name         => descr.queue_name,
          dequeue_options    => r_dequeue_options,
          message_properties => r_message_properties,
          payload            => o_payload,
          msgid              => v_message_handle);
       -- Call procedure to load data from the reservation database to the Reporting DB through the DB link
       dlex_report.dlex_data_load.load_reservation
          ( in_resvcode            => o_payload.resv_code,
            in_resvHistorySequence => o_payload.resv_history_sequence );
       COMMIT;
    END queue_callback;
    - I noticed that messages are not taken out of the 2nd queue.
    I guess I would need to use the REMOVE option to delete messages from the queue?
    Would this be a large source of performance degradation after just a few thousand messages?
    - The data load through the DB link may be a little intensive, but I feel that doing things in parallel would help.
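    For what it's worth, this is what I mean by the REMOVE option in the callback (a sketch; as far as I can tell REMOVE is the documented default dequeue_mode, and dequeued messages can still sit in the queue table until the retention_time set on the queue has passed):

    ```sql
    -- Sketch: explicitly request destructive dequeue in the callback
    r_dequeue_options.msgid         := descr.msg_id;
    r_dequeue_options.consumer_name := descr.consumer_name;
    r_dequeue_options.dequeue_mode  := DBMS_AQ.REMOVE;  -- BROWSE/LOCKED would leave the message in place
    ```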
    I would like to understand if Oracle has a way of dequeuing in parallel (with or without the use of notification)
    In the case of multiple subscribers with notification, does the 'job_queue_processes' value have an impact on the degree of parallelism? If not, what setting does?
    And is there a way supplied by Oracle to make the queue notify only one subscriber per message?
    Your advice would be very much appreciated
    Philippe
    Edited by: user528100 on Feb 23, 2009 8:14 AM

  • Native B4 with Logic Pro question

    Hi !
    I'm using Native B4 with an M-Audio Keystation 88es.
    My question is... is there any way I can channel a ride cymbal to play along with the B4 in the lower keyboard split? (In other words I have one keyboard. I would be walking lower manual bass in my left hand, but a ride would be playing along with it, similar to a bass/ride split on a piano.)
    Can someone explain this step by step or refer me to the page in the manual?
    Thx !

    The environment is a playpen for this kind of stuff - its MIDI processing facilities beat anything else out there.
    In the environment, create a new instrument - call it "Split". Assign it to the current track in the arrange page by clicking on it with the "MIDI Thru" tool (or just select it on an arrange track in the usual way).
    This instrument is the instrument that we play. Now, create two new instruments (called "Left" and "Right"), and cable the "Split" instrument to the "Left" one. A second output triangle will appear on "Split", so cable that second output to the "Right" instrument. This way, the notes you play get sent to both of these two new instruments.
    "Left" is going to be our left keyboard split. Adjust the low note and high note values to cover the keyboard range you want the organ to cover.
    Similarly, adjust the note range of "Right" so its lowest note starts at the highest note set on "Left".
    Lastly, cable the output of "Left" to the audio instrument holding your organ, and the output of "Right" to your drum thing. You can adjust the transpose values on either of "Left" and "Right" to play the desired range.
    You've just used the environment to set up a simple keyboard split across two virtual instruments.
    In the same way, you can set up multi-keyboard splits, layers with crossfading, velocity splits and many other cool stuff across multiple instruments.

  • Using PPDS PDS with CTM

    Hi
    I am trying to use CTM with a PPDS PDS. When I create the master data structure, it is not able to pick up the successive source of supply.
    i.e. suppose Product A (in-house production) has a component B (externally procured); the T-lane for B exists from the plant to the vendor. The PDS for A is now added to the MDS; when we try to generate the MDS, it does not pick up the downstream sources of supply and gives the error "Could not find source of supply for B".
    The BADi for using PPDS PDS with CTM has been implemented already but the problem still exists.
    If I use PPDS PPM i do not get this problem.
    Kindly help.
    Regards
    Manoj

    Hi Keiji
    I saw this message some time back on this thread, then I don't know what happened to it:
    Subject: Re: Using PPDS PDS with CTM
    Message: Hi Manoj
    Does the following comment mean you have already implemented BADI /SAPAPO/CURTO_CREATE, method CREATE_CTM_PDS?
    >The BADi for using PPDS PDS with CTM has been implemented already but the problem still exists.
    From your comment, you are trying to use a normal PPDS PDS. But since CTM requires a CTM PDS, if you do not have one, please create a CTM PDS by using the above BADI during the CIF transfer of your PDS.
    If the above does not apply, do you see any error message in the CTM master data check?
    best regards
    Keiji
    The CTM PDS have been created; that's what I meant by the BADi implementation. The problem still exists: the CTM master data check says that the source of supply cannot be found. I replicated the same model with a PPM and it worked.
    Thanks
    Manoj
    Edited by: Manoj Narain on Jan 5, 2009 3:33 PM

  • Lightroom has crashed with error message 'error reading from its preview cache' - this happens every time I try to launch it. I have looked at a forum to repair the fault. There are different recommendations. Can you tell me step by step how to rectify this pro

    Lightroom has crashed with the error message 'error reading from its preview cache' - this happens every time I try to launch it. I have looked at a forum to repair the fault; there are different recommendations. Can you tell me step by step how to rectify this problem? I cannot find the .lrcat or .lrdata files etc. in the program files. Thanks

    robmac76 wrote:
    can you tell me step by step how to rectify this problem.
    Yes:
    Step 1: Delete the preview cache.
    That's it!
    So, the trick is to find the preview cache.
    This document gives location:
    http://helpx.adobe.com/lightroom/kb/preference-file-locations-lightroom-41.html
    To be clear: it's a folder in with your catalog and you want to delete the whole folder but nothing else.
    If your catalog file path is "X:\My Catalogs\My Catalog.lrcat", then the preview cache "folder" will be:
    X:\My Catalogs\My Catalog Previews.lrdata
    So the trick is to find your catalog file, and since Lr won't open your catalog and run, you can't use it to find out.
    But you should be able to find it by searching your disks for *.lrcat files - do you know how to do that?
    Which OS?
    If you know the name of the catalog it will help, but also: it's probably NOT in with your program files.
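    On Windows, a Command Prompt search like `dir C:\*.lrcat /s /b` will list every catalog on the C: drive. On a Mac, the equivalent in Terminal is a find command (a sketch, searching your home folder and everything below it):

    ```shell
    # List every Lightroom catalog file under your home folder
    find ~ -name "*.lrcat" 2>/dev/null
    ```

    The preview cache folder will be sitting right next to whichever .lrcat file turns up.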
    UPDATE: The above-mentioned document gives default name and location for preview cache, so if you haven't overridden the default, it should be right where that document said it would be - is it? (of course you need to know your user name too, but presumably you do, no?).
    R
