How to avoid performance problems in PL/SQL?

To the best of my knowledge, the points below help avoid performance problems in PL/SQL.
Are there other points to consider?
1. Use BULK COLLECT and FORALL instead of row-by-row cursor loops, to reduce the number of context switches between the SQL and PL/SQL engines.
2. EXECUTE IMMEDIATE is generally faster than DBMS_SQL for simple dynamic SQL.
3. Use NOCOPY for OUT and IN OUT parameters if the original value need not be retained; the overhead of keeping a copy of the actual parameter is avoided.
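Point 1 can be sketched as follows. This is a minimal illustration, not code from the thread; the table big_table and its columns are hypothetical:

```sql
DECLARE
  -- Hypothetical table/columns, for illustration only
  TYPE t_ids IS TABLE OF big_table.id%TYPE;
  l_ids t_ids;
BEGIN
  -- One context switch to fetch all matching rows at once
  SELECT id BULK COLLECT INTO l_ids
    FROM big_table
   WHERE status = 'NEW';

  -- One bulk-bound statement instead of one UPDATE per loop iteration
  FORALL i IN 1 .. l_ids.COUNT
    UPDATE big_table
       SET status = 'PROCESSED'
     WHERE id = l_ids(i);
END;
/
```

Note that when the whole job can be expressed as a single SQL statement (here, one UPDATE with the WHERE clause), plain SQL is usually faster still.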

Susil Kumar Nagarajan wrote:
> 1. Group a number of functions or procedures into a PACKAGE
Putting related functions and procedures into packages is useful from a code organization standpoint. It has nothing whatsoever to do with performance.
> 2. It is good to use collections in place of cursors that do DML operations on a large set of records
But using plain SQL is more efficient than using PL/SQL with bulk collects.
> 4. Optimize SQL statements if they need it
> -> Avoid using IN and NOT IN conditions, as those cause full table scans in queries
That is not true.
> -> See that queries use indexes properly; sometimes the leading index column is missed out, which causes performance overhead
Assuming "properly" implies that it is entirely possible that a table scan is more efficient than using an index.
> 5. Use Oracle HINTS if a query can't be further tuned; hints can help you considerably
Hints should be used only as a last resort. It is almost certainly the case that if you can use a hint that forces a particular plan to improve performance, there is some problem in the underlying statistics that should be fixed, which would resolve issues with many queries rather than just the one you're looking at.
Justin
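The point about hints can be illustrated with a sketch; the table, index, and column names are made up for the example:

```sql
-- Last resort: force a particular plan with a hint
SELECT /*+ INDEX(e emp_name_ix) */ e.*
  FROM employees e
 WHERE e.last_name = 'Smith';

-- Usually the better fix: refresh the statistics the optimizer relies on,
-- which improves plans for every query against the table, not just this one.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'EMPLOYEES');
END;
/
```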

Similar Messages

  • What are the best coding techniques which will avoid performance problems

    Hi Experts,
    What are the best coding techniques for avoiding memory and performance problems?
    Sometimes a few reports take too much time to execute while handling large data.
    1. What is the best way to declare an internal table to avoid performance problems?
    2. What is the best way to process data?
    3. What is the best way to clear memory?
    Can you give some good suggestions for writing better programs that avoid performance problems?
    Thanks
    Sailu

    Hi,
    Check this link: "Please Read before Posting in the Performance and Tuning Forum", which will be the first thread in the Performance and Tuning Forum.
    Search the SCN first.

  • CONNECT BY, Performance problems with hierarchical SQL

    Hello,
    I have a performance problem with the following SQL:
    table name: testtable
    columns: colA, colB, colC, colD, colE, colF
    Following hierarchical SQL:
    SELECT colA||colB||colC AS A1, colD||colE||colF AS B1, colA, colB, colC, level
    FROM testtable
    CONNECT BY PRIOR A1 = B1
    START WITH A1 = 'aaa||bbbb||cccc'
    With big tables the performance of this construct is very bad. I can't use functional based indexes to create an Index on "colA||colB||colC" and "colD||colE||colF"
    Has anyone an idea how I can get really better performance for this hierarchical construct when I have to combine multiple columns? Or is there a better way (with PL/SQL or a view/trigger) to solve something like this?
    Thanks in advance for your investigation :)
    Carmen

    Why not
    CONNECT BY PRIOR colA = colD
    and PRIOR colB = colE
    and ...
    ? It is not the same thing, but I suspect my version is correct:-)
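    Written out against the poster's testtable, the suggestion looks like the sketch below (assuming the literal 'aaa||bbbb||cccc' in the original was meant to denote the three component values):

    ```sql
    SELECT colA || colB || colC AS a1,
           colD || colE || colF AS b1,
           colA, colB, colC, LEVEL
      FROM testtable
     START WITH colA = 'aaa' AND colB = 'bbbb' AND colC = 'cccc'
     CONNECT BY PRIOR colA = colD
            AND PRIOR colB = colE
            AND PRIOR colC = colF;
    ```

    Comparing the columns separately also means an ordinary composite index on (colD, colE, colF) can be used, whereas the concatenated form would need a function-based index.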

  • How can I avoid the problem of Parameter Prompting when submitting?

    I am developing a web application in Visual Studio 2008 in C#. How can I avoid the parameter-prompting problem when I send parameters programmatically or dynamically? I am sending the values from a .NET web form to a Crystal Report, but it still asks for parameters, so the report is only generated when I submit a second time. How can I solve this problem? Please help. The code I am using is below.
    using System;
    using System.Collections;
    using System.Configuration;
    using System.Data;
    using System.Linq;
    using System.Web;
    using System.Web.Security;
    using System.Web.UI;
    using System.Web.UI.HtmlControls;
    using System.Web.UI.WebControls;
    using System.Web.UI.WebControls.WebParts;
    using System.Xml.Linq;
    using System.Data.OleDb;
    using System.Data.OracleClient;
    using CrystalDecisions.Shared;
    using CrystalDecisions.CrystalReports.Engine;
    using CrystalDecisions.Web;

    public partial class OracleReport : System.Web.UI.Page
    {
        CrystalReportViewer crViewer = new CrystalReportViewer();
        //CrystalReportSource crsource = new CrystalReportSource();
        int nItemId;

        protected void Page_Load(object sender, EventArgs e)
        {
            // Database connection
            ConnectionInfo ConnInfo = new ConnectionInfo();
            {
                ConnInfo.ServerName = "127.0.0.1";
                ConnInfo.DatabaseName = "Xcodf";
                ConnInfo.UserID = "HR777";
                ConnInfo.Password = "zghshshs";
            }
            // Apply the logon parameters to each table
            foreach (TableLogOnInfo cnInfo in this.CrystalReportViewer1.LogOnInfo)
            {
                cnInfo.ConnectionInfo = ConnInfo;
            }

            // Declaring variables
            nItemId = int.Parse(Request.QueryString.Get("ItemId"));
            //string strStartDate = Request.QueryString.Get("StartDate");
            //int nItemId = 20;
            string strStartDate = "23-JUL-2010";

            // Object declarations
            CrystalDecisions.CrystalReports.Engine.Database crDatabase;
            CrystalDecisions.CrystalReports.Engine.Table crTable;

            TableLogOnInfo dbConn = new TableLogOnInfo();

            // New report document object
            ReportDocument oRpt = new ReportDocument();

            // Loading the ItemReport into the report document
            oRpt.Load(@"C:\Inetpub\wwwroot\cryreport\CrystalReport1.rpt");

            // Getting the database, the table and the LogOnInfo object which holds logon information
            crDatabase = oRpt.Database;

            // Getting the table in an object array of one item
            object[] arrTables = new object[1];
            crDatabase.Tables.CopyTo(arrTables, 0);

            // Assigning the first item of the array to crTable by downcasting the object to Table
            crTable = (CrystalDecisions.CrystalReports.Engine.Table)arrTables[0];

            dbConn = crTable.LogOnInfo;

            // Setting values
            dbConn.ConnectionInfo.DatabaseName = "Xcodf";
            dbConn.ConnectionInfo.ServerName = "127.0.0.1";
            dbConn.ConnectionInfo.UserID = "HR777";
            dbConn.ConnectionInfo.Password = "zghshshs";

            // Applying login info to the table object
            crTable.ApplyLogOnInfo(dbConn);

            crViewer.RefreshReport();

            // Defining the report source
            crViewer.ReportSource = oRpt;
            //CrystalReportSource1.Report = oRpt;

            // So up to now we have created everything.
            // What remains is to pass parameters to our report, so it
            // shows only selected records; calling a method to set
            // those parameters.
            setReportParameters();
        }

        private void setReportParameters()
        {
            // All the parameter fields will be added to this collection
            ParameterFields paramFields = new ParameterFields();
            //ParameterFieldDefinitions ParaLocationContainer = new ParameterFieldDefinitions();
            //ParameterFieldDefinition ParaLocation = new ParameterFieldDefinition();

            // The parameter fields to be sent to the report
            ParameterField pfItemId = new ParameterField();
            //ParameterField pfStartDate = new ParameterField();
            //ParameterField pfEndDate = new ParameterField();

            // Setting the name of the parameter fields with which they will be received in the report
            pfItemId.ParameterFieldName = "RegionID";
            //pfStartDate.ParameterFieldName = "StartDate";
            //pfEndDate.ParameterFieldName = "EndDate";

            // The above declared parameter fields accept values as discrete objects,
            // so declaring discrete objects
            ParameterDiscreteValue dcItemId = new ParameterDiscreteValue();
            //ParameterDiscreteValue dcStartDate = new ParameterDiscreteValue();
            //ParameterDiscreteValue dcEndDate = new ParameterDiscreteValue();

            // Setting the values of the discrete objects
            dcItemId.Value = nItemId;
            //dcStartDate.Value = DateTime.Parse(strStartDate);
            //dcEndDate.Value = DateTime.Parse(strEndDate);

            // Now adding these discrete values to the parameters
            //paramField.HasCurrentValue = true;
            //pfItemId.CurrentValues.Clear();
            int valueIDD = int.Parse(Request.QueryString.Get("ItemId").ToString());
            pfItemId.Name = valueIDD.ToString();

            pfItemId.CurrentValues.Add(dcItemId);
            //ParaLocation.ApplyCurrentValues;
            pfItemId.HasCurrentValue = true;
            //pfStartDate.CurrentValues.Add(dcStartDate);
            //pfEndDate.CurrentValues.Add(dcEndDate);

            // Now adding all these parameter fields to the parameter collection
            paramFields.Add(pfItemId);
            //paramFields.Add(pfStartDate);
            //paramFields.Add(pfEndDate);

            // Formula from Crystal
            //crViewer.SelectionFormula = "{COUNTRIES.REGION_ID} = " + int.Parse(Request.QueryString.Get("ItemId")) + "";
            crViewer.RefreshReport();
            // Finally add the parameter collection to the crystal report viewer
            crViewer.ParameterFieldInfo = paramFields;
        }
    }

    Keep your post to under 1200 characters, else you lose the formatting (you can do two posts if need be).
    Re. parameters: first, make sure you have SP1 for CR 10.5:
    https://smpdl.sap-ag.de/~sapidp/012002523100009351512008E/crbasic2008sp1.exe
    Next, see the following:
    [Crystal Reports for Visual Studio 2005 Walkthroughs|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2081b4d9-6864-2b10-f49d-918baefc7a23]
    CR Dev help file:
    http://msdn2.microsoft.com/en-us/library/bb126227.aspx
    Samples:
    https://wiki.sdn.sap.com/wiki/display/BOBJ/CrystalReportsfor.NETSDK+Samples
    Ludek
    Follow us on Twitter http://twitter.com/SAPCRNetSup

  • How to avoid synchronization problem in a  JSP webpage

    Hi,
    In my web page I am facing a problem: when two users are working at the same time, one user's page gets reflected in the other's page.
    Can anyone tell me how to avoid this synchronization problem?
    Thanks a lot


  • CUP: How to determine Performance Problem

    Hi All,
    A user has complained that he was getting slow responses from the GRC server (from CUP, as he was accessing that application). However, other users were able to perform their actions without performance problems. May I know the best way to find the response times for all users on a specific day? I looked into NWA -> Java Monitoring -> Users. However, somehow I am finding information for different dates. I put my desired dates in the "Custom" time period, but still I am not getting the details for these dates. Can anyone help me in finding these details?
    Regards,
    Faisal

    Faisal,
    If you have Solution Manager installed in your landscape, you can connect GRC as a managed system. If you configure Wily Introscope, you have the option to view various graphs regarding the performance of your GRC modules. Below I upload a picture to clarify what it looks like:
    http://imageshack.us/photo/my-images/526/wilyo.jpg
    I know you are probably looking for something more straightforward, but if you have to face these problems often, Wily could be a good tool for you.
    Regards,
    Diego.

  • How to diagnose performance problems?

    Hi all,
    I'm trying to run some basic performance tests of our app with Coherence, and I'm getting some pretty miserable results. I'm obviously missing something very basic about the configuration, but I can't figure out what it is.
    For our test app, I'm using 3 machines, all Mac Pros running OSX 10.5. Each JVM is configured with 1G RAM and is running in server mode. We're connected on a gigabit LAN (I ran the datagram test, we get about 108Mb/sec but with high packet loss, around 10-12%). Two of the nodes are storage enabled, the other runs as a client performing an approximation of one of the operations we perform routinely.
    The test consists of running a single operation many times. We're developing a card processing system, so I create a bunch of cards (10,000 by default), each of which has about 10 related objects in different caches. The objects are serialised using POF. I write all these to the caches, then perform a single operation for each card. The operation consists of about 14 reads and 6 writes. I can do this about 6-6.5 times per second, for a total of ~120 Coherence ops/sec, which seems extremely low.
    During the test, the network tops out at about 1.2M/s, and the CPU at about 35% on the storage nodes and almost unnoticeable on the client. There's clearly something blocking somewhere, but I can't find what it is. The client uses a thread pool of 100 threads and has 100 threads assigned to its distributed service, and the storage nodes also have 100 threads assigned to their distributed services. I also tried giving more threads to the invocation service (since we use VersionedPut a lot), but that didn't seem to help either. I've also created the required indexes.
    Any help in diagnosing what the problem would be greatly appreciated. I've attached the config used for the two types of node below.
    Cheers,
    Colin
    Config for storage nodes:
    -Dtangosol.coherence.wka=10.1.1.2
    -Dtangosol.coherence.override=%HOME%/coherence.override.xml
    -Dtangosol.coherence.distributed.localstorage=true
    -Dtangosol.coherence.wkaport=8088
    -Dtangosol.coherence.distributed.threads=100
    <cache-config>
    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>*</cache-name>
    <scheme-name>distributed-scheme</scheme-name>
    </cache-mapping>
    </caching-scheme-mapping>
    <caching-schemes>
    <distributed-scheme>
    <scheme-name>distributed-scheme</scheme-name>
    <service-name>DistributedCache</service-name>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>distributed-backing-map</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
    <serializer>
    <class-name>runtime.util.protobuf.ProtobufPofContext</class-name>
    </serializer>
    <backup-count>1</backup-count>
    </distributed-scheme>
    <local-scheme>
    <scheme-name>distributed-backing-map</scheme-name>
    <listener>
    <class-scheme>
    <class-name>service.data.manager.coherence.impl.utilfile.CoherenceBackingMapListener</class-name>
    <init-params>
    <init-param>
    <param-type>com.tangosol.net.BackingMapManagerContext</param-type>
    <param-value>{manager-context}</param-value>
    </init-param>
    <init-param>
    <param-type>java.lang.String</param-type>
    <param-value>{cache-name}</param-value>
    </init-param>
    </init-params>
    </class-scheme>
    </listener>
    </local-scheme>
    <invocation-scheme>
    <service-name>InvocationService</service-name>
    <autostart>true</autostart>
    </invocation-scheme>
    </caching-schemes>
    </cache-config>
    Config for client node:
    -Dtangosol.coherence.wka=10.1.1.2
    -Dtangosol.coherence.wkaport=8088
    -Dtangosol.coherence.distributed.localstorage=false
    <cache-config>
    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>*</cache-name>
    <scheme-name>distributed-scheme</scheme-name>
    </cache-mapping>
    </caching-scheme-mapping>
    <caching-schemes>
    <distributed-scheme>
    <scheme-name>distributed-scheme</scheme-name>
    <service-name>DistributedCache</service-name>
    <thread-count>100</thread-count>
    <backing-map-scheme>
    <local-scheme>
    <scheme-ref>distributed-backing-map</scheme-ref>
    </local-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
    <backup-count>1</backup-count>
    </distributed-scheme>
    <local-scheme>
    <scheme-name>distributed-backing-map</scheme-name>
    </local-scheme>
    <invocation-scheme>
    <service-name>InvocationService</service-name>
    <autostart>true</autostart>
    </invocation-scheme>
    </caching-schemes>
    </cache-config>

    Hi David,
    Thanks for the quick response. I'm currently using an executor with 100 threads on the client. I thought that might not be enough since the CPU on the client hardly moves at all, but bumping that up to 1000 didn't change things. If I run the same code with a single Coherence instance in my local machine I get around 110/sec, so I suspect that somewhere in the transport layer something doesn't have enough threads.
    The code is somewhat complicated because we have an abstraction layer on top of Coherence. I'll try to post the relevant parts below:
    The main loop is pretty straightforward. This is a Runnable that I post to the client executor:
    public void run()
    {
        Card card = read(dataManager, Card.class, "cardNumber", cardNumber);
        assert card != null;
        Group group = read(dataManager, Group.class, "id", card.getGroupId());
        assert group != null;
        Account account = read(dataManager, Account.class, "groupId", group.getId());
        assert account != null;
        User user = read(dataManager, User.class, "groupId", group.getId());
        assert user != null;
        ClientUser clientUser = read(dataManager, ClientUser.class, "userId", user.getId());
        HoldLog holdLog = createHoldLog(group, account);
        account.getCurrentHolds().add(holdLog);
        account.setUpdated(now());
        write(dataManager, HoldLog.class, holdLog);
        ClientUser clientUser2 = read(dataManager, ClientUser.class, "userId", user.getId());
        write(dataManager, Account.class, account);
        NetworkMessage networkMessage = createNetworkMessage(message, card, group, holdLog);
        write(dataManager, NetworkMessage.class, networkMessage);
    }
    read does a bit of juggling with our abstraction layer, basically this just gets a ValueExtractor for a named field and then creates an EqualsFilter with it:
    private static <V extends Identifiable> V read(DataManager dataManager,
                                                   Class<V> valueClass,
                                                   String keyField,
                                                   Object keyValue)
    {
        DataMap<Identifiable> dataMap = dataManager.getDataMap(valueClass.getSimpleName(), valueClass);
        FilterFactory filterFactory = dataManager.getFilterFactory();
        ValueExtractorFactory extractorFactory = dataManager.getValueExtractorFactory();
        Filter<Identifiable> filter = filterFactory.equals(extractorFactory.fieldExtractor(valueClass, keyField), keyValue);
        Set<Map.Entry<GUID, Identifiable>> entries = dataMap.entrySet(filter);
        if (entries.isEmpty())
            return null;
        assert entries.size() == 1 : "Expected single entry, found " + entries.size() + " for " + valueClass.getSimpleName();
        return valueClass.cast(entries.iterator().next().getValue());
    }
    write is trivial:
    private static <V extends Identifiable> void write(DataManager dataManager, Class<V> valueClass, V value)
    {
        DataMap<Identifiable> dataMap = dataManager.getDataMap(valueClass.getSimpleName(), valueClass);
        dataMap.put(value.getId(), value);
    }
    And entrySet directly wraps the Coherence call:
    public Set<Entry<GUID, V>> entrySet(Filter<V> filter)
    {
        validateFilter(filter);
        return wrapEntrySet(dataMap.entrySet(((CoherenceFilterAdapter<V>) filter).getCoherenceFilter()));
    }
    This just returns a Map.EntrySet implementation that decodes the binary encoded objects.
    I'm not really sure how much more code to post - what I have is really just a thin layer over Coherence, and I'm reasonably sure that the main problem isn't there since the CPU hardly moves on the client. I suspect thread starvation at some point, but I'm not sure where to look.
    Thanks,
    Colin

  • How to avoid this problem "tablespace 'USERS' does not exist"

    I don't want to create all source tablespaces in the target.
    I have given the UNLIMITED TABLESPACE privilege to the user. Still I get this error:
    IMP-00003: ORACLE error 959 encountered
    ORA-00959: tablespace 'USERS' does not exist

    The situation, as I understand it, is as follows:
    1. An original export file contains references to the USERS tablespace
    2. You want to import it, but you don't want to create the USERS tablespace
    3. You created a user and assigned USERS as its default tablespace, even though that tablespace is not created and you don't want to create it.
    If you assign a default tablespace to a user, first make sure the tablespace exists: when the import is performed and a tablespace referenced in the export file doesn't exist, import falls back to the user's default tablespace, in this case the (nonexistent) USERS tablespace.
    So, if you want to avoid this behaviour, assign the user an existing tablespace when defining its default tablespace.
    Once corrected, retry the import.
    ~ Madrid
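    A minimal sketch of the fix described above; the user and tablespace names are illustrative, not from the thread:

    ```sql
    -- Give the importing user a default tablespace that actually exists,
    -- so the import can fall back to it for objects whose original
    -- tablespace (USERS) is missing.
    ALTER USER imp_user DEFAULT TABLESPACE existing_ts;
    ALTER USER imp_user QUOTA UNLIMITED ON existing_ts;
    -- Then retry the import.
    ```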

  • How to avoid the delegatedplugin problem in EP 5.0

    Hi,
    we are working on EP 5.0 ...now the users, who are created in LDAP's are unable to log in to portal...and at the same time the role assignment View(which is under Portal admin) is not appearing....if we check the preview of that iview under portal_admin(Role)..Portaladmin...... which is showing some java runtime exception as fallows... we Checked all the files Related to Role Assignment are loaded and located exactly in to Folders.....any help ..appreciated....
    regards
    sudhir
    Log as fallows:........
    com.sapportals.portal.prt.runtime.PortalRuntimeException: delegatedplugin
    Caused by: com.sapportals.portal.prt.runtime.ProfileNotFoundException: Profile not found, please check the profile path : /roles/portal_admin/com.sapportals.portal.default_admin_lnk_wset_1/portal_admin/nf/com.sapportals.portal.RoleAssignment_lnk_exsrv_4
         at com.sapportals.portal.prt.pcdservice.local.PortalComponentProfile.getComponentProfile(PortalComponentProfile.java:95)
         at com.sapportals.portal.prt.pcdservice.PCDService.switchToLocalMode(PCDService.java:367)
         at com.sapportals.portal.prt.pcdservice.PCDService.getComponentProfile(PCDService.java:354)
         at com.sapportals.portal.prt.core.broker.PortalComponentContextItem.refresh(PortalComponentContextItem.java:130)
         at com.sapportals.portal.prt.core.broker.PortalComponentContextItem.getContext(PortalComponentContextItem.java:192)
         at com.sapportals.portal.prt.session.PortalUserSession.getComponentContext(PortalUserSession.java:68)
         at com.sapportals.portal.prt.component.PortalComponentRequest.getComponentContext(PortalComponentRequest.java:288)
         at com.sapportals.portal.prt.portalconnection.sapnative.DelegatedPlugIn.handleRequest(DelegatedPlugIn.java:375)
         at com.sapportals.portal.prt.portalconnection.sapnative.PortalPlugIn.handleRequest(PortalPlugIn.java:132)
         at com.sapportals.portal.prt.dispatcher.Dispatcher.service(Dispatcher.java:635)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:1264)
         at com.inqmy.services.servlets_jsp.server.InvokerServlet.service(InvokerServlet.java:132)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:1264)
         at com.inqmy.services.servlets_jsp.server.RunServlet.runSerlvet(RunServlet.java:150)
         at com.inqmy.services.servlets_jsp.server.ServletsAndJspImpl.startServlet(ServletsAndJspImpl.java:699)
         at com.inqmy.services.httpserver.server.RequestAnalizer.checkFilename(RequestAnalizer.java:501)
         at com.inqmy.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:199)
         at com.inqmy.services.httpserver.server.Response.handle(Response.java:159)
         at com.inqmy.services.httpserver.server.HttpServerFrame.request(HttpServerFrame.java:802)
         at com.inqmy.core.service.context.container.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:36)
         at com.inqmy.core.cluster.impl3.ParserRunner.run(ParserRunner.java:30)
         at com.inqmy.core.thread.impl0.ActionObject.run(ActionObject.java:45)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.inqmy.core.thread.impl0.SingleThread.run(SingleThread.java:126)

    Hi Rajesh,
    We have configured the LDAP tree successfully. The problem is that the Role Assignment iView under Portal Admin is not appearing. If we check the Test Connection with the users configured with LDAP in the Authentication Server tab under user configuration, it shows a success message, but at the same time we are unable to log in with the same user into the portal. We are working with EP 5.0. Any help appreciated.
    Regards
    Sudhir

  • The question is what and how to avoid this problem

    hi,
    TimesTen 7.05; host1: 134.133.1.71 (tt1), host2: 134.133.1.70 (tt2);
    Oracle DB: RAC
    On host2 (tt2), a while after dropping and recreating the cache group, TimesTen went offline.
    error_log:
    23:12:14.40 Err : REP: 10422: HB_OCS:transmitter.c(3048): TT16121: Failed to flush transaction queue. Restarting log read loo
    p
    23:12:14.40 Err : REP: 10422: HB_OCS:transmitter.c(3048): TT16121: Failed to flush transaction queue. Restarting log read loo
    p
    23:12:14.40 Warn: REP: 10422: HB_OCS:receiver.c(1931): TT16060: Failed to read data from the network. TimesTen replication ag
    ent is stopping
    23:12:14.51 Warn: REP: 10422: HB_OCS:receiver.c(1931): TT16060: Failed to read data from the network. TimesTen replication ag
    ent is stopping
    23:13:27.09 Err : ORA: 9592: ora-9592-0009-raStuff09836: Unexpected row count. Expecting 1. Got 0.
    23:49:25.50 Warn: REP: 9651: HB_OCS:receiver.c(1931): TT16060: Failed to read data from the network. select() timed out
    23:49:30.01 Warn: REP: 9651: HB_OCS:transmitter.c(5028): TT16060: Failed to read data from the network. select() timed out
    23:49:30.01 Err : REP: 9651: HB_OCS:transmitter.c(3048): TT16121: Failed to flush transaction queue. Restarting log read loo
    p
    23:49:53.54 Warn: REP: 9651: HB_OCS:receiver.c(1931): TT16060: Failed to read data from the network. TimesTen replication ag
    ent is stopping
    23:54:40.14 Warn: : 6151: 7759 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=10
    23:54:40.27 Warn: : 6151: 7546 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=7
    23:54:40.38 Warn: : 6151: 6137 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=7
    23:54:40.56 Warn: : 6151: 5029 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=10
    23:54:40.67 Warn: : 6151: 6367 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=10
    23:54:40.70 Warn: : 6151: 6249 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=7
    23:54:40.81 Warn: : 6151: 5150 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=7
    23:54:40.92 Warn: : 6151: 6016 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=10
    23:54:44.70 Warn: : 6151: 8168 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=7
    23:54:44.80 Warn: : 6151: 8053 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=4
    23:54:44.91 Warn: : 6151: 4869 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 757334180 count=10
    00:00:54.02 Err : : 6151: 6165/60000000000ae160: Validity Error (pgDir.c: 2992) sbDirNodeIsValid: Page 13340614272 is ful
    l but found in non-full list
    00:00:54.02 Err : : 6151: 6165/60000000000ae160: Validity Error (pgDir.c: 2726) sbPgDirIsValid: Node (id 2556776512, ix 9
    ) is not valid
    00:00:54.02 Err : : 6151: 6165/60000000000ae160: Validity Error (table.c: 29707) sbTblIsValid: Table has invalid physical
    page dir (id 4611686017073389424)
    00:00:54.02 Err : : 6151: 6165/60000000000ae160: Validity Error (db.c: 29151) sbDbIsValid(): sbTblIsValid() failed for ta
    ble 735848
    00:00:54.04 Err : : 6151: 6165/60000000000ae160: Data store marked invalid [db.c:4.1598:sbDbCkpt:19087] PID 6165 (timeste
    nsubd) CONN 2042 (Worker) Context 0x60000000000ae160
    00:00:54.20 Err : : 6151: 6165/60000000000ae160: Checkpoint failure (db.c, line 19089). 1 errors/warnings follow.
    00:00:54.20 Err : : 6151: 6165/60000000000ae160: TT0994: Data store connection terminated. Please reconnect. -- file "db.
    c", lineno 19089, procedure "sbDbCkpt"
    00:00:54.20 Err : : 6151: TT14006: TimesTen daemon disconnect failed: 6165: Disconnect failure reported by client
    00:00:54.21 Err : : 6165: subd: Error identified in [sub.c: line 4301]
    00:00:54.21 Err : : 6165: subd: (Warning 994): TT0994: Data store connection terminated. Please reconnect. -- file "db.c"
    , lineno 19089, procedure "sbDbCkpt"
    00:00:54.21 Err : : 6165: file "db.c", lineno 19089, procedure "sbDbCkpt"
    00:00:54.21 Err : : 6165: subd: (Warning 601): TT0601: Checkpoint failure -- file "db.c", lineno 15587, procedure "sbDbDi
    sconnect"
    00:00:54.21 Err : : 6165: file "db.c", lineno 15587, procedure "sbDbDisconnect"
    00:00:54.21 Err : : 6165: subd: (Error 994): TT0994: Data store connection terminated. Please reconnect. -- file "dbAPI.c
    ", lineno 3178, procedure "sb_dbDisconnect()"
    00:00:54.21 Err : : 6165: file "dbAPI.c", lineno 3178, procedure "sb_dbDisconnect()"
    00:00:54.21 Err : : 6165: subd: disconnect failed rc 994
    00:00:54.22 Err : : 6151: TT14000: TimesTen daemon internal error: 6165: Error processing 'unmanage' in subdaemon rc 400
    00:00:54.22 Err : : 6151: TT14000: TimesTen daemon internal error: Error in subdaemon sb_err_t 994
    00:00:54.22 Err : : 6151: TT14006: TimesTen daemon disconnect failed: error -1 in stopManaging
    00:00:54.29 Warn: : 6151: 6165 ------------------: subdaemon process exited
    00:03:55.54 Err : : 6151: 16745/60000000000ae160: Validity Error (pgDir.c: 2992) sbDirNodeIsValid: Page 13340614272 is full but found in non-full list
    00:03:55.54 Err : : 6151: 16745/60000000000ae160: Validity Error (pgDir.c: 2726) sbPgDirIsValid: Node (id 2556776512, ix 9) is not valid
    00:03:55.54 Err : : 6151: 16745/60000000000ae160: Validity Error (table.c: 29707) sbTblIsValid: Table has invalid physical page dir (id 4611686017073389424)
    00:03:55.54 Err : : 6151: 16745/60000000000ae160: Validity Error (db.c: 29151) sbDbIsValid(): sbTblIsValid() failed for table 735848
    00:03:55.55 Err : : 6151: 16745/60000000000ae160: Data store marked invalid [db.c:4.1598:sbDbCkpt:19087] PID 16745 (timestensubd) CONN 2042 (Worker) Context 0x60000000000ae160
    00:03:55.60 Err : : 6151: 16745/60000000000ae160: Checkpoint failure (db.c, line 19089). 1 errors/warnings follow.
    00:03:55.72 Err : : 6151: 16745/60000000000ae160: TT0994: Data store connection terminated. Please reconnect. -- file "db.c", lineno 19089, procedure "sbDbCkpt"
    00:03:55.72 Err : : 6151: TT14006: TimesTen daemon disconnect failed: 16745: Disconnect failure reported by client
    00:03:55.72 Err : : 16745: subd: Error identified in [sub.c: line 4301]
    00:03:55.72 Err : : 16745: subd: (Warning 994): TT0994: Data store connection terminated. Please reconnect. -- file "db.c", lineno 19089, procedure "sbDbCkpt"
    00:03:55.72 Err : : 16745: file "db.c", lineno 19089, procedure "sbDbCkpt"
    00:03:55.72 Err : : 16745: subd: (Warning 601): TT0601: Checkpoint failure -- file "db.c", lineno 15587, procedure "sbDbDisconnect"
    00:03:55.72 Err : : 16745: file "db.c", lineno 15587, procedure "sbDbDisconnect"
    00:03:55.72 Err : : 16745: subd: (Error 994): TT0994: Data store connection terminated. Please reconnect. -- file "dbAPI.c", lineno 3178, procedure "sb_dbDisconnect()"
    00:03:55.72 Err : : 16745: file "dbAPI.c", lineno 3178, procedure "sb_dbDisconnect()"
    00:03:55.72 Err : : 16745: subd: disconnect failed rc 994
    00:03:55.72 Err : : 6151: TT14000: TimesTen daemon internal error: 16745: Error processing 'unmanage' in subdaemon rc 400
    00:03:55.72 Err : : 6151: TT14000: TimesTen daemon internal error: Error in subdaemon sb_err_t 994
    00:03:55.72 Err : : 6151: TT14006: TimesTen daemon disconnect failed: error -1 in stopManaging
    00:03:56.01 Warn: : 6151: 16745 ------------------: subdaemon process exited
    01:17:18.05 Err : ORA: 27108: ora-27108-0009-raStuff09836: Unexpected row count. Expecting 1. Got 0.
    03:12:21.36 Warn: ORA: 27108: ora-27108-0013-refresh10886: Autorefresh was not able to acquire lock on one of the cache groups, may be because a DDL transaction is open on the cache group. Autorefresh will be retried again
    03:27:40.23 Err : ORA: 27108: ora-27108-0013-raStuff09836: Unexpected row count. Expecting 1. Got 0.
    03:41:47.35 Err : ORA: 27108: ora-27108-0013-Refresh03385: The sequence number for table HB_ODS.TB_BIL_PRESENT_RESOURCE had changed but the table was not refreshed. The cache group in datastore /tt_data/hb_ocs/hb_ocs is out of sync.

    03:43:50.73 Warn: : 6151: 24045 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=7
    03:43:50.82 Warn: : 6151: 26722 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=7
    03:43:50.93 Warn: : 6151: 25163 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=10
    03:43:51.04 Warn: : 6151: 23847 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=10
    03:43:51.15 Warn: : 6151: 25355 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=10
    03:43:55.18 Warn: : 6151: 23754 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=7
    03:43:55.40 Warn: : 6151: 24314 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=7
    03:43:55.51 Warn: : 6151: 22900 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=10
    03:43:55.54 Warn: : 6151: 25258 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=7
    03:43:55.69 Warn: : 6151: 25638 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=4
    03:43:55.76 Warn: : 6151: 22642 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=10
    08:46:42.07 Err : : 6151: 26060/6000000000b20020: Assertion failed: !(((((((pgp))->slotBusy))[((sbBmapIx_t) (((sbBmap_t) (((slot)))) >> ((sb_uintp) 6) ))]) & ((sbBmap_t) (((sb_uintp)(1)) << (((sbBmapIx_t) ((slot))) & ((sb_uintp)(((sb_uintp) 64)-1))))))) || SbContextP->inOpLogUndo [tupPage.c:4.146:sbTpgSlotActivate:959] PID 26060 (timestenrepd) CONN 104 (RECEIVER) 2010-05-28 08:46:42.069
    08:46:42.09 Err : : 6151: 26060/6000000000b20020: Data store marked invalid [tupPage.c:4.146:sbTpgSlotActivate:959] PID 26060 (timestenrepd) CONN 104 (RECEIVER) Context 0x6000000000b20020
    08:46:42.14 Warn: : 6151: 26060/6000000000b20020: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.16 Err : REP: 26060: HB_OCS:receiver.c(1095): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.16 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.22 Err : REP: 26060: HB_OCS:receiver.c(10091): TT16084: Table: HB_ODS.SESSION_INFORMATION. Failed to insert row for 'insert'
    08:46:42.22 Err : REP: 26060: HB_OCS:receiver.c(10091): TT994: TT0994: Data store connection terminated. Please reconnect. -- file "tupPage.c", lineno 959, procedure "sbTpgSlotActivate"
    08:46:42.22 Err : REP: 26060: HB_OCS:receiver.c(10091): TT4053: TT4053: Internal error: Assertion failed: !(((((((pgp))->slotBusy))[((sbBmapIx_t) (((sbBmap_t) (((slot)))) >> ((sb_uintp) 6) ))]) & ((sbBmap_t) (((sb_uintp)(1)) << (((sbBmapIx_t) ((slot))) & ((sb_uintp)(((sb_uintp) 64)-1))))))) || SbContextP->inOpLogUndo [tupPage.c:4.146:sbTpgSlotActivate:959] PID 26060 (timestenrepd) CONN 104 (RECEIVER) 2010-05-28 08:46:42.069 -- file "util.c", lineno 1857, procedure "sbUtAssertionReport"
    08:46:42.22 Warn: : 6151: 26060/60000000004c3ff0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.22 Warn: : 6151: 26060 ----------: Disconnecting from an old instance
    08:46:42.22 Err : : 6151: 26060/0000000000000000: 26060: Failed to send SNMP trap type 9
    08:46:42.22 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.22 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'RECEIVER' thread
    08:46:42.22 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.22 Warn: : 6151: 26060/6000000000443ff0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.22 Warn: : 6151: 26060 ----------: Disconnecting from an old instance
    08:46:42.22 Warn: : 6151: 26060/60000000001960e0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.22 Warn: : 6151: 26060 ----------: Disconnecting from an old instance
    08:46:42.22 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'LOGFORCE' thread
    08:46:42.22 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.22 Warn: : 6151: 418/6000000000245010: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.22 Warn: : 6151: 418 ----------: Disconnecting from an old instance
    08:46:42.23 Err : : 418: TT14000: TimesTen daemon internal error: subd: flusher thread failed in sb_dbLogFlusherSvc, tt error 994 (TT0994: Data store connection terminated. Please reconnect. -- file "dbAPI.c", lineno 9535, procedure "sb_dbLogFlusherSvc")
    08:46:42.23 Err : : 418: TT14000: TimesTen daemon internal error: subd: flusher thread failed to disconnect, tt error 0 (TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()").
    08:46:42.24 Warn: : 6151: 4847/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.24 Warn: : 6151: 4847 ----------: Disconnecting from an old instance
    08:46:42.24 Err : REP: 26060: HB_OCS:transmitter.c(8190): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'TRANSMITTER' thread
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.24 Err : REP: 26060: HB_OCS:transmitter.c(8190): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.24 Warn: REP: 26060: HB_OCS:receiver.c(1931): TT16060: Failed to read data from the network. TimesTen replication agent is stopping
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'TRANSMITTER' thread
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.24 Warn: : 418: Stopping subdaemon monitor for /tt_data/hb_ocs/hb_ocs because db is invalid.
    08:46:42.24 Warn: : 6151: 418/6000000000440020: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.24 Warn: : 6151: 418 ----------: Disconnecting from an old instance
    08:46:42.24 Err : : 418: TT14000: TimesTen daemon internal error: subd: monitor thread failed in sb_dbCgExistWithOraObj, tt error 994
    08:46:42.24 Err : : 418: TT14000: TimesTen daemon internal error: subd: monitor thread failed to disconnect, tt error 846.
    08:46:42.24 Warn: : 6151: 26060/6000000000723ff0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.24 Warn: : 6151: 26060 ----------: Disconnecting from an old instance
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'RECEIVER' thread
    08:46:42.24 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.35 Err : REP: 26060: HB_OCS:rephold.c(141): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.35 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.35 Warn: : 6151: 26060/60000000002a3ff0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.35 Warn: : 6151: 26060 ----------: Disconnecting from an old instance
    08:46:42.35 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'REPHOLD' thread
    08:46:42.35 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.43 Warn: : 6151: 418 ------------------: subdaemon process exited
    08:46:42.43 Warn: : 6151: 418 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=4
    08:46:42.43 Warn: : 6151: daRecovery: subdaemon 418, managing data store, failed: invalidate (failcode=202)
    08:46:42.48 Warn: : 6151: 3508/60000000004f68e0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.48 Warn: : 6151: 3508 ----------: Disconnecting from an old instance
    08:46:42.50 Warn: : 6151: 5775/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.50 Warn: : 6151: 5775 ----------: Disconnecting from an old instance
    08:46:42.51 Warn: : 6151: 5395/60000000004f68e0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.51 Warn: : 6151: 5395 ----------: Disconnecting from an old instance
    08:46:42.66 Err : REP: 26060: HB_OCS:receiver.c(1637): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.67 Err : REP: 26060: HB_OCS:repagent.c(1243): TT16012: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.67 Warn: : 6151: 26060/6000000000343ff0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.67 Warn: : 6151: 26060 ----------: Disconnecting from an old instance
    08:46:42.67 Err : REP: 26060: HB_OCS:repagent.c(3079): TT16005: Failed to disconnect from datastore '/tt_data/hb_ocs/hb_ocs' for 'REPLISTENER' thread
    08:46:42.67 Err : REP: 26060: HB_OCS:repagent.c(3079): TT846: TT0846: Data store connection invalid or not current -- file "dbAPI.c", lineno 3135, procedure "sb_dbDisconnect()"
    08:46:42.71 Err : : 6151: repagent says it has failed to start: Data store is invalid. Replication Agent exiting but may be restarted by Timesten daemon (depending on restart policy)
    08:46:42.91 Warn: : 6151: 5962/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:42.91 Warn: : 6151: 5962 ----------: Disconnecting from an old instance
    08:46:43.25 Warn: : 6151: 27108/6000000000857040: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:43.25 Warn: : 6151: 27108 ----------: Disconnecting from an old instance
    08:46:43.27 Warn: : 6151: 27108/60000000006b2620: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:43.27 Warn: : 6151: 27108 ----------: Disconnecting from an old instance
    08:46:45.39 Warn: : 6151: 1792/60000000004f68e0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:45.39 Warn: : 6151: 1792 ----------: Disconnecting from an old instance
    08:46:46.11 Warn: : 6151: 1910/60000000004f68e0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:46.11 Warn: : 6151: 1910 ----------: Disconnecting from an old instance
    08:46:46.28 Warn: : 6151: 2917/60000000004f68e0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:46.28 Warn: : 6151: 2917 ----------: Disconnecting from an old instance
    08:46:46.44 Warn: : 6151: 2031/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:46.44 Warn: : 6151: 2031 ----------: Disconnecting from an old instance
    08:46:47.05 Warn: : 6151: 3319/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:47.05 Warn: : 6151: 3319 ----------: Disconnecting from an old instance
    08:46:47.06 Warn: : 6151: 3202/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:47.06 Warn: : 6151: 3202 ----------: Disconnecting from an old instance
    08:46:50.24 Warn: : 6151: 27108/6000000000740020: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:50.24 Warn: : 6151: 27108 ----------: Disconnecting from an old instance
    08:46:50.24 Warn: : 6151: 27108/60000000001822a0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:50.24 Warn: : 6151: 27108 ----------: Disconnecting from an old instance
    08:46:50.24 Warn: ORA: 27108: ora-27108-0001-xxagent04639: Warning: Statement SQLDisconnect(agentHdbc)
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01017: Error: [TimesTen]TT0994: Data store connection terminated. Please reconnect. -- file "dbAPI.c", lineno 3132, procedure "sb_dbDisconnect()", ODBC SQL state = S1000, Additional Warning = 994
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01042: Detected invalid data store.
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01017: Error: [TimesTen]TT0994: Data store connection terminated. Please reconnect. -- file "db.c", lineno 14954, procedure "sbDbAppExit()", ODBC SQL state = S1000, Additional Warning = 994
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01042: Detected invalid data store.
    08:46:50.24 Warn: : 6151: 27108/60000000003683a0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:50.24 Warn: : 6151: 27108 ----------: Disconnecting from an old instance
    08:46:50.24 Warn: ORA: 27108: ora-27108-0001-xxagent04645: Warning: Statement SQLDisconnect(timerHdbc)
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01017: Error: [TimesTen]TT0994: Data store connection terminated. Please reconnect. -- file "dbAPI.c", lineno 3132, procedure "sb_dbDisconnect()", ODBC SQL state = S1000, Additional Warning = 994
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01042: Detected invalid data store.
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01017: Error: [TimesTen]TT0994: Data store connection terminated. Please reconnect. -- file "db.c", lineno 14954, procedure "sbDbAppExit()", ODBC SQL state = S1000, Additional Warning = 994
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01042: Detected invalid data store.
    08:46:50.24 Warn: : 6151: 27108/60000000003883a0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:50.24 Warn: : 6151: 27108 ----------: Disconnecting from an old instance
    08:46:50.24 Warn: ORA: 27108: ora-27108-0001-xxagent04652: Warning: Statement SQLDisconnect(agingHdbc)
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01017: Error: [TimesTen]TT0994: Data store connection terminated. Please reconnect. -- file "dbAPI.c", lineno 3132, procedure "sb_dbDisconnect()", ODBC SQL state = S1000, Additional Warning = 994
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01042: Detected invalid data store.
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01017: Error: [TimesTen]TT0994: Data store connection terminated. Please reconnect. -- file "db.c", lineno 14954, procedure "sbDbAppExit()", ODBC SQL state = S1000, Additional Warning = 994
    08:46:50.24 Err : ORA: 27108: ora-27108-0001-bcStuff01042: Detected invalid data store.
    08:46:52.30 Warn: : 6151: 4847/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.30 Warn: : 6151: 4847 ----------: Disconnecting from an old instance
    08:46:52.30 Warn: : 6151: 4847/600000000028d530: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.30 Warn: : 6151: 4847 ----------: Disconnecting from an old instance
    08:46:52.30 Warn: : 6151: 4847/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.30 Warn: : 6151: 4847 ----------: Disconnecting from an old instance
    08:46:52.50 Warn: : 6151: 3508/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.50 Warn: : 6151: 3508 ----------: Disconnecting from an old instance
    08:46:52.50 Warn: : 6151: 3508/6000000000395da0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.50 Warn: : 6151: 3508 ----------: Disconnecting from an old instance
    08:46:52.50 Warn: : 6151: 3508/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.50 Warn: : 6151: 3508 ----------: Disconnecting from an old instance
    08:46:52.51 Warn: : 6151: 3508/6000000000446340: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.51 Warn: : 6151: 3508 ----------: Disconnecting from an old instance
    08:46:52.53 Warn: : 6151: 5395/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.53 Warn: : 6151: 5395 ----------: Disconnecting from an old instance
    08:46:52.53 Warn: : 6151: 5395/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.53 Warn: : 6151: 5395 ----------: Disconnecting from an old instance
    08:46:52.54 Warn: : 6151: 5395/6000000000446340: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.54 Warn: : 6151: 5395 ----------: Disconnecting from an old instance
    08:46:52.54 Warn: : 6151: 5395/6000000000395da0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.54 Warn: : 6151: 5395 ----------: Disconnecting from an old instance
    08:46:52.54 Warn: : 6151: 5395/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.54 Warn: : 6151: 5395 ----------: Disconnecting from an old instance
    08:46:52.93 Warn: : 6151: 5962/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.93 Warn: : 6151: 5962 ----------: Disconnecting from an old instance
    08:46:52.93 Warn: : 6151: 5962/600000000028d530: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.93 Warn: : 6151: 5962 ----------: Disconnecting from an old instance
    08:46:52.93 Warn: : 6151: 5962/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:52.93 Warn: : 6151: 5962 ----------: Disconnecting from an old instance
    08:46:55.41 Warn: : 6151: 1792/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:55.41 Warn: : 6151: 1792 ----------: Disconnecting from an old instance
    08:46:55.41 Warn: : 6151: 1792/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:55.41 Warn: : 6151: 1792 ----------: Disconnecting from an old instance
    08:46:55.42 Warn: : 6151: 1792/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:55.42 Warn: : 6151: 1792 ----------: Disconnecting from an old instance
    08:46:55.42 Warn: : 6151: 1792/6000000000395da0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:55.42 Warn: : 6151: 1792 ----------: Disconnecting from an old instance
    08:46:55.42 Warn: : 6151: 1792/6000000000446340: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:55.42 Warn: : 6151: 1792 ----------: Disconnecting from an old instance
    08:46:56.13 Warn: : 6151: 1910/6000000000395da0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.13 Warn: : 6151: 1910 ----------: Disconnecting from an old instance
    08:46:56.14 Warn: : 6151: 1910/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.14 Warn: : 6151: 1910 ----------: Disconnecting from an old instance
    08:46:56.14 Warn: : 6151: 1910/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.14 Warn: : 6151: 1910 ----------: Disconnecting from an old instance
    08:46:56.14 Warn: : 6151: 1910/6000000000446340: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.14 Warn: : 6151: 1910 ----------: Disconnecting from an old instance
    08:46:56.30 Warn: : 6151: 2917/6000000000395da0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.30 Warn: : 6151: 2917 ----------: Disconnecting from an old instance
    08:46:56.30 Warn: : 6151: 2917/6000000000446340: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.30 Warn: : 6151: 2917 ----------: Disconnecting from an old instance
    08:46:56.31 Warn: : 6151: 2917/60000000003ee070: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.31 Warn: : 6151: 2917 ----------: Disconnecting from an old instance
    08:46:56.31 Warn: : 6151: 2917/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.31 Warn: : 6151: 2917 ----------: Disconnecting from an old instance
    08:46:56.31 Warn: : 6151: 2917/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.31 Warn: : 6151: 2917 ----------: Disconnecting from an old instance
    08:46:56.46 Warn: : 6151: 2031/600000000028d530: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.46 Warn: : 6151: 2031 ----------: Disconnecting from an old instance
    08:46:56.47 Warn: : 6151: 2031/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.47 Warn: : 6151: 2031 ----------: Disconnecting from an old instance
    08:46:56.47 Warn: : 6151: 2031/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:56.47 Warn: : 6151: 2031 ----------: Disconnecting from an old instance
    08:46:57.07 Warn: : 6151: 3319/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:57.07 Warn: : 6151: 3319 ----------: Disconnecting from an old instance
    08:46:57.08 Warn: : 6151: 3319/600000000028d530: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:57.08 Warn: : 6151: 3319 ----------: Disconnecting from an old instance
    08:46:57.08 Warn: : 6151: 3319/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:57.08 Warn: : 6151: 3319 ----------: Disconnecting from an old instance
    08:46:57.08 Warn: : 6151: 3202/600000000028d530: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:57.08 Warn: : 6151: 3202 ----------: Disconnecting from an old instance
    08:46:57.08 Warn: : 6151: 3202/600000000033dad0: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:57.08 Warn: : 6151: 3202 ----------: Disconnecting from an old instance
    08:46:57.08 Warn: : 6151: 3202/60000000002e5800: Forced Disconnect /tt_data/hb_ocs/hb_ocs
    08:46:57.08 Warn: : 6151: 3202 ----------: Disconnecting from an old instance
    08:47:12.69 Warn: : 6151: 4847 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=3
    08:47:12.69 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:47:12.69 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:47:20.15 Warn: : 6151: 5395 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=4
    08:47:20.15 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:47:20.15 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:47:33.28 Warn: : 6151: 5775 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=3
    08:47:33.28 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:47:33.28 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:47:41.16 Warn: : 6151: 1792 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=4
    08:47:41.16 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:47:41.16 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:47:54.76 Warn: : 6151: 2917 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=4
    08:47:54.83 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:47:54.83 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:48:01.12 Warn: : 6151: 3202 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=3
    08:48:01.12 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:48:01.12 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:48:08.72 Warn: : 6151: 1910 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=5
    08:48:08.72 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:48:08.72 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:48:15.47 Warn: : 6151: 3508 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=5
    08:48:15.47 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:48:15.47 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:48:26.94 Warn: : 6151: 5962 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=3
    08:48:26.95 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:48:26.95 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:48:35.37 Warn: : 6151: 2031 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=3
    08:48:35.37 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:48:35.37 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:48:42.07 Warn: : 6151: 3319 exited while connected to data store '/tt_data/hb_ocs/hb_ocs' shm 1025769636 count=3
    08:48:42.07 Warn: : 6151: 6151: <wait not in flux or invalidate> was waiting in flux, socket dead (was 2=connect pid 27358 nwaiters 1 count 1 ds /tt_data/hb_ocs/hb_ocs)
    08:48:42.07 Err : : 6151: TT14000: TimesTen daemon internal error: recovery: could not wait for not-in-flux
    08:52:00.55 Warn: : 6151: 8203/60000000000ae160: Recovery started
    09:23:11.65 Warn: : 8203: subd: Warning identified in [sub.c: line 3188]
    09:23:11.65 Warn: : 8203: subd: (Warning 20100): TT20100: This connection required recovery due to an improper shutdown -- file "db.c", lineno 11566, procedure "sbDbConnect"
    09:23:11.65 Warn: : 8203: file "db.c", lineno 11566, procedure "sbDbConnect"
    09:23:11.65 Warn: : 8203: subd: connect trouble, rc 2, reason 20100
    09:23:11.66 Warn: : 8203: Warn 20100: TT20100: This connection required recovery due to an improper shutdown -- file "db.c", lineno 11566, procedure "sbDbConnect"
    09:29:16.19 Err : ORA: 27417: ora-27417-0009-raStuff09836: Unexpected row count. Expecting 1. Got 0.

  • How to avoid image problems when converting to PDF?

    I placed a yellow watch (PSD file) into an AI file, then saved it as a PDF. When I opened the PDF, the image displayed incorrectly.
    Can anyone help solve this problem?

    A few possibilities:
    1.)  The image is missing or not "Linked". Check whether the image has been moved or is being pulled from a server. The fix is to save the image to a local folder and link to it from the document.
    2.)  The color space of the document does not match the color space of the image. Make sure the image and the document use appropriate, matching color spaces (i.e., document = CMYK; image = CMYK).

  • Though I installed Firefox 10.0, updater.exe now runs as soon as Firefox is launched. The problem disappears when I forcefully close updater.exe, after which Firefox calms down too. How to avoid this problem?

    It consumes 30% of the CPU and drags Firefox with it; together they consume more than half of the CPU for over half an hour.
    This is a new phenomenon that slows down all other activity on my computer.

    After updating to version 10 I closed Firefox.
    I noticed that there was a lot of CPU activity.
    I checked Task Manager and noticed that in Processes, firefox.exe and plugin-container were running. I opened Firefox and closed it again, but the processes were still running in TM.
    I highlighted firefox.exe in TM and clicked End Process. Both processes disappeared from TM.
    I have opened and closed FF a couple of times since with no problem.
    However, the OP seems to indicate that this is a problem that re-occurs.
    Why is FF not shutting down properly? Is it possibly something to do with the automatic check for add-on/extension updates?

  • How to avoid DISTINCT in OBIEE SQL

    Hi,
    Regarding the question I posted previously:
    Problem with LONG datatype in Answers
    I ran the query in TOAD and identified what causes the error: it is the DISTINCT.
    I have already taken care to avoid that column in ORDER BY by ordering on another column.
    Now my question is: how do I avoid the DISTINCT clause in the SQL generated by OBIEE?
    If the first column in the criteria is a measure, OBIEE omits the DISTINCT, but using a measure introduces a GROUP BY, and GROUP BY should not be used here.
    If DISTINCT can be avoided, my problem will be resolved.
    Of course, I would then get duplicate rows, but I still want to know how to avoid DISTINCT.
    Expecting an answer from you.
    Thanks & Regards
    Kishore Guggilla

    Hi,
    Thanks for the reply.
    Now another question: what if I want to do this for only one report?
    If I make that change in the catalog properties, I think it will affect all reports.
    Instead, if I want to do it only for the one report that is affected, how do I proceed?
    I tried writing "case when 1=0 ..." for the first column in the criteria, but no luck.
    Thanks & Regards
    Kishore Guggilla
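    The trade-off discussed in this thread (dropping DISTINCT brings duplicate rows back) can be shown with a minimal, self-contained sketch. This uses SQLite as a stand-in for the physical source OBIEE queries; the table and column names are hypothetical, not taken from the repository in question.

```python
import sqlite3

# In-memory database standing in for the physical data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EAST", "A"), ("EAST", "A"), ("WEST", "B")],  # note the duplicate row
)

# The shape OBIEE typically generates for an attribute-only request:
with_distinct = conn.execute(
    "SELECT DISTINCT region, product FROM sales"
).fetchall()

# The same query with DISTINCT removed, as the poster wants:
without_distinct = conn.execute(
    "SELECT region, product FROM sales"
).fetchall()

print(len(with_distinct))     # 2 -> duplicates collapsed
print(len(without_distinct))  # 3 -> the duplicate row comes back
```

    So suppressing DISTINCT is only safe when the report is expected to show repeated rows, which is exactly what the poster accepts above.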

  • How to run performance trace on a program

    Hi All,
          I need to run a performance trace on a program that I will run in the background. I think I have to use ST05 to run an SQL trace. Could someone please confirm the steps?
    1) Go to ST05
    2) Activate trace with filter
    3) Run the program in background
    I am not sure about the steps after this. How do I view the report while trace is running? Should I wait till the program ends to view the report? It is taking a long time to run. Should I do any other performance analysis other than ST05?
    Thanks.
    Mithun

    Hi Mithun.
    I would like to suggest a few references,
    [SDN Library - Standard Reference - How to analyze performance problems - Trace|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3fbea790-0201-0010-6481-8370ebc3c17d]
    [SDN Library - Standard Reference for Performance Analysis in a Nutshell|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/86a0b490-0201-0010-9cba-fd5c804b99a1]
    [SDN Library - Standard Reference - Best Practices for Performance Tuning|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/5d0db4c9-0e01-0010-b68f-9b1408d5f234]
    Hope that's useful.
    Good Luck & Regards.
    Harsh Dave

  • How to avoid XVBPA line item values while creating sales order

    Hi Friends
    I am creating a sales order using function module IDOC_INPUT_ORDERS, passing the partner details through segment E1EDKA1. If I pass the same partner details (partner number, NAME1, NAME4, city, postal code) as in the customer master,
    I am able to create the order, and the entries appear only once in the VBPA table (at header level). But if I change any value for that partner in the input (for example NAME4: there is no value in the customer master, but I add one through the program), the order is still created, and the entered partner information shows at header level. When I check the same partner at line-item level, it does not pick the values from the header; it picks the values from the customer master for that partner.
    So this line item also ends up in VBPA, because it differs from the header data.
    Can anyone guide me on how to avoid this problem when creating the order through the program?
    Thanks
    Gowrishankar

    I am updating the ship-to-party value in segment E1EDKA1 and calling function module IDOC_INPUT_ORDERS to create the sales order.
    In this case there is no NAME4 value in KNA1 for the particular partner, and I pass a value through segment E1EDKA1.
    So the order is created with this new address, and a new ADRNR is generated at VBPA header level. That is fine. But at the same time
    the same value is not copied from the header to the line items.
    At line-item level it again picks the address details from KNA1 for that partner number, so the NAME4 field of the line items is blank. This causes a splitting issue when creating the delivery.
    Thanks
    Gowrishankar
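    The header-versus-item mismatch described above can be modeled in a few lines. This is a toy illustration of the determination split, not actual SAP code, and all names and structures in it are hypothetical: when the E1EDKA1 data differs from the customer master, the header keeps a manual address, while items are redetermined from the master record.

```python
# Toy model: header-level partner determination keeps the IDoc's manual
# address (new ADRNR) when it differs from the master; item-level
# determination falls back to the customer master (KNA1).

def resolve_header_address(idoc_partner: dict, master: dict) -> dict:
    """Header level: differing IDoc data wins and becomes a manual address."""
    if idoc_partner != master:
        return {**idoc_partner, "source": "manual (new ADRNR)"}
    return {**master, "source": "customer master"}

def resolve_item_address(master: dict) -> dict:
    """Item level: redetermined from the customer master."""
    return {**master, "source": "customer master"}

kna1 = {"partner": "1000", "name4": ""}            # NAME4 empty in master
e1edka1 = {"partner": "1000", "name4": "DOCK 7"}   # NAME4 supplied in the IDoc

header = resolve_header_address(e1edka1, kna1)
item = resolve_item_address(kna1)

# The mismatch the poster reports: header carries NAME4, the item does not,
# which later causes the delivery split.
print(header["name4"])                   # DOCK 7
print(repr(item["name4"]))               # ''
print(header["name4"] != item["name4"])  # True
```

    The model suggests why aligning the master data (or not sending differing values in E1EDKA1) makes the header and item addresses match again.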
