Too much time taken to go to sleep

hey there,
I've got a problem here: my MacBook takes nearly one minute to go to sleep.
It really wastes my time waiting for it to go to sleep.
Your reply would be really appreciated.
thank you,
Arif
[new to mac world]

Similar Messages

  • Too much time taken for adding/updating records in Netscape LDAP

    Hi,
    I am a newbie in LDAP and have a text file with information for 80,000 users in it, which I need to add into LDAP.
    I am using Netscape LDAP 5.1 as my directory server.
    The time being taken to add records into LDAP is about 3 minutes per add.
    This is slowing down my batch job terribly and causing performance issues.
    Does anybody know whether this is normal or whether things could be sped up? If so, a few pointers and tips would be highly appreciated.
    Thanks in advance
    piyush_ast.

    I don't really have any suggestions for the speed, other than to make the code or network connectivity more efficient if possible. Maybe you should read the Netscape documentation. However, perhaps LDAP isn't the best solution for you. LDAP is optimized for a read-often, write-rarely type of setup. If you're constantly updating such a large volume of records, maybe a database would be a better choice.
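
    On the code side, one common cause of multi-minute adds is opening and binding a new connection for every entry. The sketch below only illustrates the idea of binding once and reusing the connection for the whole batch; it uses the standard JNDI API rather than the Netscape SDK from the original post, and the host, bind DN, and entry attributes are assumptions for illustration only:

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.Attribute;
    import javax.naming.directory.Attributes;
    import javax.naming.directory.BasicAttribute;
    import javax.naming.directory.BasicAttributes;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;

    public class BulkLdapAdd {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://localhost:389");       // assumed host/port
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "cn=Directory Manager"); // assumed bind DN
            env.put(Context.SECURITY_CREDENTIALS, "password");           // assumed password

            // Bind once and reuse the same context for every add; a per-entry
            // connect/bind is what typically turns a bulk load into minutes per record.
            DirContext ctx = new InitialDirContext(env);
            try {
                for (int i = 0; i < 80000; i++) {                        // one entry per user in the text file
                    Attribute oc = new BasicAttribute("objectClass");
                    oc.add("top");
                    oc.add("person");
                    oc.add("organizationalPerson");
                    oc.add("inetOrgPerson");
                    Attributes attrs = new BasicAttributes(true);
                    attrs.put(oc);
                    attrs.put("uid", "user" + i);
                    attrs.put("cn", "User " + i);
                    attrs.put("sn", "User" + i);
                    ctx.createSubcontext("uid=user" + i + ",ou=People,dc=example,dc=com", attrs);
                }
            } finally {
                ctx.close();
            }
        }
    }

    If the directory server itself is the bottleneck rather than the client, a server-side bulk import of an LDIF file with the directory's own offline import tool is usually far faster than 80,000 individual adds over the wire.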

  • SL and Lion too much time to sleep!

    Hi,
    my Mac is taking too much time to sleep, even after the change from SL to Lion...
    it always takes about 45 sec with Google Chrome running and about 25 sec with it off...
    These are the pmset -g log entries with the biggest times; I'm running the latest Google Chrome dev build:
    * Domain: applicationresponse.slowresponse
    - Message: Kernel powerd com.apple.powermanagement.applicationresponse.slowresponse 15928 ms
    - Time: 7/23/11 4:20:41 PM GMT+
    - Signature: powerd
    - UUID: D1A5A219-ABDD-41D3-9F95-7098470C88E6
    - Result: Noop
    - Response time (ms): 15928
    * Domain: applicationresponse.slowresponse
    - Message: PMConnection mDNSResponder com.apple.powermanagement.applicationresponse.slowresponse 11005 ms
    - Time: 7/23/11 5:26:16 PM GMT+
    - Signature: mDNSResponder
    - UUID: 895080D9-D546-42CD-B829-8AFC55DD0442
    - Result: Noop
    - Response time (ms): 11005
    * Domain: applicationresponse.timedout
    - Message: Kernel Google Chrome He com.apple.powermanagement.applicationresponse.timedout 30000 ms
    - Time: 7/23/11 6:53:43 PM GMT+
    - Signature: Google Chrome He
    - UUID: 895080D9-D546-42CD-B829-8AFC55DD0442
    - Result: Noop
    - Response time (ms): 30000
    Google Chrome only has 4 tabs open, and the plugins running are AdBlock and FlashBlock.
    Any help?

    Any help?
    Still can't find anything; the problem occurred under SL too, so I guess it is not a Lion issue.
    Another "bug" that happens on both OSes is that when I close the lid the Apple symbol's light goes off, but sometimes it lights up again until the Mac goes to sleep.

  • During the Unicode conversion, Cluster table export taken too much time ap

    Dear All
    during the Unicode conversion, the cluster table export is taking too much time, approximately 24 hours for 6 tables. Could you please advise how we can reduce the time?
    thanks
    Jainnedra

    Hello,
    Use the latest R3load from the SAP Service Marketplace.
    Also refer to this note:
    Note 1019362 - Very long run times during SPUMG scans
    Regards,
    Nitin Salunkhe

  • Taking too much time using BufferedWriter to write to a file

    Hi,
    I'm using the method extractItems(), given below, to write data to a file. This method is taking too much time to execute when the number of records in the enumeration is 10,000 or more; to be precise, it takes around 70 minutes. The writing pauses intermittently for 20 seconds after writing a few lines, and sometimes for much longer. Has somebody faced this problem before, and if so, what could the problem be? This is very high-priority work, and it would be really helpful if someone could give me some info on this.
    Thanks in advance.
    public String extractItems() throws InternalServerException {
        try {
            String extractFileName = getExtractFileName();
            FileWriter fileWriter = new FileWriter(extractFileName);
            BufferedWriter bufferWrt = new BufferedWriter(fileWriter);
            CXBusinessClassIfc editClass = new ExploreClassImpl(className, mdlMgr);
            System.out.println("Before -1");
            CXPropertyInfoIfc[] propInfo = editClass.getClassPropertyInfo(configName);
            System.out.println("After -1");
            PrintWriter out = new PrintWriter(bufferWrt);
            System.out.println("Before -2");
            TemplateHeaderInfo.printHeaderInfo(propInfo, out, mdlMgr);
            System.out.println("After -2");
            XDItemSet itemSet = getItemsForObjectIds(catalogEditDO.getSelectedItems());
            Enumeration allitems = itemSet.allItems();
            System.out.println("the batch size : " + itemSet.getBatchSize());
            XDForm frm = itemSet.getXDForm();
            XDFormProperty[] props = frm.getXDFormProperties();
            System.out.println("Before -3");
            bufferWrt.newLine();
            long startTime, startTime1, startTime2, startTime3;
            startTime = System.currentTimeMillis();
            System.out.println("time here is--before-while : " + startTime);
            while (allitems.hasMoreElements()) {
                String aRow = "";
                XDItem item = (XDItem) allitems.nextElement();
                for (int i = 0; i < props.length; i++) {
                    String value = item.getStringValue(props[i]);
                    if (value == null || value.equalsIgnoreCase("null"))
                        value = "";
                    if (i == 0)
                        aRow = value;
                    else
                        aRow += ("\t" + value);
                }
                startTime1 = System.currentTimeMillis();
                System.out.println("time here is--before-writing to buffer --new: " + startTime1);
                bufferWrt.write(aRow.toCharArray());
                bufferWrt.flush(); // added by rosmon to check extra time taken for extraction
                bufferWrt.newLine();
                startTime2 = System.currentTimeMillis();
                System.out.println("time here is--after-writing to buffer : " + startTime2);
            }
            startTime3 = System.currentTimeMillis();
            System.out.println("time here is--after-while : " + startTime3);
            out.close();      // added by rosmon to check extra time taken for extraction
            bufferWrt.close();
            fileWriter.close();
            System.out.println("After -3");
            return extractFileName;
        } catch (Exception e) {
            e.printStackTrace();
            throw new InternalServerException(e.getMessage());
        }
    }

    Hi fiontan,
    Thanks a lot for the response!
    Yeah, I know it's a lot of code, but I thought it'd be more informative if the whole function was quoted.
    I'm in fact using the PrintWriter to wrap the BufferedWriter, but I am not using the print() method.
    Does it save any time to use the print() method?
    The place where the delay is occurring is the while loop shown below:
            while (allitems.hasMoreElements()) {
                String aRow = "";
                XDItem item = (XDItem) allitems.nextElement();
                for (int i = 0; i < props.length; i++) {
                    String value = item.getStringValue(props[i]);
                    if (value == null || value.equalsIgnoreCase("null"))
                        value = "";
                    if (i == 0)
                        aRow = value;
                    else
                        aRow += ("\t" + value);
                }
                startTime1 = System.currentTimeMillis();
                System.out.println("time here is--before-writing to buffer --out.flush() done: " + startTime1);
                bufferWrt.write(aRow.toCharArray());
                out.flush();       // added by rosmon to check extra time taken for extraction
                bufferWrt.flush(); // added by rosmon to check extra time taken for extraction
                bufferWrt.newLine();
                startTime2 = System.currentTimeMillis();
                System.out.println("time here is--after-writing to buffer : " + startTime2);
            }
    What exactly happens is that after a few loops it just seems to sleep for around 20 seconds, then starts off again, and it goes on like this until the records are done.
    Please do let me know if you have any idea why this is happening! This bug is giving me a scare.
    thanks in advance
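
    For reference, here is a minimal sketch of how this kind of export loop could be restructured so it isn't dominated by I/O: build each row in a reused StringBuilder instead of repeated String concatenation, and let the BufferedWriter flush only when its buffer fills (or once at the end) rather than after every line. The proprietary XDItem/XDFormProperty classes from the post aren't available here, so the sketch stands in a plain list of string arrays for the records; the idea, not the types, is the point.

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.Writer;
    import java.util.Arrays;
    import java.util.List;

    public class ExtractSketch {

        // Stand-in for the proprietary item API: each record is just an array of column values.
        static void writeRows(List<String[]> records, Writer target) throws IOException {
            BufferedWriter bufferWrt = new BufferedWriter(target, 1 << 16); // 64 KB buffer
            StringBuilder row = new StringBuilder();
            for (String[] record : records) {
                row.setLength(0);                        // reuse one builder instead of String +=
                for (int i = 0; i < record.length; i++) {
                    String value = record[i];
                    if (value == null || value.equalsIgnoreCase("null"))
                        value = "";
                    if (i > 0)
                        row.append('\t');
                    row.append(value);
                }
                bufferWrt.write(row.toString());
                bufferWrt.newLine();                     // no flush() here: let the buffer do its job
            }
            bufferWrt.flush();                           // flush once, after all rows are written
        }

        public static void main(String[] args) throws IOException {
            List<String[]> demo = Arrays.asList(
                    new String[] { "1", "alpha", null },
                    new String[] { "2", "beta", "null" });
            try (FileWriter fw = new FileWriter("extract.txt")) {
                writeRows(demo, fw);
            }
        }
    }

    Flushing inside the loop forces a write to the OS on every record, and repeated String += re-copies the whole row each iteration, so both grow the cost far beyond what a buffered writer should need. If the 20-second stalls remain after removing them, the time may instead be spent in allitems.nextElement() fetching the next batch from the backend rather than in the file writing itself.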

  • Delta Sync taking too much time on refreshing of tables

    Hi,
    I am working on Smart Service Manager 3.0 and have come across a scenario where the delta sync is taking too much time.
    It is required that if we update the stock quantity, the stock should be updated instantaneously.
    To do this we have to refresh 4 stock tables at every sync so that the updated quantity is reflected on the device.
    This is taking a lot of time (3 to 4 min), which is highly unacceptable from a user perspective.
    Could anyone please suggest something so that only the tables on which the action is carried out get refreshed?
    For example, the CTStock table should get refreshed and updated only if I transfer stock,
    not in any other scenario, like changing status from accept to driving or anything else other than stocks.
    Thanks,
    Star
    Tags edited by: Michael Appleby

  • Hi, for the last two days my iPhone has been very slow to open apps and very slow when I check the notification window; it takes too much time to open when I swipe down. Help me resolve the issue.

    Hi, for the last two days my iPhone (iPhone 4 with iOS 5) has been very slow to open apps and very slow when I check the notification window; it takes too much time to open when I swipe down. Help me resolve the issue.

    The Basic Troubleshooting Steps are:
    Restart... Reset... Restore...
    iPhone Reset
    http://support.apple.com/kb/ht1430
    Try this First... You will Not Lose Any Data...
    Turn the Phone Off...
    Press and Hold the Sleep/Wake Button and the Home Button at the Same Time...
    Wait for the Apple logo to Appear and then Disappear...
    Usually takes about 15 - 20 Seconds... ( But can take Longer...)
    Release the Buttons...
    Turn the Phone On...
    If that does not help... See Here:
    Backing up, Updating and Restoring
    http://support.apple.com/kb/HT1414

  • Taking too much time in Rules (DTP Schedule run)

    Hi,
    I am scheduling a DTP which has filters to minimize the load data.
    When I run the DTP it is taking too much time in the "rules" step (I can see the DTP monitor status package by package and step by step, like "Start Routine", "Rules" and "End Routine").
    It is consuming too much time in the rules mapping.
    What is the problem, and are there any solutions please?
    regards,
    sree

    Hi,
    The time taken at "rules" depends on the complexity involved in your routine. If it is a complex calculation, it will take time.
    Also check your DTP batch settings, i.e. how many background processes are used to perform the DTP, and the job class.
    You can find these as follows:
    go to the DTP, open the Goto menu and select "Settings for Batch Manager".
    In that screen, increase the number of processes from 3 to a higher number (max 9).
    Change the job class to 'A'.
    If your DTP is still running, cancel it (i.e. kill the DTP), delete from the cube,
    change these settings and run your DTP one more time.
    You can observe the difference.
    Reddy

  • I'm having an issue with my iPhone 4 while playing music; it's taking too much time to play

    I am using an iPhone which is taking too much time to play music, and sometimes it shows one album's cover while playing another song. Please help and let me know what the issue is.

    Hello Sanjay,
    I would recommend steps 1, 3, and 5 from our iPhone Troubleshooting Assistant found here: http://www.apple.com/support/iphone/assistant/phone/#section_1
    Here is step 1 to get you started.
    Restart iPhone
    To restart iPhone, first turn iPhone off by pressing and holding the Sleep/Wake button until a red slider appears. Slide your finger across the slider and iPhone will turn off after a few moments.
    Next, turn iPhone on by pressing and holding the Sleep/Wake button until the Apple logo appears.
    Is iPhone not responding? To reset iPhone, press and hold the Sleep/Wake button and the Home button at the same time for at least 10 seconds, until the Apple logo appears.
    If your device does not turn on or displays a red battery icon, try recharging next.
    Take care,
    Sterling

  • R3load taking too much time when table REPOSRC is loaded

    Hello,
    I am installing SAP ECC 6.0 SR2 on Sun Solaris 10 with DB2 V9.1. 17 of the 19 jobs have been completed in the ABAP Import phase, but the SAPSSEXC package is taking too much time. It has been running for around 10 hours with no error, so I cancelled the SAP installation. After that I started it through a manual OS command:
    /sapmnt/<SID>/exe/R3load -dbcodepage 4102 -i /<instdir>/SAPSSEXC.cmd -l /<instdir>/SAPSSEXC.log -stop_on_error -merge_bck
    This has also been running for around 9 hours. I do not know why this is happening or when it will be completed.
    Can you tell me what to check to make this job faster, or help me resolve this issue?
    I have checked SAP Notes 454368 and 455195.
    If I change any DB2 parameter, I have to restart the DB2 database. What should I do? I cannot understand what to do now.
    Please help me ASAP.
    Thanks
    Gautam Poddar

    Hello,
    running the R3load import step manually, you might try adding the option
    -loadprocedure fast LOAD
    Pay attention to write LOAD in capital letters!
    This will invoke the LOAD API whenever possible.
    It should save some time on "normal" tables without LOB columns.
    For your table REPOSRC, which has a BLOB column, the LOAD API will not be used.
    So I am sorry, this will not work for your special case.
    (Thanks for the hint to Frank-Martin Haas)
    Kind regards
    Waldemar Gaida
    Edited by: Waldemar Gaida on Jan 10, 2008 8:26 AM

  • Import taking too much time

    Hi all
    I'm quite new to database administration. My problem is that I'm trying to import a dump file, but one of the tables is taking too much time to import.
    Description:
    1. The export was taken from a source database in Oracle 8i with character set WE8ISO8859P1.
    2. I am importing into 10g with character set UTF8; the national character set is the same.
    3. The dump file is about 1.5 GB.
    4. I got an error like "value too large for column", so in the target DB, which is in UTF8, I converted all columns from VARCHAR2 to CHAR.
    5. While importing, some tables import very fast, but one particular table gets very slow.
    Please help me. Thanks in advance.

    Hello,
    "4. I got an error like 'value too large for column', so in the target DB, which is in UTF8, I converted all columns from VARCHAR2 to CHAR."
    "5. While importing, some tables import very fast, but one particular table gets very slow."
    For point 4, this is typically due to the character set conversion. You export data in WE8ISO8859P1 and import into UTF8. In WE8ISO8859P1 characters are encoded in 1 byte, so 1 CHAR = 1 BYTE. In UTF8 (Unicode) characters are encoded in up to 4 bytes, so 1 CHAR > 1 BYTE.
    For this reason you'll have to modify the length of your CHAR or VARCHAR2 columns, or add the CHAR option (by default it's BYTE) in the column datatype definition of the tables, for instance VARCHAR2(100 CHAR). The NLS_LENGTH_SEMANTICS parameter may also be used, but it's not handled very well by export/import.
    So, I suggest this:
    1. Set NLS_LENGTH_SEMANTICS=CHAR on your target database and restart the database.
    2. Create all your tables (empty) from a script on the target database (without the indexes and constraints).
    3. Import the data into the tables.
    4. Import the indexes and constraints.
    You'll find more information in the following note on MOS:
    Examples and limits of BYTE and CHAR semantics usage (NLS_LENGTH_SEMANTICS) [ID 144808.1]
    For point 5, it may be due to the conversion problem you are experiencing, or it may be due to some special datatype like LONG.
    Also, a question: why did you choose UTF8 on your target database and not AL32UTF8? AL32UTF8 is recommended for Unicode use.
    Hope this helps.
    Best regards,
    Jean-Valentin

  • Still the report is taking too much time

    Hi All,
    When I refresh a Webi report, the report takes too much time to refresh (open).
    In the back end I have checked all the connections, contexts, cardinalities, joins, conditions, etc., and in Webi I have enabled the 'query stripping' check box,
    but the report is still taking too much time, and I haven't identified the problem.
    Please help me with this.
    Thanks in advance..

    Hi Mark,
    How many queries are there? -- 2
    How many rows are returned? -- 2000+
    Are all measures defined with aggregates? -- Yes
    What is the array fetch size? (Increase it to 1000 if it isn't already.)

  • Session bean takes too much time with multiple clients

    Hi,
    We have a problem in our system.
    We have a session bean on WebLogic 6.0 which internally accesses a DAO
    to get data from an Oracle database. The session bean does a bit of
    heavy processing with the data and then returns an ArrayList of the
    required objects.
    With a single client running on the system, the whole thing takes about
    3 seconds, but if three clients access the same functionality, the total
    time taken rises to a big 50 seconds for all the clients.
    We do not have a problem with the connection pool; all connections
    are closed properly.
    Has anyone faced the same problem?
    What could be the reason for this enormous increase in time?
    Thanks
    Amol Godbole

    Hi,
    I narrowed down the problem to the ResultSet looping taking too much time!
    Any ideas why this might be happening?
    Thanks
    Amol Godbole
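
    One thing worth checking, illustrated with a generic JDBC sketch in modern Java syntax (this is not code from the original system; the DAO name and query are made up for the example): Oracle's JDBC driver fetches only 10 rows per network round trip by default, so looping over a large ResultSet can spend most of its time waiting on the network, and with several concurrent clients those round trips compound. Raising the fetch size on the statement often shrinks the loop time dramatically:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.List;
    import javax.sql.DataSource;

    public class OrderDao {                       // hypothetical DAO; names are illustrative only

        private final DataSource dataSource;

        public OrderDao(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        public List<String> loadNames() throws Exception {
            List<String> result = new ArrayList<String>();
            try (Connection con = dataSource.getConnection();
                 PreparedStatement ps = con.prepareStatement("SELECT name FROM orders")) {
                // Default Oracle fetch size is 10; fetch more rows per round trip so the
                // ResultSet loop is not dominated by network latency under load.
                ps.setFetchSize(500);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        result.add(rs.getString(1));
                    }
                }
            }
            return result;
        }
    }

    If raising the fetch size doesn't change the picture, the next step would be profiling whether the time goes into the driver's next() calls or into the per-row object construction done by the bean; the jump from 3 s to 50 s under three clients also suggests checking for synchronized sections or database locks that serialize the concurrent requests.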

  • BAPI Function taking too much time

    Hi,
    One BAPI function, BAPI_INCOMINGINVOICE_CREATE,
    is taking too much time to process the data. Do we have any alternative to reduce the time taken by the BAPI function,
    such as applying an SAP Note, using another function, or any other way?
    Thanks and regards,
    Shailendra

    Hi,
    Please apply OSS Note 830717. I applied this note, ran transaction code FBZ0, and the "withholding tax code" changed from XX to 03.
    Reward if helpful.

  • Reporting time taking too much time

    Hi Experts,
    I created a query on one of my data targets and executed it; it takes too much time. Can anyone tell me how I can resolve this issue? Please help me out.
    Thanks in advance
    David

    Check whether the InfoCube data is compressed or not; compression will increase the performance of the query.
    Check the statistics info using ST03 and analyse what needs to be done, e.g. creating aggregates.
    Use ST03N -> BW System Load values to recognize the problem. Use the numbers given in the table 'Reporting - InfoCubes: Share of total time (s)' to check whether one of the columns %OLAP, %DB, %Frontend shows a high number for all InfoCubes.
    You need to run ST03N in expert mode to get these values.
    Based on the analysis and the values taken from the above, check whether an aggregate is suitable, whether OLAP settings should be changed, etc.
    Check the read mode for the query; the recommended mode is H.
    If the query is built on top of a MultiProvider, then:
    - Include the characteristic 0INFOPROV when you design the query for the MultiProvider so that you can filter by InfoProvider.
    - Do not use more than 10 InfoProviders when you define a MultiProvider. If a number of InfoProviders are being read by a query at the same time, it can take a while for the system to calculate the results.
