Max-shm-memory problem - out of memory, no space left on device

Hi Everyone,
First time post.  I'm a UNIX SA trying to troubleshoot the following problem on Solaris 10:
SQL> startup pfile=inittest1.ora
ORA-27102: out of memory
Solaris-AMD64 Error: 28: No space left on device
SQL>
/u01/app/oracle/admin/dd00lod1/udump/dd00lod1_ora_25782.trc
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORACLE_HOME = /u01/app/oracle/product/10.2.0.4
System name:    SunOS
Node name:      reuxeuux1089
Release:        5.10
Version:        Generic_147441-10
Machine:        i86pc
Instance name: dd00lod1
Redo thread mounted by this instance: 0 <none>
Oracle process number: 0
Unix process pid: 25782, image: oracle@reuxeuux1089
skgm warning: ENOSPC creating segment of size 0000000005800000
fix shm parameters in /etc/system or equivalent
We have tried modifying the max-shm-memory setting, but no joy!  Please assist if you can.
Thanks
Amreek
prctl -n project.max-shm-memory -i project 100
project: 100: ORACLE
NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
project.max-shm-memory
        privileged       124GB      -   deny                                 -
        system          16.0EB    max   deny
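
For reference, the ENOSPC segment size from the skgm warning decodes to only about 88 MiB, which is far below the 124GB limit shown above. One common cause of that mismatch is that the oracle user's processes are not actually running under the ORACLE project when the instance starts, so a different (smaller) project limit applies. A minimal sketch of the checks (the project name ORACLE is taken from the prctl output above; the Solaris-only commands are shown commented out because they change or query live system state):

```shell
# Decode the segment size from the skgm warning in the trace file
printf '%d\n' 0x0000000005800000        # bytes
echo $(( 0x5800000 / 1024 / 1024 ))     # MiB

# Solaris-only commands (not run here):
#   su - oracle -c 'id -p'    # which project does the oracle user actually land in?
#   projmod -sK "project.max-shm-memory=(priv,8G,deny)" ORACLE
```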

Consider reading the fine manual: the Installation Guide.

Similar Messages

  • Shminfo_shmmax in /etc/system does not match  project.max-shm-memory

    If I set 'shminfo_shmmax' in /etc/system and leave /etc/project at the system default (no changes made), the resulting 'project.max-shm-memory' is roughly 100 times larger than 'shminfo_shmmax'.
    #more /etc/system // (16MB)
    set shmsys:shminfo_shmmax=16000000
    #prctl -n "project.max-shm-memory" -i project user.root
    => will display the following:
    project: 1: user.root
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged      1.49GB      -   deny                                 -
            system          16.0EB    max   deny                                 -
    1.49GB is roughly 100 times larger than 'SHMMAX'. If I add more entries to /etc/system like below, max-shm-memory becomes even larger.
    #more /etc/system
    set shmsys:shminfo_shmmax=16000000
    set semsys:seminfo_semmni=2000
    set shmsys:shminfo_shmmni=2000
    set msgsys:msginfo_msgmni=2048
    After I reboot with the above /etc/system and an unchanged /etc/project (all defaults, no values added):
    # prctl -n "project.max-shm-memory" -i project user.root
    project: 1: user.root
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged      29.8GB      -   deny                                 -
            system          16.0EB    max   deny                                 -
    Can anyone shed light on how to configure SHMMAX in /etc/system correctly?

    We saw similar behavior and opened a case with Sun.
    The problem turns out to be that the mapping from the (deprecated) /etc/system tunables to the (new) project resource limits isn't always one-to-one.
    For example, project.max-shm-memory gets set to shmsys:shminfo_shmmax * shmsys:shminfo_shmmni.
    The logic here is that under the /etc/system tunings you might have wanted the maximum number of segments at the maximum size, so the system has to be able to handle that. Makes sense to some degree. I think Sun updated one of their info docs on the process at the end of our case to make this clearer.
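
    The multiplication described above can be checked directly. Assuming the Solaris default of shminfo_shmmni=100, the first example's 16MB shmmax works out to exactly the 1.49GB that prctl reported:

```shell
shmmax=16000000    # set shmsys:shminfo_shmmax from /etc/system
shmmni=100         # default number of shm identifiers (assumed)
bytes=$(( shmmax * shmmni ))
echo "$bytes"                                                   # 1600000000
awk -v b="$bytes" 'BEGIN { printf "%.2f GiB\n", b/(1024^3) }'   # 1.49 GiB
```

    With the second /etc/system above (shminfo_shmmni=2000), the same formula gives 16000000 * 2000 = 32000000000 bytes, which is the 29.8GB in the second prctl output.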

  • Impact of project.max-shm-memory configuration in Solaris 10

    Dear All,
    I'm not sure if this is an error or was purposely configured this way.
    The current kernel configuration of project.max-shm-memory is 400GB, while the hardware only has 8 GB of RAM, and SGA_MAX is set to 5GB (the Oracle database is 10g).
    Will there be any impact in long run with this configuration based on your experiences?
    project: 1: user.root
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged       400GB      -   deny                                 -
            system          16.0EB    max   deny                                 -
    Suggestions and advice are much appreciated.
    Thanks and Best Regards,
    Eric Purwoko

    Hi Helios,
    Thanks! The recommendation is 4294967295, but my SGA max and target are 5 GB. Will it cause a problem if I set project.max-shm-memory lower than the SGA?
    Thanks for the link too. I guess I had better put those settings in /etc/system too.
    But now I'm wondering what the best value is, given my SGA max configuration.
    Best Regards,
    Eric
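
    To answer the question in passing: if project.max-shm-memory is lower than SGA_MAX_SIZE, the SGA's shared memory allocation fails and you get the same ORA-27102 shown at the top of this page, so the limit needs to be at least the SGA size. A back-of-the-envelope sketch for a 5GB SGA (the projmod line is illustrative only, and the project name user.oracle is an assumption, not from this thread):

```shell
sga_gb=5
bytes=$(( sga_gb * 1024 * 1024 * 1024 ))
echo "$bytes"    # 5368709120
# Persist it for the oracle user's project (Solaris, not run here):
#   projmod -sK "project.max-shm-memory=(priv,${bytes},deny)" user.oracle
```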

  • How to set kernel parameter max-shm-memory automatically at startup

    Hi,
    We have an 11.1.0.7 database on a 64-bit Solaris 10 SPARC server. Our max-shm-memory settings are below:
    -bash-3.00# prctl -n project.max-shm-memory -i project default
    project: 3: default
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged      50.0GB      -   deny                                 -
            system          16.0EB    max   deny                                 -
    -bash-3.00# prctl -n project.max-shm-memory -i project system
    project: 0: system
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged      50.0GB      -   deny                                 -
            system          16.0EB    max   deny                                 -
    Whenever we restart the db, the second one is lost:
    bash-3.00$ prctl -n project.max-shm-memory -i project default
    project: 3: default
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged      50.0GB      -   deny                                 -
            system          16.0EB    max   deny                                 -
    bash-3.00$ prctl -n project.max-shm-memory -i project system
    prctl: system: No controllable process found in task, project, or zone.
    So our sysadmin has to configure them again whenever we restart our db. How can I make this happen automatically at startup, without configuring it again from the command prompt?
    Thanks,
    Hatice

    OK, it is clear now. I have one more question. When I check the system project I get the error below:
    # prctl -n project.max-shm-memory -i project system
    prctl: system: No controllable process found in task, project, or zone.
    The document says the reason for the message reported above is that there are no active processes belonging to the project.
    But that seems impossible, because according to our project settings it's for the root user:
    bash-3.00$ cat /etc/project
    system:0::root::project.max-shm-memory=(priv,53687091200,deny)
    user.root:1::::
    noproject:2::::
    default:3::oracle::project.max-shm-memory=(priv,53687091200,deny)
    group.staff:10::::
    oracle:100::::project.max-shm-memory=(priv,53687091200,deny)
    Is it because I'm checking as the oracle user and don't have sufficient privileges, or is there something wrong with it?
    Thanks.
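
    A quick cross-check that the /etc/project entries above match what prctl reported, plus a way to see which project a process actually runs in. The "No controllable process found" message is consistent with nothing currently running under project 'system'; /etc/project only assigns root a default project, it doesn't guarantee any process is in it. (Sketch; the Solaris-only command is commented out.)

```shell
# 53687091200 from /etc/project, expressed in GiB (matches the 50.0GB shown by prctl):
echo $(( 53687091200 / 1024 / 1024 / 1024 ))   # 50
# Which project is a given process in? (Solaris, not run here):
#   ps -o project= -p $$
```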

  • Max-shm-memory - definition

    Hi Guys,
    I'm trying to get a clear definition of project.max-shm-memory. Let's say it's set to 4GB for user.oracle. Does this mean that the maximum amount of shared memory available to the project is 4GB, or that the maximum shared memory segment size created by any process in the project can be 4GB (i.e. 2 processes could create 2 separate 4GB segments)? I'm pretty sure it's the former, but I wanted to check.
    Thanks,
    Tony

    Even though Sun says many of the kernel tunables in /etc/system are now obsolete, some, like shmmax, will actually still work if set within the global zone. The default is 1/4 of system memory.

  • Prctl -n project.max-shm-memory -i process $$

    Hi all,
    when I execute the following command: prctl -n project.max-shm-memory -i process $$
    its output is:
    NAME    PRIVILEGE       VALUE    FLAG   ACTION                       RECIPIENT
    project.max-shm-memory
            privileged      5.85TB      -   deny                                 -
    5.85TB, while RAM is only 32GB.
    How can I change it for the oracle user?

    What does your /etc/project file say?
    Mine is (showing oracle user):
    oracle:100::oracle::process.max-sem-nsems=(priv,300,deny);project.max-sem-ids=(priv,100,deny);project.max-shm-ids=(priv,512,deny);project.max-shm-memory=(priv,8589934592,deny)
    That is 8 GB RAM allowed for oracle use (max-shm-memory).
    Change it using
    projmod -sK "project.max-shm-memory=(priv,8G,deny)" oracle
    Jan
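
    As a cross-check of the numbers in the reply above, the raw-byte value in that /etc/project line and the 8G shorthand in the projmod command are the same limit:

```shell
echo $(( 8589934592 / 1024 / 1024 / 1024 ))   # 8 (GiB)
```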

  • Max-shm-memory (Solaris 11.1)

    I wonder if someone can give a definitive answer on this, or point me to somewhere that does?
    If I have a 64GB RAM server running Solaris 11.1 that will run a single Oracle instance, what is the correct setting to make for max-shm-memory?
    Should it be 64GB, or something a bit smaller? Or something a lot smaller?
    I have read the installation documentation, but it gives examples of 2GB, 4GB and so on. They don't seem relevant to a 64GB+ server.

    Thank you, but that document doesn't answer my questions.
    Specifically, it states at one point that "project.max-shm-memory=(privileged,51539607552,deny); ... sets a limit of 48GB per shared memory segment" (which is true, as far as I understand it). But it then goes on to say in the next breath that "The project.max-shm-memory limit is the __total__ shared memory size for your project. -- ie maximum total of all your segments.". Which, to my mind, contradicts its first statement.
    The article then also goes on to give an example where "This system has 8GB of memory..." and shows the author setting "projmod -s -K "project.max-shm-memory=(privileged,4GB,deny)" 'user.oracle'"... so are we to deduce that you should set max-shm-memory to 50% of your physically-available RAM? Or not??
    I had actually read this before I posted. It's the same sort of document I see over and over on this subject: lots of examples, some contradictory, but none stating what principles should govern the setting of max-shm-memory, and none stating what the consequences of (for example) allocating 100% of physically available RAM as max-shm-memory would be.
    So thank you for the reference, but my question still stands: can someone provide a definitive answer on what the setting here should be? It's a 64GB server and will run nothing but Oracle database, with a single instance. max-shm-memory should be set to.... what, exactly?
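
    For what it's worth, the other threads collected here suggest the limit is a cap rather than an allocation (a 400GB limit on an 8GB box did no apparent harm), so one defensible, if unofficial, approach is to set it to total RAM and let SGA sizing be the real control. A sketch for the 64GB box (an assumption, not official guidance; the projmod line and the user.oracle project name are illustrative and not run here):

```shell
ram_gb=64
bytes=$(( ram_gb * 1024 * 1024 * 1024 ))
echo "$bytes"    # 68719476736
#   projmod -sK "project.max-shm-memory=(priv,${bytes},deny)" user.oracle
```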

  • 648 MAX (MS6585) : memory problem

    Hello,  
    On my MSI 648MAX mainboard,  
    I am trying to install three memory modules (PC2700 512MB DDR/CL2.5 TwinMos).
    With two modules (1GB RAM), Windows 2000 Server doesn't crash.
    With the third (1.5GB RAM), Windows 2000 Server always crashes.
    I tried the modules one by one and I don't have any problems.
    But with all three modules, Windows crashes unpredictably.
    What can I do ?
    Config :
    Pentium 4 2.4GHZ (on MSI 648MAX (MS-6585))
    1.5 GB RAM (PC2700 512MB DDR/CL2.5 TwinMos)
    2 Hard disk : IBM 40Gb Ultra ATA 100
    ATI Radeon 7000 32Mb
    Network card : 3com
    Win2000 Server SP3
    All drivers are updated
    Thanks
    Greg

    Greg, did you ever resolve this?  I have a similar problem
    The machine has been running with a 512 MB/333 MHz module in slot 1 for 2 years with no problems.  I want to increase the memory size.  I tried adding all sorts of combinations of other modules, but it won't boot.  I went out and purchased a compatible Kingston 1 GB/333 MHz CAS 2.5 module, compatible according to the Kingston site.  The memory counts to 1 GB, and everything in POST seems to work; I get a beep, and then nothing.  Trying different slots gives the same problem.  The 512 MB module works in all slots, but not the 1 GB.
    What is the fix for this MSI memory upgrade issue?

  • Memory problem, but there is enough space - virus? pls help!

    Since yesterday I am unable to download any file; I always get a message saying that there is not enough free disk space, but I still have 3GB of space left. I can't save any TextEdit document, it takes forever to reboot, and when I connect my USB external hard drive it is not recognized. Moreover, Skype refuses to start. The internet has also gotten slower, and when I have more windows open the task manager shows up and asks me to close some applications. The strange thing is that I refused to update last week when the message popped up, and when I opened the updater today it said there was nothing to update.
    What is happening? Is it a RAM problem, or have I got a virus on board? I really need to work on my computer, and it seems I can't even back up now. I am very grateful for any suggestions.

    Your Mac needs adequate hard drive space to operate normally. How full can a drive be before it's too full? There is no hard and fast rule that says “X” amount or “%” of free drive space is needed. A low amount of RAM requires more drive space for Virtual Memory’s swap files. As a general rule, your available space should be 5GB as an absolute minimum as it generally requires that much free space to perform an Archive and Install of Mac OS X and still preserve some free space for VM swap files. How much RAM do you have? With only 3GB of free space, you may have corrupted some of the files on your hard drive.
    Problems from insufficient RAM and free hard disk space are discussed in this link
    http://www.thexlab.com/faqs/lackofram.html
    Look at these links.
    Freeing space on your Mac OS X startup disk
    http://www.thexlab.com/faqs/freeingspace.html
    Amazing Disappearing Drive Space
    http://www.pinkmutant.com/articles/TigerMisc.html
    Increase HD Free Space by using Monolingual
    http://monolingual.sourceforge.net/
    How to free up my disk space
    http://www.macmaps.com/diskfull.html
    Where Did My Disk Space Go?
    http://www.macfixitforums.com/ubbthreads.php/ubb/showflat/Number/770243/site_id/1
     Cheers, Tom

  • Help with 655 Max and Possible Memory Problems...

    Just received my 655 Max and have put my system together, but when I turn it on it makes a constant beeping sound. I tried taking out both memory sticks and turning it on, and the sound stops; I tried putting in one memory stick and the sound starts again.
    I suppose this is a memory problem.
    By the way, I am using Corsair 256MB CMX256a-3200C2 memory.

    The alarm that you are getting is the CPU fan alarm; you will have to disable it in the BIOS. I am using a Vantec Aeroflow heatsink and my CPU alarm was also constantly on and very annoying till I disabled it. My system still doesn't boot to XP yet, as I have RMA'd both the MB and the RAM.
    Originally posted by TritonB7:
    "Hey Kaiguy, the comp does continue to make sounds till I shut the system off. And I'm not sure what a standard PC tone would sound like :(. But it does make a beep followed by a lower-pitched beep, and it repeats this till I shut down the system.
    (1) I cannot access the BIOS.
    (2) I have two sticks of Corsair 256MB CMX256a-3200C2 memory. I do not have any other RAM available that is compatible with this motherboard. I've tried using one stick at a time in each of the slots. No luck.
    I find it very unlikely that both sticks are bad.
    (3) And resetting the BIOS with the jumper isn't working either.
    I can probably try buying some different memory, or returning the whole mobo and exchanging it for the same one.
    By the way, when I turn the comp on, my monitor says "8x Extreme" or something close to it while the comp is beeping. I doubt that has anything to do with it. This is my first MSI board :( , I've always been an Asus person, but I'm still open to MSI; I'm sure I just received a faulty board."

  • Out of memory problem using the API

    Hi all,
    I need your assistance. We are working with CDB 10.2, making searches and retrieving documents with all their attributes.
    In our actual scenario we have a single user (which represents an application) accessing CDB. This user uses several persistent sessions simultaneously. I mean, several thousand end users connect to an application, which uses one CDB user to connect to CDB with several persistent sessions.
    To simulate this scenario we wrote Java code that opens five threads and makes several searches (requesting all the attributes) using the same user on CDB.
    Retrieving a considerable amount of data found by the search (~5000 documents), we hit an "Out of memory" problem when we ran these tests:
    - 5 threads obtaining 100 documents (and all their attributes) per search
    - 1 thread obtaining 500 documents (and all their attributes) per search
    - We also have the same problem if we make several searches with fewer results
    We suppose it's a configuration or code issue, so we ask for your assistance and experience to solve it.
    Thanks for your help,
    Dani
    import java.sql.Connection;
    import oracle.ifs.examples.api.constants.AttributeRequests;
    import oracle.ifs.examples.api.util.CommonUtils;
    import oracle.ifs.fdk.Attributes;
    import oracle.ifs.fdk.ClientUtils;
    import oracle.ifs.fdk.FdkConstants;
    import oracle.ifs.fdk.FdkCredential;
    import oracle.ifs.fdk.ManagersFactory;
    import oracle.ifs.fdk.NamedValue;
    import oracle.ifs.fdk.Options;
    import oracle.ifs.fdk.SearchExpression;
    import oracle.ifs.fdk.SearchManager;
    import oracle.ifs.fdk.SimpleFdkCredential;
    import oracle.jdbc.pool.OracleDataSource;
    public class Prueba {
        public static void main(String[] args) {
            // Five threads sharing one CDB user, as described above
            Thread thread = new BasicThread1();
            Thread thread1 = new BasicThread1();
            Thread thread2 = new BasicThread1();
            Thread thread3 = new BasicThread1();
            Thread thread4 = new BasicThread1();
            thread.start();
            thread1.start();
            thread2.start();
            thread3.start();
            thread4.start();
        }
    }

    class BasicThread1 extends Thread {
        public void run() {
            ManagersFactory session = null;
            try {
                System.out.println(this.getName() + "-->init");
                session = getSession();
                SearchManager sManager = session.getSearchManager();
                SearchExpression srchExpr = new SearchExpression(Attributes.SIZE,
                        new Integer(20000000), FdkConstants.OPERATOR_LESS_THAN);
                NamedValue[] res = null;
                for (int i = 0; i < 100000; i++) {
                    res = sManager.search(srchExpr, basicSearchOptions2,
                            AttributeRequests.DOCUMENT_CATEGORY_ATTRIBUTES);
                    System.out.println(this.getName() + " --> finished without error: " + res.length);
                }
            } catch (Throwable t) {
                t.printStackTrace();
                System.out.println("<--" + this.getName());
            } finally {
                CommonUtils.bestEffortLogout(session);
            }
        }

        static NamedValue[] basicSearchOptions2 = new NamedValue[] {
                ClientUtils.newNamedValue(Options.MULTILEVEL_FOLDER_RESTRICTION, Boolean.TRUE),
                ClientUtils.newNamedValue(Options.SEARCH_FOR_DOCUMENTS, Boolean.TRUE),
                ClientUtils.newNamedValue(Options.SEARCH_FOR_FOLDERS, Boolean.FALSE),
                ClientUtils.newNamedValue(Options.RETURN_COUNT, new Integer(500)) // max number of results
        };

        private static ManagersFactory getSession() throws Exception {
            // Note: this JDBC Connection is never closed, so each thread leaks a
            // connection in addition to holding its FDK session open
            OracleDataSource ods = new OracleDataSource();
            ods.setURL("URL");
            Connection conn = ods.getConnection();
            FdkCredential credential = new SimpleFdkCredential("USER", "PSW");
            ManagersFactory session = ManagersFactory.login(credential, "SERVER");
            return session;
        }
    }


  • Flex 4 RichEditableText out of memory problems

    Hello, we're conducting performance testing on the UI of a productivity application we're developing using Adobe AIR. We are using Flash Builder 4 public beta. One part of the performance test is updating the textFlow property of a RichEditableText control every 5 seconds with random data that contains several paragraphs and several images (to mimic a typical news article). We used TextFlowUtil.importFromString(str) to convert the raw data to a textFlow object.
    We found that the above test with the RichEditableText control would quickly crash the AIR runtime with presumably out-of-memory problems. If we switch to using the (read-only) RichText control, the problem went away. The only material difference between the two controls we could find in the documentation is that the RichEditableText control supports unlimited undo/redos as long as it retains focus. Could this be the source of the memory problem? Regardless, can its behavior be modified to avoid the out-of-memory problems?
    The use case we are trying to simulate is having the productivity app open for weeks at a time, with constant editing and going back and forth between different workflows.
    Thanks for your help, and please let me know if you need additional information

    Try a more recent build.
    Alex Harui
    Flex SDK Developer
    Adobe Systems Inc.
    Blog: http://blogs.adobe.com/aharui

  • MS-6380 Memory Problem

    I have an MS-6380E with the latest BIOS and chipset drivers: an Athlon 1.5 GHz processor, a GeForce4 Ti series card with 64MB RAM and the latest drivers, and an SBLive Platinum 5.1 card, also with the latest drivers. Windows XP Pro is the operating system. The system has 2 PC2100 256MB sticks of DDR RAM on the board.
    This past weekend I tried to install a third stick, a PC2100 512MB module, to bring the system up to a gig of RAM. The system would reboot immediately upon logging into Windows. I took the memory back and exchanged it for another, and installed it with similar results. I tried all three slots with the new memory and the two existing sticks. No, I didn't try the new memory by itself.
    I finally got it to appear stable for a little bit, but I also play Dark Age of Camelot, and the latest symptom was that the system would kick me back to the desktop as soon as I got to the initial "connecting" screen. I also had other odd behaviour with the additional RAM installed, like my modem taking longer to validate my connection. When I went back to the pre-existing configuration, 512MB total, everything stabilized.
    The DDR brand was Centron; I know it's not the same manufacturer as my other two sticks, but I don't recall the brand on those. Also, just in case you need to know, Dark Age of Camelot is an MMORPG; I checked their tech section and found nothing on this problem and their game. I've since returned the memory and have resigned myself to maxing out at 512MB, though the system specs say I can go to 3GB. I really like this board but am frustrated by the memory problem.

    http://www.msi.com.tw/program/service/forum/search.php?boardid=
    6380 covers a hell of a lot of boards
    Getting the same make and model of RAM is generally advisable, too.

  • Powerbook G4 freezes on me - memory problems..?

    Hi all,
    I have just bought a new PowerBook G4 15-inch, as I read all sorts of articles about the MacBook Pro's teething problems with Adobe etc., so I decided to go for the tried and trusted good old PowerBook.
    I got the retailer to boost it from 512MB to 2GB in the shop, and after giving it a good workout (installing Illustrator, Photoshop, Microsoft Office, etc.) and playing with iTunes, I find that it freezes on me every 15 minutes or so! I'm told it could be a memory problem (maybe the two new 1GB modules are not compatible). Or should I reinstall Tiger?
    Any advice is welcome.. help..!
    G4 powerbook 15inch   Mac OS X (10.4.2)   1.67GHZ 2gigabytes 100GB

    Hello, Shane,
    No, I've not had trouble with freezes on my powerbook. I did have a little trouble clicking in the RAM on the top slot properly, both on my PB and on a friend's, but in these cases, the PB beeped three times on startup (RAM "distress signal").
    It has been my understanding that system freezes are often caused by faulty hardware--RAM, a peripheral device that is misbehaving, or a more serious problem such as logic board.
    Running the hardware test in extended mode would help with this.
    Some people have reported that they have solved system freezes when all the hardware checks out by doing an archive/installation of their operating system, or erasing the drive and re-installing from scratch.
    Good luck.
    Maxit
    I also would be remiss in not pointing out that since your Powerbook is new, Apple is available to help you troubleshoot, and in case there is a serious hardware issue, making an early report is advisable.

  • Memory Problems with Adobe PDF iFilter for 64-bit

    In preparation to rebuild my Windows Search index, I installed the Adobe PDF iFilter for 64-bit on my system (Vista Business 64).  When I finally rebuilt the index, I wasn't too surprised by what I saw happen: the SearchFilter.exe process would kick in whenever I wasn't using the system and just eat RAM.  One time I turned it on and it had allocated over 4,000 MB (and my system only has 4,030 MB available), so of course it was forcing all the other processes to hard fault (i.e. everything was moving like molasses; for example, it took 20 minutes to put the machine to sleep).  But I just let it do its work, figuring that perhaps this was to be expected relative to the small library of PDFs that I've accumulated on my computer, ranging from LaTeX-generated text files to containers for hi-res scans.  So, after a day and a half of basically not using my laptop, everything finally calmed down and I enjoyed the benefits of searching the content of my library from the Windows Start menu, for a short while.
    However, to my dismay, this freezing of my computer now occurs every time I download a new PDF (in this particular case, Google Books scans) and then leave the computer to idle.  Again, SearchFilter.exe would allocate all of my RAM for itself and push everything else into virtual memory, which means the slowest possible fetching you can get.  I had to uninstall, as this was making my computer unusable for 15-30 minutes after each idle.  Everything is back in working order without the iFilter, but I would like to know if anyone has reported such problems on x64 systems.  Obviously, I will also report the problem to Microsoft, since the search engine should certainly have the precaution to handle such memory problems.  However, it is a problem created by the Adobe PDF iFilter interacting with the Windows Search engine.

    Hello,
    We believe we have figured this out.  It looks like it has to do with the length of the default folder location for the Adobe iFilter.
    I was able to reproduce the issue and the following resolved it for me.  See if this resolves it for you all as well.
    Here is how to get the Adobe version 11 PDF iFilter to work.
    1. If you haven't already, run the following in SQL Server:
       sp_fulltext_service 'load_os_resources', 1
       go
       -- You might also need to run:
       -- sp_fulltext_service 'verify_signature', 0
       -- This is used to validate trusted iFilters; 0 disables the check, so use with caution.
    2. Stop SQL Server.  (Make sure FDHost.exe stops.)
    3. Uninstall the Adobe iFilter (because it defaulted to a path with spaces, or the folder name is too long).
    4. Reinstall the Adobe iFilter, and when it prompts for where to install it, change it to: C:\Program Files\Adobe\PDFiFilter
    5. Once the installation finishes, go to the computer's environment variables and add the following to the PATH:
       C:\Program Files\Adobe\PDFiFilter\BIN
       NOTE: it must include the BIN folder.
       NOTE: if you had the old location that included spaces, remove it from the PATH environment variable.
    6. Start SQL Server.
    7. If you had an existing full-text index on PDFs, drop the full-text index and recreate it.
    8. You should now get results when you run sys.dm_fts_index_keywords('db','tblname')  -- change db to the actual database name and tblname to the actual table name.
     Give this a try and see if this fixes yours. 
    Sincerely,
    Rob Beene, MSFT
