Best Practice - Analyze table with very large partitions

We have a table that contains 100 partitions with about 20 million rows in each. Right now the analyze is taking about 1 hour per partition. The table is used for reporting and will have a nightly load of the previous day's data.
What would be the best way to analyze this table, besides using a low value for ESTIMATE and a low GRANULARITY?
Thank You.

Are you suggesting that the table is so big, it's not feasible to analyze anymore? I'm suggesting it's not necessary. I think it's highly unlikely that a nightly load is going to change the stats in any meaningful way, unless you are loading millions of rows. The law of diminishing returns has kicked in.
Remember, the standard advice from Oracle is to gather statistics once and then only bother refreshing those stats when we need to. From Metalink note #44961.1:
"Given the 'best' plan is unlikely to change, frequent gathering statistics has no benefit. It does incur costs though."
What you might find useful is to export the stats from your last run before you do the new run (you should do this anyway). Then after the next stats refresh import both sets of stats into dummy schemas and compare them. If the difference is significant then you ought to keep analysing (especially if yours is a DSS or warehousing database). But if they are broadly the same then maybe it's time to stop.
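For example, a rough sketch of that export-then-gather cycle using DBMS_STATS (the schema, table, partition and stat-table names below are placeholders, and the 5% sample is only illustrative):
BEGIN
  -- one-off: create a holding table for exported statistics
  DBMS_STATS.CREATE_STAT_TABLE(ownname => 'RPT', stattab => 'STATS_BACKUP');
END;
/
BEGIN
  -- keep a copy of the current stats before the nightly gather
  DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'RPT', tabname => 'BIG_FACT',
                                stattab => 'STATS_BACKUP', statid => 'PRE_LOAD');
  -- gather only the freshly loaded partition, sampled, at partition granularity
  DBMS_STATS.GATHER_TABLE_STATS(ownname          => 'RPT',
                                tabname          => 'BIG_FACT',
                                partname         => 'P_YESTERDAY',
                                estimate_percent => 5,
                                granularity      => 'PARTITION',
                                cascade          => TRUE);
END;
/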
Cheers, APC

Similar Messages

  • Pivot table with very large number of columns

    Hello,
    here is the situation:
    One table contains the raw data; from this table I feed another with extracted information (3 fields); I have to turn the content into a pivot table
    Ro --- Co --- Va
    A      A      1
    A      B      1
    A      C      2
    B      A      11
    Turned into:
           A     B     C ...
    A      1     1     2
    B      11    null  null
    To do this I do a query like:
    select r, sum(decode(c,'A',Va)) COLA, sum(decode(c,'B',Va)) COLB, sum(decode(c,'C',Va)) COLC, ...., sum(decode(c,'XYZ',Va)) COLXYZ from table group by r
    The statement is generated by a script (CFMX) and it works until I reach a query that tries to have 672 values for c, which means 672 columns...
    Oracle doesn't like that: ORA-01467: sort key too long
    I like this approach as it gets the result fast.
    I have tried a different solution at the CFMX level for that specific query, but I got a timeout (querying the table with a loop on Co within a loop on Ro).
    Is there any workaround?
    I am using Oracle 9i.
    Thank you!

    insert into extracted_data select c, r, v, p from full_data where <specific_clause>
    The values for C are from a query: select distinct c from extracted_data
    and it is the same for R
    R and C are varchar2(3999)
    I suppose that I can split on the first letter of the C column, giving each sub-select its own set of pivot columns and joining them on R:
    SELECT alpha_a.r, alpha_a.col_a1, . . ., alpha_a.col_an,
           alpha_b.col_b1, . . ., alpha_z.col_zn
    FROM (SELECT r, SUM(DECODE(c, 'A...', va)) col_a1, . . .,
                 SUM(DECODE(c, 'A...', va)) col_an
          FROM table
          WHERE c LIKE 'A%'
          GROUP BY r) alpha_a,
         (SELECT r, SUM(DECODE(c, 'B...', va)) col_b1, . . .,
                 SUM(DECODE(c, 'B...', va)) col_bn
          FROM table
          WHERE c LIKE 'B%'
          GROUP BY r) alpha_b,
         . . .
         (SELECT r, SUM(DECODE(c, 'Z...', va)) col_z1, . . .,
                 SUM(DECODE(c, 'Z...', va)) col_zn
          FROM table
          WHERE c LIKE 'Z%'
          GROUP BY r) alpha_z
    WHERE alpha_a.r = alpha_b.r AND alpha_a.r = alpha_c.r . . . AND alpha_a.r = alpha_z.r
    I will have 27 select statements joined... I have to check whether even like that I will not reach the limit within one of the select statements
    "in real life"
    select GRPW.r, GRPW.W0, GRPC.C0, GRPC.C1 from
    (select r, sum(decode(C, 'Wall, unspecified',cases)) W0 from tmp_maqueje where upper(C) like 'W%' group by r) GRPW,
    (select r,
    sum(decode(C, 'Ceramic tiles, indoors',cases)) C0,
    sum(decode(C, 'Cement surface, outdoors (Concrete/cement block, see Structural element, A11)',cases)) C1
    from tmp_maqueje where upper(C) like 'C%' group by r) GRPC
    where GRPW.r = GRPC.r
    order by GRPW.r, GRPW.W0, GRPC.C0, GRPC.C1

  • SQL 2012 Best Practice Analyzer issue with nothing available in pulldown on Microsoft Baseline Configuration Analyzer V2.0

    We have tried using both a Windows 7 and a Windows 8 machine and still cannot see any items available in the pulldown (i.e. no SQL 2012 or anything). Is this a known issue, and does BPA not work for SQL 2012? Any suggestions? I've seen several posts with the same issue, but no resolution.
    Laura

    Hi Laura,
    I installed Microsoft Baseline Configuration Analyzer 2.0 successfully. I can select a product: SQL Server 2012 BPA. Do you mean this?
    Thanks.
    Maggie Luo
    TechNet Community Support

  • Best practices for working with large placed bitmap images?

    Hey all,
    I need some advice on the best way to approach building these files. I've been working on some banners that are very large: 3 x 7 feet.
    Each banner has a simple vector graphic treatment at the top and bottom (rectangle with a different colored rule on top, and vector logo) and a small amount of text, just a URL and a headline. The headline is type (not converted to outlines) and usually has some other effect applied to it, say a drop shadow or outer glow. Under these graphics is a full-bleed image. The placed images need to be 150 ppi at actual size, so they're honking big, sometimes up to 2GB. Once the layouts are approved, they have to go to a vendor for output.
    The Illustrator docs are really large, and I've read in other threads how to combat that (PDF compatibility, raster settings). But even still, does anyone have any insight into the best way to deal with these things? The dimensions are large, and then the images are large, and it just makes for lots of looking at the spinning ball of death...
    If it were me, I'd build them in InDe, but the vector graphics need to be edited for each one, and I so don't like to do that in InDe unless forced. To me, it's still ultimately a page layout app, not a drawing app. (Old school here.)
    FYI, our machines are all MBPs with 8G ram and the latest Intel Core 2 Duo chips, 2.66 and 2.8GHz. If we keep the files local (as opposed to working on the server) it should be fairly zippy... No?
    Any advice is appreciated, thanks!

    You can get into memory trouble with very large placed pdf files. Tiffs too.
    This has to do with the preview, which contains much more information than you need for working with.
    On the other hand if you place EPSs and take care not to turn on overprint preview you can get away with huge files.
    If you do turn on overprint preview your machine will slow down a lot and the file may become totally unmanageable.
    Compare this to InDesign, where you can control the quality of the preview. A hi-res preview will slow you down and most often you don't need it anyway.
    I was working (in Illie) the other day on much larger files than you mention – displays for whole walls – and had some considerable trouble until I reverted to the old EPS format. They say it's dying but it ain't dead yet.

  • How to Create a Table with Merge and partitions in HANA

    Hi,
    What is the best way to create a table with MERGE, PARTITION and UNLOAD PRIORITY?
    Can anybody please give me some examples?
    Regards,
    Deva

    Ok,
    1) the UNLOAD PRIORITY has nothing to do with the order of data loads in your ETL process
    2) Unloading of columns will happen automatically. Don't specify anything specific for the tables, then SAP HANA will take care about it
    3) Not sure where you get your ideas from, but there is no need to manually "flush" tables or anything like that. SAP HANA will take care of memory housekeeping.
    4) Partitioning and how to specify it for tables has been largely documented. Just read up on it.
    5) Delta Merge will happen automatically, as long as you don't prevent it (e.g. by trying to outsmart the mergedog rules)
    Seriously, I get the impression that this list of requirements is based on some hearsay and a lack of actual information and experience with SAP HANA. There are a couple of extensive discussions on data loading optimization available here in SCN and on SAPHANA.COM. Please read those first.
    All this had been discussed broadly a couple of times.
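    That said, a bare-bones sketch of the CREATE statement looks roughly like this (schema, table, columns and partition values are made up, and the exact clause order should be checked against the CREATE TABLE reference for your revision):
    -- delta merge is automatic by default; AUTO MERGE just makes that explicit
    CREATE COLUMN TABLE "MYSCHEMA"."SALES_FACT" (
        "SALES_DATE"  DATE,
        "CUSTOMER_ID" INTEGER,
        "AMOUNT"      DECIMAL(15,2)
    )
    PARTITION BY RANGE ("SALES_DATE")
        (PARTITION '2014-01-01' <= VALUES < '2015-01-01', PARTITION OTHERS)
    UNLOAD PRIORITY 5 AUTO MERGE;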
    - Lars

  • Best Practice Analyzer database mismatch error

    Hi all,
    I am getting the following critical error when I run the BPA on a couple of our BizTalk servers and wondered if anyone had seen the same?
    "The version of BizTalk Server does not Match the Version of BizTalk Management Database Schemas"
    I am using v1.2 of BPA against a BizTalk 2010 install.
    This has only surfaced since we upgraded from BizTalk 2009 R2 BUT not on all of our environments.
    It does not seem to be causing any runtime issues however as all applications seem to be running fine!!
    Looking at the BizTalkDBVersion tables in SQL, everything looks the same on the servers which present this error and those that do not, i.e. there is an entry for version 3.9.469.0 ... which matches the BizTalk Server version reported in the registry
    at "\HKLM\Software\Microsoft\BizTalk Server\3.0\Product Version\"
    The only thing I can see is that, as this was an upgrade, there is also an entry in the BizTalkDBVersion tables for the 2009R2 version (3.8.368.0), so maybe the BPA is selecting this value and comparing it against the registry version?
    However, this doesn't explain why I see this issue on 2 upgraded servers but not the 3rd.
    Any ideas?
    Regards,
    Dave

    Hi Dave,
    There is no such version as BizTalk 2009 R2. v3.8.368.0 refers to BizTalk 2009 (not R2).
    The above error occurred because BizTalk Server Best Practices Analyzer has detected that the version of BizTalk Server does not match the version of the BizTalk Database Schemas. This can happen if the BizTalk database was deleted and then restored
    with an incorrect database.
    Check the version of SQL Server upgraded against the version of BizTalk server.
    Reference BPA Help file:

  • Error while Connecting report Best Practices v1.31 with SAP

    Hello experts,
    I'm facing an issue while trying to connect some of my reports from Best Practices for BI with SAP.
    It only happens with InfoSets; the other reports, which use SAP tables, go smoothly without a problem.
    The most interesting thing is that I already have one of the reports connected to SAP InfoSets.
    I have already verified the document describing the steps for creating the additional database that comes with the BP pack. They seem OK.
    Here goes what Crystal Reports throws to me after changing the data source to SAP:
    For the report "GL Statement", one of the Financial Analysis reports, which uses InfoSet /KYK/IS_FIGL_I3:
    - Failed to retrieve data from the database; - click OK, then...
    - Database connector error: no variant was indicated for the exercise (something like this after translating) - click OK, then
    - Database connector error: RFC_INVALID_HANDLE
    For the report "Cost Analysis: Planned vs. Actual Order Costs", one of the Financial Analysis reports, which uses InfoSets ZBPBI131_INFO_ODVR and ZBPBI131_INFO_COAS, and also the query CO_OM_OP_20_Q1:
    - Failed to retrieve data from the database; - click ok then...
    - Database connector error: check class for selections raised errors - click ok then
    - Database connector error: RFC_INVALID_HANDLE
    Note: those "Z" InfoSets are already created in the SAP environment.
    The one that works fine is one of the Purchasing Analysis reports:
    - Purchasing Group Analysis -> InfoSet: /KYK/IS_MCE1
    I'm kind of lost on how to solve this, because I'm not sure whether the problem is in the SAP JCo or in some parameter that was set up wrongly in SAP, and I have already checked possible solutions for both.
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Brazil.

    I re-checked step 3.2.3 - Uploading Crystal User Roles (transaction PFCG) - of the manual, where it talks about the CRYSTAL_ENTITLEMENT and CRYSTAL_DESIGNER roles. I noticed in the Authorizations tab that the status said it hadn't been generated and showed a yellow sign, so I generated it as the manual says.
    Both statuses are now saying "Authorization profile is generated" and the sign is now green on the tab.
    I had another issue in the User tab (it was yellow, like the Authorizations one before generating)... all I needed to do to change it to green was to compare users (User Comparison button).
    After all that, I tried once more to refresh the Crystal report and I still have the error messages being thrown.
    There's one more issue in one of the tabs of the PFCG transaction: the Menu tab has a red sign, but there's nothing about it in the manual. I just have a folder called "Role menu" without anything in it.
    Could that be the reason why I'm facing errors when connecting the report to SAP InfoSets? (Remember, one of my reports which is connected to an InfoSet works fine.)
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Brazil.

  • Best practice for dealing with Recordsets

    Hi all,
    I'm wondering what is best practice for dealing with data retrieved via JDBC as ResultSets, without involving third-party products such as Hibernate etc. I've been told NOT to use ResultSets throughout my applications since they take up resources and are expensive. I'm wondering which collection type is best to convert ResultSets into. The apps I'm building are web-based using JSPs as the presentation layer, beans and servlets.
    Many thanks
    Erik

    There is no requirement that DAOs have a direct mapping to database tables. One of the advantages of the DAO pattern is that the business layer isn't directly aware of the persistence layer. If the joined data is used in the business code as if it were an unnormalized table, then you might want to provide a DAO for the joined data. If the joined data provides a subsidiary object within some particular object, you might add the access method to the DAO for the outer object.
    eg:
    In a user permissioning system where:
    1 user has many userRoles
    1 role has many userRoles
    1 role has many rolePermissions
    1 permission has many rolePermissions
    ie. there is a many to many relationship between users and roles, and between roles and permissions.
    The administrator needs to be able to add and delete permissions for roles and roles for users, so the CRUD for the rolePermissions table is probably most useful in the RoleDAO, and the CRUD for the userRoles table in the UserDAO. DAOs can also call each other.
    During operation the system needs to be able to get all permissions for a user at login, so the UserDAO should provide a readPermissions method that does a rather complex join across the user, userRole, rolePermission and permission tables.
    Note that if the system I just described were done with LDAP, a hierarchical database or an object database, the userRoles and rolePermissions tables wouldn't even exist; these are RDBMS artifacts, since relational databases don't understand many-to-many relationships. This is a good reason to avoid providing DAOs that give access to those tables.
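    To tie this back to the original question about what to hand the web tier instead of a raw ResultSet, here is a rough sketch (the table and column names are invented) of such a readPermissions method: the DAO copies the joined rows into a plain Set and closes the JDBC resources before returning, so nothing database-specific leaks into the JSPs.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.HashSet;
    import java.util.Set;
    import javax.sql.DataSource;
    public class UserDAO {
        private final DataSource dataSource;
        public UserDAO(DataSource dataSource) {
            this.dataSource = dataSource;
        }
        /** Returns every permission name granted to the user via any of its roles. */
        public Set<String> readPermissions(long userId) throws SQLException {
            String sql =
                "SELECT p.name FROM users u " +
                "JOIN user_roles ur ON ur.user_id = u.id " +
                "JOIN role_permissions rp ON rp.role_id = ur.role_id " +
                "JOIN permissions p ON p.id = rp.permission_id " +
                "WHERE u.id = ?";
            Set<String> permissions = new HashSet<String>();
            try (Connection con = dataSource.getConnection();
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setLong(1, userId);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // copy each joined row into the collection before the ResultSet is closed
                        permissions.add(rs.getString("name"));
                    }
                }
            }
            return permissions;
        }
    }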

  • Exchange Best Practices Analyzer and Event 10009 - DCOM

    We have two Exchange 2010 SP3 RU7 servers on Windows 2008 R2
    In general, they seem to function correctly.
    ExBPA (Best Practices Analyzer) results are fine. Just some entries about drivers being more than two years old (vendor has not supplied newer drivers so we use what we have). Anything else has been verified to be something that can "safely be ignored".
    Test-ServiceHealth, Test-ReplicationHealth and other tests indicate no problems.
    However, when I run the ExBPA, it seems like the server on which I run ExBPA attempts to contact the other using DCOM and this fails.
    Some notes:
    1. Windows Firewall is disabled on both.
    2. Pings in both directions are successful.
    3. DTCPing would not even run so I was not able to test with this.
    4. Connectivity works perfectly otherwise. I can see/manage either server from the other using the EMC or EMS. DAG works fine as far as I can see.
    What's the error message?
    Event 10009, DistributedCOM
    "DCOM was unable to communicate with the computer --- opposite Exchange server of the pair of Exchange servers ---  using any of the configured protocols."
    This is in the System Log.
    This happens on both servers and only when I run the ExBPA.
    I understand that ExBPA uses DCOM but cannot see what would be blocking communications.
    I can access the opposite server in MS Management Consoles (MMC).
    Note: the error is NOT in the ExBPA results - but rather in the Event Viewer System Log.
    Yes, it is consistent. Have noticed it for some time now.
    Does anyone have any idea what could be causing this? Since normal Exchange operations are not affected, I'm tempted to ignore it, but I have to do my "due diligence" and inquire. 

    Hi David,
    I recommend you refer the following article to troubleshoot this event:
    How to troubleshoot DCOM 10009 error logged in system event
    Why this happens:
    Generally speaking, the reason DCOM 10009 is logged is that the local RPCSS service can't reach the remote RPCSS service of the remote target server. There are many possibilities which can cause this issue.
    Scenario 1:
     The remote target server happens to be offline for a short time, for example, just for maintenance.
    Scenario 2:
    Both servers are online. However, an RPC communication issue exists between these two servers, for example: server name resolution failure, exhaustion of port resources for RPC communication, or firewall configuration.
    Scenario 3:
    Even though the TCP connection to the remote server has no problem, if the RPC authentication runs into a problem, we may get an error status code like 0x80070721, which means "A security package specific error occurred" during RPC authentication; DCOM 10009 will also be logged on the client side.
    Scenario 4:
    The target DCOM/COM+ service failed to be activated due to a permission issue. In this kind of situation, DCOM 10027 will be logged on the server side at the same time.
    Event ID 10009 — COM Remote Service Availability
    Resolve
    Ensure that the remote computer is available
    There is a problem accessing the COM Service on a remote computer. To resolve this problem:
    Ensure that the remote computer is online.
    This problem may be the result of a firewall blocking the connection. For security, COM+ network access is not enabled by default. Check the system to determine whether the firewall is blocking the remote connection.
    Other reasons for the problem might be found in the Extended Remote Procedure Call (RPC) Error information that is available in Event Viewer.
    To perform these procedures, you must have membership in Administrators, or you must have been delegated the appropriate authority.
    Ensure that the remote computer is online
    To verify that the remote computer is online and the computers are communicating over the network:
    Open an elevated Command Prompt window. Click Start, point to All Programs, click Accessories, right-click Command Prompt, and then click Run as administrator. If the User Account Control dialog box appears, confirm that the action it displays is what you want, and then click Continue.
    At the command prompt, type ping, followed by a space and the remote computer name, and then press ENTER. For example, to check that your server can communicate over the network with a computer named ContosoWS2008, type ping ContosoWS2008, and then press ENTER.
    A successful connection results in a set of replies from the other computer and a set of ping statistics.
    Check the firewall settings and enable the firewall exception rule
    To check the firewall settings and enable the firewall exception rule:
    Click Start, and then click Run.
    Type wf.msc, and then click OK. If the User Account Control dialog box appears, confirm that the action it displays is what you want, and then click Continue.
    In the console tree, click Inbound rules.
    In the list of firewall exception rules, look for COM+ Network Access (DCOM In).
    If the firewall exception rule is not enabled, in the details pane click Enable rule, and then scroll horizontally to confirm that the protocol is TCP and the LocalPort is 135. Close Windows Firewall with Advanced Security.
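    (If you prefer the command line, the same rule group can usually be enabled with netsh; the group name below is the English one and may differ on localized systems.)
    netsh advfirewall firewall set rule group="COM+ Network Access" new enable=yes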
    Review available Extended RPC Error information for this event in Event Viewer
    To review available Extended RPC Error information for this event in Event Viewer:
    Click Start, and then click Run.
    Type comexp.msc, and then click OK. If the User Account Control dialog box appears, confirm that the action it displays is what you want, and then click Continue.
    Under Console Root, expand Event Viewer (Local).
    In the details pane, look for your event in the Summary of Administrative Events, and then double-click the event to open it.
    The Extended RPC Error information that is available for this event is located on the Details tab. Expand the available items on the Details tab to review all available information.
    For more information about Extended RPC Error information and how to interpret it, see Obtaining Extended RPC Error Information (http://go.microsoft.com/fwlink/?LinkId=105593).
    Best regards,
    Niko Cheng
    TechNet Community Support

  • Server Core 2008 R2 SP1 - AD DS Best Practice Analyzer Scans Don't Produce Any Output

    Hi,
    This is a re-post moving this discussion to the recommended forum "Server Core" from here:
    http://social.technet.microsoft.com/Forums/en-US/winservergen/thread/cc33d429-88e0-4450-a73c-361e395fd217.
    I am having problems producing any output for any AD DS Best Practice Analyzer Scans on a Windows Server Core 2008 R2 SP1 Domain Controller.
    I have imported the "ServerManager" and "BestPractices" PS modules on that Server by running the following commands:
    Import-Module ServerManager
    Import-Module BestPractices
    I've then run
    get-BPAModel, to find out what best practice scan models are available; this returns the following output:
    Id                                      LastScanTime
    Microsoft/Windows/DirectoryServices     Never
    Microsoft/Windows/DNSServer             Never
    I then run all the BPA scans on that box:
    Get-BPAModel | Invoke-BPAModel
    This returns the following output:
    ModelId                                 Success  Detail
    Microsoft/Windows/DirectoryServices     True     (InvokeBpaModelOutputDetail)
    Microsoft/Windows/DNSServer             True     (InvokeBpaModelOutputDetail)
    Since the BPA invocation results weren’t displayed automatically, I entered the following command to see them:
    Get-BPAModel | Get-BPAResult | Out-File "D:\Source\BPA.txt"
    This command will create a text file with the scan results but I only see the results of the DNSServer scan, not the DirectoryServices scan.
    I have also tried to view the results in a HTML format by running the following command but still only see the DNSServer scan results:
    Get-BPAModel | Get-BPAResult | ConvertTo-Html | Set-Content d:\Source\BPA.htm
    I have also tried executing the scan ONLY for the "Microsoft/Windows/DirectoryServices" model but can't get any results to be returned. I have also connected using Server Manager from a full install of Server 2008 R2 SP1, but that doesn't seem to show any results under the "Best Practices Analyzer" section when the "Active Directory Domain Services" node is selected; all 4 tabs ("Noncompliant", "Excluded", "Compliant" and "All") show zero (0). However, the summary text above the tabs does show when the last scan was performed, which seems to be correct.
    Is there something special that needs to be done to produce the BPA results for the "Microsoft/Windows/DirectoryServices" BPA model on Server Core 2008 R2 SP1?
    BTW: The Forest/Domain is W2K3R2 Native, this is the first W2K8R2 DC in the environment and I have installed .NET 4 framework (Server Core) to support Powershell 3, also installed.
    Thanks, Paul.
    belpad

    Hi Diana,
    OK, pretty sure I've now found the root cause of the issue I've described above.
    I was also looking into Windows Update Agent issues for these W2K8R2 Server Core DC's, where no updates would be applied via WSUS (configured via GPO) and would fail with "FATAL: CBS called Error with 0x8000ffff windows update agent server
    core". 
    Yesterday, I managed to get one of the W2K8R2 Server Core DC's (WSUS updates) working again by removing one of the .NET 4 Framework security updates (KB2600211) which was manually applied when the server was initially set up. .NET 4 (Server Core Edition,
    http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=22833) was installed as a prerequisite for PowerShell 3. Once this update was removed, the affected Server Core DC was restarted and WSUS updates started to get applied.
    So I followed the same procedure on the other server core DC but this did not resolve the WSUS issue this time.  Next, I did further investigation into the Windows Update Agent problem.  This led me to the following article:
    http://blogs.technet.com/b/brad_rutkowski/archive/2008/07/03/windows-update-fails-with-8000ffff-e-unexpected.aspx which described an issue with NTFS permissions being set incorrectly on C: drive, with the "BUILTIN\Users" group completely
    missing on the C: drive.
    I found the affected Server Core DC also had this issue; once "BUILTIN\Users" was assigned permissions on the C: drive as described above and the Windows Update Agent was restarted, the Server Core DC started to install all required updates configured via WSUS.
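    (For anyone hitting the same symptom, the repair amounts to icacls grants along these lines; the exact ACE flags are best copied from a known-good machine rather than taken verbatim from here.)
    C:\>icacls C:\ /grant "BUILTIN\Users":(OI)(CI)RX
    C:\>icacls C:\ /grant "BUILTIN\Users":(CI)(AD)
    C:\>icacls C:\ /grant "BUILTIN\Users":(CI)(IO)(WD)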
    Next, I ran the Directory Service BPA, which now produces the desired output either locally or remotely via Server Manager.
    Therefore, I can only assume that the Directory Service BPA also uses "Network Service" much like WUAUSERV (Windows Update Agent), which requires access to the C: drive via the "BUILTIN\Users" assignment.
    So this has subsequently led me to check the C: drive (%systemdrive%) permissions across multiple W2K8R2 machines, all of which showed differing assigned permissions, as follows:
    1. W2K8R2 Server Core DC - With Directory Services BPA and Windows Update Agent Not Working
    C:\>icacls c:\
    c:\ BUILTIN\Administrators:(OI)(CI)(F)
        CREATOR OWNER:(OI)(CI)(IO)(F)
        NT AUTHORITY\INTERACTIVE:(OI)(CI)(RX)
        NT AUTHORITY\SYSTEM:(OI)(CI)(F)
    2. W2K8R2 Server Core DC - With Directory Services BPA and Windows Update Agent Working OK
    C:\>icacls c:\
    c:\ NT AUTHORITY\SYSTEM:(OI)(CI)(F)
        BUILTIN\Administrators:(OI)(CI)(F)
        BUILTIN\Users:(OI)(CI)(RX)
        BUILTIN\Users:(CI)(AD)
        BUILTIN\Users:(CI)(IO)(WD)
        CREATOR OWNER:(OI)(CI)(IO)(F)
    3. W2K8R2 Full DC - With Directory Services BPA and Windows Update Agent Working OK
    C:\>icacls c:
    c: NT SERVICE\TrustedInstaller:(F)
       NT SERVICE\TrustedInstaller:(CI)(IO)(F)
       NT AUTHORITY\SYSTEM:(M)
       NT AUTHORITY\SYSTEM:(OI)(CI)(IO)(F)
       BUILTIN\Administrators:(M)
       BUILTIN\Administrators:(OI)(CI)(IO)(F)
       BUILTIN\Users:(RX)
       BUILTIN\Users:(OI)(CI)(IO)(GR,GE)
       CREATOR OWNER:(OI)(CI)(IO)(F)
    4. W2K8R2 Server Core DHCP Server (Migrated from W2K3 with Server Migration Tools) - With DHCP BPA and Windows Update Agent Working OK
    C:\>icacls c:
    c: NT AUTHORITY\SYSTEM:(OI)(CI)(F)
       BUILTIN\Administrators:(OI)(CI)(F)
    5. W2K8R2 Server Core DHCP Server (Migrated from W2K3 with netsh) - With DHCP BPA and Windows Update Agent Working OK
    C:\>icacls c:
    c: NT AUTHORITY\SYSTEM:(OI)(CI)(F)
       BUILTIN\Administrators:(OI)(CI)(F)
       BUILTIN\Users:(OI)(CI)(RX)
       BUILTIN\Users:(CI)(AD)
       BUILTIN\Users:(CI)(IO)(WD)
       CREATOR OWNER:(OI)(CI)(IO)(F)
    None of the above servers have a Group Policy or any in-house scripts defined that configure C: drive permissions. It seems odd that there should be such a variance in the C: (%systemdrive%) drive permissions across the above servers, with only scenarios 2 and 5 above having matching permissions. I can only imagine that maybe some software or software update might be causing this.
    By reviewing the above output, it seems there is also a difference between the C: drive permissions of W2K8R2 Server Core and W2K8R2 Full. Not sure if this is by design?
    Is there any Microsoft documentation describing what the default %systemdrive% NTFS permissions should be for W2K8R2 Server Core and Full? Furthermore, do these permissions change when the various infrastructure roles are installed and enabled, i.e. Domain Controller, DHCP etc.? I ask since I would like to use the correct set of permissions in each scenario. Please advise if I should be asking this question in a different forum.
    belpad

  • License type of SQL Server 2005 Best Practices Analyzer

    Hi everybody.
    I need to install the "SQL Server 2005 Best Practices Analyzer" software in my organization, but I need to know whether this application has free licensing. I have seen on several web sites that this tool is free, but not on an official Microsoft web page. So, where can I find the official Microsoft information about the type of licensing of "SQL Server 2005 Best Practices Analyzer"?
    Thanks for your support

    Hello Erland.
    I followed your advice and I have read the terms of use of this software. I stopped at point 3 (which I highlighted). Based on this point, I have doubts about using this application. Furthermore, nowhere does it say that this software is free to use.
    I would appreciate it if someone could clarify this for me.
     =============================================================
    MICROSOFT SOFTWARE LICENSE TERMS
    MICROSOFT SQL SERVER 2005 BEST PRACTICES ANALYZER:
    These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its affiliates) and you. 
    Please read them.  They apply to the software named above, which includes the media on which you received it, if any. 
    The terms also apply to any Microsoft
    *  updates,
    *  supplements,
    *  Internet-based services, and
    *  support services
    for this software, unless other terms accompany those items. 
    If so, those terms apply.
    BY USING THE SOFTWARE, YOU ACCEPT THESE TERMS. 
    IF YOU DO NOT ACCEPT THEM, DO NOT USE THE SOFTWARE.
    If you comply with these license terms, you have the rights below.
    1. INSTALLATION AND USE RIGHTS. You may install and use any number of copies of the software on your devices.
    2. INTERNET-BASED SERVICES. Microsoft provides Internet-based services with the software. It may change or cancel them at any time.
    3. SCOPE OF LICENSE. The software is licensed, not sold. This agreement only gives you some rights to use the software. Microsoft reserves all other rights. Unless applicable law gives you more rights despite this limitation, you may use the software only as expressly permitted in this agreement. In doing so, you must comply with any technical limitations in the software that only allow you to use it in certain ways. You may not
    *  work around any technical limitations in the software;
    *  reverse engineer, decompile or disassemble the software, except and only to the extent that applicable law expressly permits, despite this limitation;
    *  make more copies of the software than specified in this agreement or allowed by applicable law, despite this limitation;
    *  publish the software for others to copy;
    *  rent, lease or lend the software;
    *  transfer the software or this agreement to any third party; or
    *  use the software for commercial software hosting services.
    4. BACKUP COPY. You may make one backup copy of the software. You may use it only to reinstall the software.
    5. DOCUMENTATION. Any person that has valid access to your computer or internal network may copy and use the documentation for your internal, reference purposes.
    6. EXPORT RESTRICTIONS. The software is subject to United States export laws and regulations. You must comply with all domestic and international export laws and regulations that apply to the software. These laws include restrictions on destinations, end users and end use. For additional information, see www.microsoft.com/exporting.
    7. SUPPORT SERVICES. Because this software is "as is," we may not provide support services for it.
    8. ENTIRE AGREEMENT. This agreement, and the terms for supplements, updates, Internet-based services and support services that you use, are the entire agreement for the software and support services.
    9. APPLICABLE LAW.
    a. United States. If you acquired the software in the United States, Washington state law governs the interpretation of this agreement and applies to claims for breach of it, regardless of conflict of laws principles. The laws of the state where you live govern all other claims, including claims under state consumer protection laws, unfair competition laws, and in tort.
    b. Outside the United States. If you acquired the software in any other country, the laws of that country apply.
    10. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws of your country. You may also have rights with respect to the party from whom you acquired the software. This agreement does not change your rights under the laws of your country if the laws of your country do not permit it to do so.
    11. DISCLAIMER OF WARRANTY. THE SOFTWARE IS LICENSED "AS-IS." YOU BEAR THE RISK OF USING IT. MICROSOFT GIVES NO EXPRESS WARRANTIES, GUARANTEES OR CONDITIONS. YOU MAY HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT EXCLUDES THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
    12. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM MICROSOFT AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP TO U.S. $5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL, LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.
    This limitation applies to
    *  anything related to the software, services, content (including code) on third party Internet sites, or third party programs; and
    *  claims for breach of contract, breach of warranty, guarantee or condition, strict liability, negligence, or other tort to the extent permitted by applicable law.
    It also applies even if Microsoft knew or should have known about the possibility of the damages. The above limitation or exclusion may not apply to you because your country may not allow the exclusion or limitation of incidental, consequential or other damages.
    Please note: As this software is distributed in Quebec, Canada, some of the clauses in this agreement are provided below in French.

  • Best Practice Analyzer for Exchange 2013

    Greetings,
    I have upgraded the messaging infrastructure from Exchange 2007 to Exchange 2013.
    I want to test the Health of the system through ExBPA for Exchange 2013.
    But I don't find any setup for Exchange 2013 like there was for 2010.
    I went through an article from the Office 365 community, according to which, for on-premises Exchange as well, we need to have an Office 365 account (a trial account can also be used) to get the downloader file for ExBPA 2013.
    http://community.office365.com/en-us/w/deploy/office-365-best-practices-analyzer-for-exchange-server-2013.aspx
    But to run the setup the servers need to be connected to the internet.
    And I don't want to expose my environment to the internet under any condition.
    Could somebody please suggest whether there is any setup available that I can install directly, without exposing the environment to the internet?
    Thanks in advance.
    Best Regards,
    K2

    Welcome to Exchange 2013.
    Exchange Server 2013 doesn't come with ExBPA for health check. This might help
    http://exchangeserverpro.com/powershell-script-health-check-report-exchange-2010/
    Apart from that you can run these commands too
    Get-ServerHealth -Identity Exchange2013ServerName
    Test-ServiceHealth
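    If you want the same checks across every Exchange 2013 box in one pass, something along these lines should do (a rough sketch; adjust the version filter to your environment):
    Get-ExchangeServer | Where-Object { $_.AdminDisplayVersion -like "Version 15*" } |
        ForEach-Object { Get-ServerHealth -Identity $_.Name }
    Test-ServiceHealth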
    Cheers,
    Gulab Prasad
    Technology Consultant
    Blog: http://www.exchangeranger.com

  • Running Best Practice Analyzer on remote 2008 R2 domain controllers

    Hello Powershell World,
    I'll start out by mentioning that I am a PowerShell rookie, so I gladly welcome any input to help me improve or work more efficiently. Anyway, I recently used PowerShell to run the Best Practices Analyzer for DNS on all of our domain controllers.
    The way I went about it was pretty tedious and inefficient, but it still got the job done through a series of one-liners and exported the report to a UNC path as follows:
    Enable-PSRemoting -Force (I logged into all of the domain controllers individually and ran this before running the one-liners below from my workstation)
    New-PSSession -Name <Session Name> -ComputerName <Hostname>
    Enter-PSSession -Name <Session Name>
    Import-Module bestpractices
    Invoke-BPAModel Microsoft/Windows/DNSServer
    Get-BPAResult Microsoft/Windows/DNSServer | Select ModelId,Severity,Category,Title,Problem,Impact,Resolution,Compliance,Help | Sort Category | Export-CSV \\server\share\BPA_DNS_SERVERNAME.csv
    I'm looking to do this again but for the Directory Services Best Practices Analyzer, without having to individually enable remoting on the domain controllers, and also to provide a list of servers for the script to run against.
    Thanks in advance for all your help!

    What do you mean by "without having to individually enable remoting"?
    You cannot remote without enabling remoting. You only need to enable remoting once. It is a configuration change. If you have done it once you do not need to do it again.
    Here is how to run it from a list of DCs.
    $sb={
        Import-Module BestPractices
        Invoke-BPAModel Microsoft/Windows/DNSServer
        Get-BPAResult Microsoft/Windows/DNSServer |
            Select ModelId,Severity,Category,Title,Problem,Impact,Resolution,Compliance,Help |
            Sort Category |
            Export-CSV "\\server\share\BPA_DNS_$env:COMPUTERNAME.csv"
        Invoke-BPAModel Microsoft/Windows/DirectoryServices
        # etc...
    }
    # $listofDCs holds your DC names, e.g. from a text file or Get-ADDomainController
    ForEach($dc in $listofDCs){
        Invoke-Command -ScriptBlock $sb -ComputerName $dc
    }
    ¯\_(ツ)_/¯

  • SQL server Best Practice Analyzer output in .CSV

    Hi Team, I ran SQL Server Best Practices Analyzer on our SQL 2008 R2 server. I was trying to export the scan result in .csv format, but it only gives me the option to save it in .xml format. I have been looking for a way to export the output in a readable format that I can send to our clients, but no luck.
    How can I export SQL BPA output in .csv or any other user friendly format?
    Thanks in Advance.

    Hi MSRS27,
    You can run Best Practices Analyzer (BPA) scans either from Server Manager, by using the BPA GUI, or by using cmdlets in Windows PowerShell. We can view or save BPA results from a Windows PowerShell session in different formats.
    If you want to export BPA results to a comma-separated values (CSV) text file, run the following cmdlet, where <Path> represents the path and text file name to which you want to save the CSV results.
    CSV results can be imported into Microsoft® Excel, or other programs that display data in spreadsheets or grids.
    Get-BPAResult <Model ID> | Export-CSV <Path>
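    For instance (the model ID below is only a placeholder; Get-BPAModel lists the IDs actually registered on your server):
    Import-Module BestPractices
    Get-BPAResult Microsoft/Windows/DNSServer |
        Select-Object Severity, Category, Title, Problem, Resolution |
        Export-Csv -Path C:\Temp\BPA_Results.csv -NoTypeInformation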
    For more information, see: Run Best Practices Analyzer Scans and Manage Scan Results
    http://technet.microsoft.com/en-us/library/hh831400.aspx
    Regards,
    Sofiya Li
    TechNet Community Support

  • Best practice to deal with computer or departed employee

    Dear All,
    I would like to inquire about the best practice for dealing with the computer and the computer account of a departed employee. Should they be disabled, reset, deleted, or just kept as they are until needed by another user?
    Regards
    Hiam

    Ultimately your needs for their identities and equipment after they leave are what dictate how you should design this policy.
    First off, I recommend disabling the account immediately following the employee's departure. This prevents the user from using their credentials to log on again. Personally I have a "Disabled Users" OU in Active Directory. When I disable accounts
    I move them here for easy future retrieval.
    It is possible the user may return, or if they have access to certain systems you may need the account again. I would keep the accounts for a specific amount of time (e.g. 6 months, but this depends on your needs) and then delete them after this period of
    time.
    If the employee knows the passwords to any shared accounts (not a good idea though many organizations have these) or has accounts in other systems that do not use Active Directory authentication, immediately change the passwords to these accounts again following
    the employee's departure.
    If the employee had administrative access to their computer (not a good idea, though is the reality in most cases) you should disable the computer account and remove it from the network. This will prevent the employee from remotely accessing the machine
    until you are able to rebuild or inspect it for unapproved changes.
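    As a rough PowerShell sketch of those first steps (the user, computer and OU names are placeholders):
    Import-Module ActiveDirectory
    # disable the departed user's account and park it in a holding OU
    Disable-ADAccount -Identity jdoe
    Get-ADUser jdoe | Move-ADObject -TargetPath "OU=Disabled Users,DC=contoso,DC=com"
    # disable the computer account as well, so the machine can't be reached remotely
    Get-ADComputer JDOE-PC | Disable-ADAccount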
    Ask the user's manager, team members, and subordinates if there are any files that the employee would have stored on their computer. Back these up as necessary.
    Most likely you will reuse the computer for another employee. For best results you should use an image so you can re-image their machine and not have to worry whether they had installed any unwanted software (backdoors, viruses, illegal software, etc).
    Hope this helps.
    Jason Warren
    @jaspnwarren
    jasonwarren.ca
    habaneroconsulting.com/Insights

Maybe you are looking for

  • ITunes keeps analysing my entire library each time I open it.

    Heading pretty much explains the problem. It has only recently started doing this. I've tried checking and unchecking the various crossfade playback buttons in prefs but that doesn't make any difference. How to make it stop.

  • Will LENOVO Tilapia CRB support AM3+ ?

    Hello, I was looking to upgrade my desktop with a AMD FX-6300 Six-Core processor(AM3+) . Will this work on my pc?: Current Processor: 3.10 gigahertz AMD Athlon II X4 645 64 kilobyte primary memory cache 512 kilobyte secondary memory cache 64-bit read

  • Oracle 9.2.0.1.0 link to sqlserver 2000 tables

    Hi! I have a sqlserver 2000 running. I also have a oracle 9.2 server with windows 2003 running. I installed after oracle enterprise edition setup the oracle gateway services for sqlserver on my oracle server. Then I made some changes in the different

  • Superdrive MATSHITADVD-R UJ-815 Media

    I have a Powerbook G4 with Superdrive UJ-815. I am having trouble finding DVD-R discs that work in this drive. Sometimes I get hardware error, and sometimes invalid media. Is there a recommended DVD-R disc that proves reliable? Also, I found a URL tha

  • Application starting error -- item has already been added. key in dictionary 'regex' key being added 'regex'

    Hi All I have strucked with an issue with urlrewritingnet. I have developed a website and used urlrewritingnet for url redirecting. Application is working fine in DEV, Stage and Prod. but recently I am getting an issue in Stage. When I kept the brows