Cannot start a default data collector set in Server 2008 R2

Trying to start a default data collector set "System Performance" in Windows Server 2008 R2, but it won't start.
I am a domain and enterprise admin, and have also added my domain account to the two Perfmon groups ("Performance Monitor Users" and "Performance Log Users"). Every time I try to start the "System Performance" Data Collector
Set, it fails to start - both the start and stop options are dimmed.
I checked the Task Scheduler, and it seems that every time I have tried to run the data collector set, the following error is logged:
"Logon Failure: the user has not been granted the requested logon type at this computer. (0x80070569)"
I can see that the error number means I do not have access to the machine, but I assure you that I'm logged in as an enterprise admin, the machine is the primary domain controller, and I have permission to do anything else on this machine and in the domain.
I am not experiencing this problem on two Server 2012 machines or on either of the two VMs running on this 2008 R2 server.
Thanks in advance.
FMilani

Hi,
The 0x80070569 error is typically caused by a permissions issue. Since you are an enterprise admin, please try the following steps to rule out a conflicting user-rights setting coming from another GPO:
1. Create a new OU and put this server in it.
2. Create a Group Policy Object for this new OU.
3. Block inheritance of other policies (if applicable).
4. Make sure you do not set ANY user rights in the new policy.
5. Edit the local user rights to add your account as a local admin.
6. Run gpupdate /force from an elevated command prompt.
More information:
Starting or Live Migrating Hyper-V virtual machines may fail with error 0x80070569 on Windows Server 2012-based computers
http://support.microsoft.com/kb/2779204/en-us
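In addition, a quick, hedged way to check the likely cause from an elevated command prompt: the collector set is launched through a scheduled task, so the account that runs it needs the "Log on as a batch job" right. The file paths below are just examples, and the task path may differ on your system.
rem show the groups the current account actually holds on this machine
whoami /groups
rem export the effective user-rights assignments and look for the batch logon rights
secedit /export /cfg C:\temp\secpol.inf /areas USER_RIGHTS
findstr /i "SeBatchLogonRight SeDenyBatchLogonRight" C:\temp\secpol.inf
rem see which GPOs are actually applied to this server
gpresult /h C:\temp\gpresult.html
rem inspect the scheduled task that Performance Monitor created for the collector set
schtasks /query /v /tn "\Microsoft\Windows\PLA\System Performance"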
Hope this helps.

Similar Messages

  • Exchange 2013 Performance Monitor Data Collector Sets

    I've noticed that the Exchange 2013 install created two data collector sets within Performance Monitor:
    ExchangeDiagnosticsDailyPerformanceLog
    ExchangeDiagnosticsPerformanceLog
    These sets are creating daily log files saved in C:\Program Files\Microsoft\Exchange Server\V15\Logging\Diagnostics\DailyPerformanceLogs that are about 500MB in size, which are filling up our system volume.
    I've tried stopping them, but they get restarted automatically. I also tried deleting them, but they get recreated automatically. Is there a way to disable these? We have a 3rd party tool that we use for performance monitoring and would like to avoid having
    to delete these log files on a regular basis to keep our system volume from filling up.
    As a workaround, I've changed the path where the log files get saved to a non-existent drive letter, but I would still like to disable the Data Collector Sets completely.
    Thanks,
    -Cory

    Do note,
    these performance logs are only kept for one week, so they will amount to roughly 3-5 GB. If you can't spare that on your C: drive, you have not implemented a best-practice Exchange design;
    especially with today's JBOD SATA designs your C: drive will be 1 TB+. Review your design against Microsoft's advice.
    MCTS-MCITP exchange 2010 | MCTS-MCITP Exchange: 2007 | MCSA Messaging: 2003 | MCP windows 2000
    Hello Martin,
    Many of us are implementing exchange in a virtual environment, where disk is at a premium.  Spending money on a 1 TB system drive is not a reasonable expectation.  Even if that is best practice, there should be a way to manage the location and
    size of these log files. 
    The inability to manage the location of log/performance data, for centralization or other reasons, is one more example of the disconnect between MS and their customers' use cases in regard to 2013. 
    These are the requirements for 2013 from MS:
    At least 30 GB on the drive on which you install Exchange
    An additional 500 MB of available disk space for each Unified Messaging (UM) language pack that you plan to install
    200 MB of available disk space on the system drive
    A hard disk that stores the message queue database, with at least 500 MB of free space.
    So your statement is invalid, as 30 GB is not even close to enough to handle the amount of log and performance data that we cannot modify in any way. 
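    For reference, a rough way to inspect these collector sets and the service that keeps recreating them, from an elevated prompt (the service short name below is an assumption on my part; verify it in the Services console):
    rem list all data collector sets, then show the details of the Exchange ones
    logman query
    logman query "ExchangeDiagnosticsDailyPerformanceLog"
    rem the Microsoft Exchange Diagnostics service manages these logs, so manual stops/deletes will not stick
    sc query MSExchangeDiagnostics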

  • Cannot start second instance of collector

    Hi everyone
    I've got a strange one here. I noticed a workstation that hasn't been scanned for 6 months (WinXP SP3, ZCM 10.3.1).
    When I check the ZAA window, it says Last Scanned is today, but Last Upload is June 2.
    I've found several errors 'Cannot start second instance of collector' in the local colw32.log.
    However, the workstation gets rebooted daily and I can't find another instance of colw32.exe running in Task Manager.
    How can I resolve this? I can't tell how many clients are affected. However, I've got almost 300 clients (15%) that haven't been scanned in the last 180 days...
    Thanks
    Roland

    Originally Posted by rpfenninger
    Hi everyone
    I've got a strange one here. I noticed a workstation that hasn't been scanned for 6 months (WinXP SP3, ZCM 10.3.1).
    When I check the ZAA window, it says Last Scanned is today, but Last Upload is June 2.
    I've found several errors 'Cannot start second instance of collector' in the local colw32.log.
    However, the workstation gets rebooted daily and I can't find another instance of colw32.exe running in Task Manager.
    How can I resolve this? I can't tell how many clients are affected. However, I've got almost 300 clients (15%) that haven't been scanned in the last 180 days...
    Thanks
    Roland
    We had a similar issue; the inventory was never uploaded to the server. To solve it we had to manually do a "Scan Now" from the ZENworks agent -> Show properties -> Inventory -> Scan Now. After we did that once on the workstation, the scheduled scans started to upload again.
    I guess it would probably also work if you do an Inventory scan from the quick tasks in ZCC, but I never tried it, so I'm not sure.
    Thomas

  • ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified in windows server 2008 r2

    I am getting "ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified" on Windows Server 2008 R2. I made an application in ASP.NET C# that uses an ODBC connection. When I deployed the application on Windows Server 2008 R2, there
    was no Microsoft ODBC driver shown in the ODBC Data Source Administrator. So I went to C:\Windows\SysWOW64, opened Odbcad32.exe and added the Microsoft ODBC driver for Oracle, but when I run my application I get the following error:
    ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified
    I am using the following connection string:
     <connectionStrings>
    <add name="theconnetion" connectionString="DSN=abdb;UID=abc;PWD=xyz"/>
     </connectionStrings>
    Please guide me on what I should do.

    Did you add a System DSN or a User DSN? If you added a User DSN from your own login, the asp.net application will not be able to use it unless its application Pool in IIS is configured to run under the same credentials that you used for creating
    the DSN. It's better if you add a System DSN.
    Also, be careful to ensure that you are using a 64-bit DSN, unless you configure the application to run in 32-bit mode. If the 64-bit application attempts to use the 32-bit driver you get the same error message "Data source name not found and no default
    driver specified". See this KB article:
    http://support.microsoft.com/kb/942976/en-us
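    For reference, a quick way to check both DSN views and, if needed, let the application pool use a 32-bit driver (the application pool name below is a placeholder):
    rem 64-bit ODBC Data Source Administrator
    C:\Windows\System32\odbcad32.exe
    rem 32-bit ODBC Data Source Administrator
    C:\Windows\SysWOW64\odbcad32.exe
    rem allow the IIS application pool to run 32-bit so it can use a 32-bit driver/DSN
    %windir%\system32\inetsrv\appcmd.exe set apppool "YourAppPool" /enable32BitAppOnWin64:true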

  • Data services with SQL Server 2008 and Invalid time format variable

    Hi all
    Recently we switched from DI on SQL Server 2005 to DS (Data Services) on SQL Server 2008. However, I have run into an odd error in a query that ran successfully in DI.
    I validate my query output using a validation object to fill either the Target table (if it passes) or the Target_Fail table (if it fails). Before sending data to the Target_Fail table, I map the columns to the Target_Fail table using a query. That table has a column called 'ETL_Load_Date', which I fill with a global variable called 'Load_Date'. I set this global variable in a script at the very beginning of the job. It is a date-type variable:
    $Load_Date = to_char(sysdate(),'YYYY.MM.DD');
    When I assign this global variable to a datetime column in my table and run the job using Data Services, I get this error:
    error message for operation <SQLExecute>: <[Microsoft][ODBC SQL Server Driver]Invalid time format>.
    However I didn't have this problem when I was running my job on the SQL Server 2005 using Data Integrator. The strange thing is that, when I debug this job, it runs completely successfully!!
    Could you please help me to fix this problem?
    Thanks for your help in advance.

    Thanks for your reply.
    The ETL_Date is a datetime column and the global variable is of date data type. I have to use the to_char() function to be able to get just the date part of the current system datetime. Earlier I had tried the date_part function, but it returns an int, which didn't work for me.
    I found what the issue was. I don't know why, but there were some little squares next to the name of the global variable which I had mapped to the ETL_Date in the query object! The format and everything was OK, as I had the same mapping in other tables that had worked successfully.
    When I deleted the column in the query object and added it again, my problem was solved.

  • Start Element - Default Date - Error

    Hi,
    We have an NW04s SP7 Portal and our VC Server version is 645.7.1.0.
    1. In some scenarios we want a default list (data from SAP R/3) to be displayed by passing the system date. I tried using the NOW() option, but it is not working. In our report we want to display the absentee list for the system date, and we have a form in which the user can change the date, after which the report is refreshed. The second part is working fine: when the user changes the date it displays the correct data, but the default list for the system date is not displayed. I have used the Start element to invoke the default action by setting the current date using NOW(), but it is not working. Is there any way to debug this (I want to know what value is really passed to the back-end system)?
    Please let me know your thoughts/solutions on the above-mentioned issue.
    Thanks
    Senthil

    Hi Jarrod,
    The NOW() command works fine when I use it for a field in a FORM, but if I use the NOW() option in the Start element it does not work.
    We ran the RFC and SQL trace in R/3; the trace says that the X function module is called, but it does not show the values which are passed.
    Can you guide us on the debugging part? Does anything have to be done in R/3, like increasing the trace level, etc.?
    Thanks
    Senthil

  • Date Format in Zreport similar to the default date format set using su01.

    Dear All,
    I have a requirement to change the date format in a Z report to the default date format that is set for the user in the SU01 tcode.
    For example, if the user has set the default date format in SU01 as 'YYYY-MM-DD', then in the report the date should also appear as '2009-12-24'.
    Is there any code or function module to format the date in the report according to the user's default date format?
    Can you help me with this?
    Bye.....
    Cheers
    Christina.

    Try the following code.
    First, select the date format of the current user from the USR01 table:
        SELECT SINGLE datfm FROM usr01
        INTO w_datfm
        WHERE bname = sy-uname.
    Then format the date based on the current user's settings; put this code inside a form routine for reusability.
        IF w_datfm = '1'.
          CONCATENATE w_date+6(2) c_dot w_date+4(2) c_dot w_date+0(4) INTO outtab-value.
          CONDENSE outtab-value NO-GAPS.
          CONCATENATE w_rundate+6(4) w_rundate+3(2) w_rundate+0(2) INTO w_date.
          CONDENSE w_date NO-GAPS.
        ELSEIF w_datfm = '2'.
          CONCATENATE w_date+4(2) '/' w_date+6(2) '/' w_date+0(4) INTO outtab-value.
          CONDENSE outtab-value NO-GAPS.
          CONCATENATE w_rundate+6(4) w_rundate+0(2) w_rundate+3(2) INTO w_date.
          CONDENSE w_date.
        ELSEIF w_datfm = '3'.
          CONCATENATE w_date+4(2) '-' w_date+6(2) '-' w_date+0(4) INTO outtab-value.
          CONDENSE outtab-value NO-GAPS.
          CONCATENATE w_rundate+6(4) w_rundate+0(2) w_rundate+3(2) INTO w_date.
          CONDENSE w_date.
        ELSEIF w_datfm = '4'.
          CONCATENATE w_date+0(4) '-' w_date+4(2) '-' w_date+6(2) INTO outtab-value.
          CONDENSE outtab-value NO-GAPS.
          CONCATENATE w_rundate+0(4) w_rundate+5(2) w_rundate+8(2) INTO w_date.
          CONDENSE w_date NO-GAPS.
        ELSEIF w_datfm = '5'.
          CONCATENATE w_date+0(4) '/' w_date+4(2) '/' w_date+6(2) INTO outtab-value.
          CONDENSE outtab-value NO-GAPS.
          CONCATENATE w_rundate+0(4) w_rundate+5(2) w_rundate+8(2) INTO w_date.
          CONDENSE w_date NO-GAPS.
        ENDIF.

  • Backup Error : The Data Is Invalid Windows Server 2008 R2

    Hello,
    I have a problem when I try to back up partition C: on Windows Server 2008 R2:
    Error: The data is invalid.
    Note: I can make backups of the System Reserved partition.
    I had the same problem 2 months ago (the data is invalid). Back then I followed the advice from this post to delete "Framework64\v2.0.50727\Temporary
    ASP.NET Files", and it worked for me: http://social.technet.microsoft.com/Forums/en-US/windowsbackup/thread/89052b85-9c9b-479f-ab76-a20da3cc4696
    Now I have the same problem again and I deleted the
    Temporary ASP.NET Files, but this time it did not help; I receive the same error on backup:
    "The backup operation that started at '2012-09-20T07:13:26.024163900Z' has failed with following error code '2147942413'. Please review the event details for a solution, and then rerun the backup operation once the issue is resolved."
    When I run vssadmin list writers, everything is OK, "No error".
    I tried to run :
    net stop "System Event Notification Service"
    net stop "COM+ Event System"
    net stop "Microsoft Software Shadow Copy Provider"
    net stop "Volume Shadow Copy"
    cd /d %windir%\system32
    net stop vss
    net stop swprv
    regsvr32 /s ole32.dll
    regsvr32 /s oleaut32.dll
    regsvr32 /s vss_ps.dll
    vssvc /register
    regsvr32 /s /i swprv.dll
    regsvr32 /s /i eventcls.dll
    regsvr32 /s es.dll
    regsvr32 /s stdprov.dll
    regsvr32 /s vssui.dll
    regsvr32 /s msxml3.dll
    net start "System Event Notification Service"
    net start "COM+ Event System"
    net start "Microsoft Software Shadow Copy Provider"
    net start "Volume Shadow Copy"
    But it is not working.
    Any advice?
    Thanks

    Hello Lucian,
    The issue may be caused by an invalid entry under the following registry subtree:
    HKey_Local_Machine\Software\Microsoft\Windows NT\CurrentVersion\ProfileList
    Please open the registry editor with regedit.
    Expand and locate the subtree, and check whether there is an entry with ".bak" appended to it. If so, this may be the cause of the failure when trying to resolve the SID of the writer.
    Please back up the registry key first, then delete the entry with the extra ".bak", reboot the server, and test the backup again.
    Run the command "vssadmin list writers" and make sure all the writers are stable with no errors.
    Please also look for errors in the Application event log after you initiate a backup on the server.
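    A rough command-line version of the steps above (back up first, then inspect; delete the ".bak" entry in regedit only after confirming it is stale; the export path is just an example):
    rem back up the ProfileList key before touching anything
    reg export "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList" C:\temp\ProfileList-backup.reg
    rem look for profile entries ending in .bak
    reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList" /s | findstr /i /l ".bak"
    rem confirm the writers are still healthy afterwards
    vssadmin list writers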
    Sincerely, Asifkhan -- Please mark my post helpful if it was really helpful to you.

  • Problem in creating DATA Model from SQL SERVER 2008 in BI PUBLISHER

    Dear Team,
    I connected BI Publisher to SQL Server 2008, but when creating a report in BI Publisher and building the data model / dataset,
    I select the tables, and when I click on RESULT I get this error:
    [Hyperion][SQLServer JDBC Driver][SQLServer]Invalid object name 'DBNAME.DBO.TABLE'.
    Please help me resolve this problem.
    Thanks,
    Him
    Edited by: h on Aug 22, 2011 6:31 PM

    Hi David,
    The things I said are not a fix for this problem.
    If your RCU installation worked, then you do not have to worry about modifying the createfr.sql.
    Edit:
    I've just tracked down the problem. It appears that when using the query builder, BI Publisher omits the " sign.
    For example, this query will give the Hyperion error:
    select "table"."field"
    from "database.user"."table"
    To correct it, write it like this:
    select "table"."field"
    from "database"."user"."table"
    Edited by: EBA on Nov 14, 2011 10:21 AM

  • Error 18452 "Login failed. The login is from an untrusted domain and cannot be used with Windows authentication" on SQL Server 2008 R2 Enterprise Edition 64-bit SP2 clustered instance

    Hi there,
    I have a Windows 2008 R2 Enterprise x64 SP2 cluster which has 2 SQL Server 2008 R2 Enterprise Edition x64 SP2
    instances.
    A domain account "Domain\Login" is administrator on both physical nodes and "sysadmin" on both SQL Server instances.
    Currently both instances are running on same node.
    While logging on to SQL Server instance 2 through "Domain\Login" using "IP2,port2", I get error 18452 "Login failed. The login is from an untrusted domain and cannot be used with Windows authentication". This happened in the past
    as well, but the issue was resolved after installation of SQL Server 2008 R2 SP2. It has now re-occurred. However, it connects using 'SQLVirtual2\Instance2' without issue.
    Same login with same rights is able to access Instance 1 on both 'SQLVirtual1\Instance1' and "IP1,port1" without any issue.
    Please help resolve the issue.
    Thanks,
    AY

    Hello,
    I confirm that I encountered the same problem when the first domain controller was down.
    While the first domain controller was restarting, I tried to fail over my SQL Server instance to the second node; after that, SQL Server logins could authenticate, but Windows logins returned error 18452.
    When the first DC finished restarting, everything was OK again.
    The question here: why didn't the clustered instance use the second DC?
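    For what it's worth, a rough way to check which domain controller the cluster node is actually using (replace yourdomain.com with your domain name):
    rem which DC the node is currently using
    nltest /dsgetdc:yourdomain.com
    rem query and verify the secure channel to the domain
    nltest /sc_query:yourdomain.com
    nltest /sc_verify:yourdomain.com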
    Best Regards     
    J.K

  • SQL Developer Data Modeler for SQL Server 2008

    I am not able to connect to my SQL Server 2008 from the SQL Developer Data Modeler. Although I have jtds-1.2.jar on my machine and I can connect to the SQL Server through SQL Developer, I'm still not able to connect through the Data Modeler. I need to reverse-engineer and generate a data model for some existing schemas.
    Here is what I'm following:-
    File->Data Modeler -> import -> Data Dictionary -> Add new connection -> JDBC ODBC Bridge -> Other Third Party Driver
    Now, when I supply the JDBC URL and the driver, it throws an error message stating that the driver could not be found.
    Please let me know what I can do to solve this, any help would be appreciated.
    Regards,
    AVA

    I'd first try to connect to the db from SQL Developer (through jTDS, no ODBC involved) and see if you can browse your db and issue SQL statements in a worksheet. Then export that connection in XML format and import it from the Modeler.
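    If it helps, the usual jTDS settings for such a third-party JDBC connection look like this (host, port and database name are placeholders):
    driver class: net.sourceforge.jtds.jdbc.Driver
    JDBC URL:     jdbc:jtds:sqlserver://yourhost:1433/yourdatabase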

  • Creating Data Source to SQL Server 2008 in SharePoint Designer 2013

    Hello,
    I have been trying to create a Data Source connection to a SQL Server 2008 database. I use a custom string, choose the table to display and hit OK. When I try to click on the connection to edit it, I get the following error.

    Hi Derek,
    According to your description, my understanding is that the error occurred when you edited the Data Source connected to SQL Server.
    How did you create the Data Source connection to SQL Server using the custom string?
    I recommend connecting to the database by saving the user name and password, to see whether the issue still occurs.
    More information is provided in the link below:
    http://office.microsoft.com/en-us/sharepoint-designer-help/add-a-database-as-a-data-source-HA010355745.aspx
    Best regards.
    Thanks
    Victoria Xia
    TechNet Community Support

  • Cannot start Web Cache in Oracle 10g App Server

    Hi,
    Our application runs on Oracle 10g Application Server.
    When I try to start Web Cache using the command
    opmnctl startproc ias-component=WebCache process-type=WebCache
    it shows the error:
    opmn id=oracle10g:6200
    no enabled components for this request
    When I try to enable Web Cache using Enable/Disable Component in Oracle 10g Enterprise Manager, I can't see the WebCache component to enable or disable it.
    When I run opmnctl status, it shows:
    --------------------+--------------------+---------+---------
    ias-component       | process-type       |   pid   | status
    --------------------+--------------------+---------+---------
    HTTP_Server         | HTTP_Server        |  7276   | Alive
    LogLoader           | logloaderd         |   N/A   | Down
    dcm-daemon          | dcm-daemon         |   N/A   | Down
    OC4J                | home               |  7684   | Alive
    OC4J                | rhs                |   N/A   | Down
    OC4J                | instaremit         |  5880   | Alive
    OC4J                | insta_test_apr_02  |  7100   | Alive
    Web Cache is not running, and since WebCache is not listed in Oracle 10g Enterprise Manager, I can't enable or disable it.
    Please help me with this.
    Regards,
    Deepak.C

    Hi,
    Thanks again!
    Yes, my Oracle HTTP Server is listening on port 80.
    I am not clear on this; may I know why the ports should be changed for requests to go through Web Cache?
    Can I assign port 80 to Web Cache and 7778 to the HTTP Server?
    Also, in Oracle Enterprise Manager, Web Cache is not available;
    home, HTTP Server, management - all these are there.
    Also, for the HTTP Server no metric info is shown ('Not Yet Available' appears even though the HTTP Server is running). How can I get metric info about Oracle HTTP Server? I have also used dmstool -t ohs_server, and it is not working;
    it shows: FLEXMON ERROR : detected invalid table name ohs_server
    dmstool -l -t also doesn't list ohs_server.
    Please help me with this.
    Deepak.C

  • Uploading data simultaneously in sql server 2008

    Hi All,
    I have a scenario in which multiple users load different data into a tool. Until now my scripts allowed only one user to load data at a time. Now the tool should allow many users to upload their data simultaneously (or at least give the illusion that the data is being
    uploaded at the same time).
    If tables are locked during an import, then as soon as a table is unlocked the next user's data should start processing.
    If table locking must occur, I should lock the table until it is finished being updated. This is the flow: the first user loads an Excel spreadsheet containing the data into the tool. Once it is loaded and Import is clicked, the staging tables
    Assetstaging --------> Centerlinestaging -------> Misc_asset are populated first; these are the main tables that populate first. Then my procedures get executed. At this point I need to restrict access, i.e. user 1 loads the data and the main tables populate while, at the same time, user
    2 loads data. How should I handle this kind of scenario? I got a hint that I should use locking, but I am not familiar with it. Is there any way I can do this? Please help me; I am new to this.
    Deepa

    1. First, the application accepts an Excel spreadsheet containing bulk data.
    2. Once the import is done, the staging tables are loaded first. Until now only one user was able to load data; the new requirement is that multiple users should load different data, so I need to lock the staging tables.
    I don't have any idea about locking. How can I lock the staging table? What should I do when other users are also loading data at the same time?
    Below is the staging table structure
    CREATE TABLE dbo.Misc_AssetTypeStaging
    (
        Point                 int                NULL,
    Misc_AssetType        varchar(100)       NULL,
    BeginMilepostPrefix   varchar(5)         NULL,
    BeginMilepostSuffix   varchar(5)         NULL,
    BeginMilepost         decimal(15, 8)     NULL,
    BeginTrackName        varchar(20)        NULL,
    BeginLatitude         decimal(15, 8)     NULL,
    BeginLongitude        decimal(15, 8)     NULL,
    BeginElevation        decimal(15, 8)     NULL,
    EndLatitude           decimal(15, 8)     NULL,
    EndLongitude          decimal(15, 8)     NULL,
    EndElevation          decimal(15, 8)     NULL,
    EndMilepostPrefix     varchar(5)         NULL,
    EndMilepostSuffix     varchar(5)         NULL,
    EndMilepost           decimal(15, 8)     NULL,
    EndTrackName          varchar(20)        NULL,
    SubdivisionName        varchar(20)       NULL,
    Direction             varchar(16)        NULL,
        Speed                 int                NULL,
    TrainType             varchar(20)        NULL,
    Operator              varchar(2)         NULL,
    QualifierType         varchar(15)        NULL,
    RestrictionParameter  int                NULL,
    RestrictionType       varchar(20)        NULL,
    FBARDirection        varchar(16)        NULL,
    SignalAuthorityType  varchar(16)        NULL,
    YardLimits           varchar(1)         NULL,
    CabSignalType        varchar(16)        NULL,
    PTCType              varchar(15)        NULL
    );
    Please help me. Thanks in advance.
    Deepa

  • Cannot get gpo to work on the domain - server 2008 standard r2

    gpreport.txt attached. Thanks for the tip; I was in the wrong directory - as it is a VM, it does that when opening cmd! 

    Please can I get some help with this issue? It's been going on for weeks. No matter what I do, I cannot get this simple GPO to install the Adobe Flash Player ActiveX onto a test PC in an OU. It works fine in my Hyper-V lab environment, so it must be the messed-up way it's set up on the real domain, where someone else set it up before me. I've attached 5 screenshots joined together as 1 pic. Please help me understand this situation; I've read so much about Group Policy and still cannot work this simple task out. I'm having to get around software installs in bulk by using psexec, and I'd rather do it through GP.
    There's a pic of a working GPO called Document Redirection that I put on here for reference; the one I can't get working is the Adobe Flash Player ActiveX one. Also, in the lab environment I haven't had to add anything in the user settings, just the...
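    In case it helps, the usual first checks on the test PC are below (the report path is just an example); note that GPO software installation is applied at computer startup, so reboot the test PC after the policy refresh:
    gpupdate /force
    gpresult /h C:\temp\gpreport.html
    rem open the report and confirm the software installation GPO appears under Applied GPOs for the computer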
