Utility data collection job failure on SQL Server 2008
Hi,
I am facing a data collection job failure (Utility Data Collection) on a SQL Server 2008 server. The error message is:
<service Name>. The step did not generate any output. Process Exit Code 5. The step failed.
The job name is collection_set_5_noncached_collect_and_upload. From searching Google, the issue appears to be permission-related, but I cannot tell where exactly the access problem is; this job runs under a proxy account. Thanks in advance.
Hi Srinivas,
Based on your description, you encounter the error message after configuring data collection in SQL Server 2008. For further analysis, could you please help collect detailed log information? You can check the job history to find the error log around the time of the issue, as mentioned in this article. Also please check the Data Collector logs by right-clicking on Data Collection in the Management folder and selecting View Logs.
In addition, as you noted, exit code 5 is normally an 'Access is denied' code. So please make sure that the proxy account has admin permissions on your system, and ensure that the SQL Server service account has rights to access the cache folder.
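If it helps, the job's underlying error text can also be pulled straight from the Agent job history tables (a sketch; the job name below matches the one you posted):

```sql
SELECT j.name, h.run_date, h.run_time, h.step_name, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = 'collection_set_5_noncached_collect_and_upload'
ORDER BY h.instance_id DESC;
```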
Thanks,
Lydia Zhang
Similar Messages
-
How to export data with column headers in SQL Server 2008 with the bcp command?
Hi all,
I want to know how to export data with column headers in SQL Server 2008 using the bcp command. I know how to move data with the Import and Export Wizard. When I try to export data with the bcp command, the data is copied but the column names do not come across.
I am using the below query:-
EXEC master..xp_cmdshell
'BCP "SELECT * FROM [tempdb].[dbo].[VBAS_ErrorLog] " QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
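For the header row itself, one common workaround is to UNION the column names onto the data; every column has to be cast to a character type so the union is valid. A sketch (the column names below are assumptions; substitute the real columns of VBAS_ErrorLog):

```sql
EXEC master..xp_cmdshell
'BCP "SELECT ''ErrorId'', ''ErrorMessage'' UNION ALL SELECT CAST(ErrorId AS varchar(20)), ErrorMessage FROM [tempdb].[dbo].[VBAS_ErrorLog]" QUERYOUT "D:\Temp\SQLServer.log" -c -t , -T -S SERVER-A'
```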
Thanks,
SAAD.

Hi All,
I have done as per your suggestion, but I am facing the problem below: the PRINT statement produces the correct query, but EXEC master..xp_cmdshell @BCPCMD displays the error message shown below.
DECLARE @BCPCMD nvarchar(4000)
DECLARE @BCPCMD1 nvarchar(4000)
DECLARE @BCPCMD2 nvarchar(4000)
DECLARE @SQLEXPRESS varchar(50)
DECLARE @filepath nvarchar(150), @SQLServer varchar(50)

SET @filepath = N'"D:\Temp\LDH_SQLErrorlog_' + CAST(YEAR(GETDATE()) as varchar(4))
    + RIGHT('00' + CAST(MONTH(GETDATE()) as varchar(2)), 2)
    + RIGHT('00' + CAST(DAY(GETDATE()) as varchar(2)), 2) + '.log" '

SET @SQLServer = (SELECT @@SERVERNAME)

SELECT @BCPCMD1 = '''BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
SELECT @BCPCMD2 = '-c -t , -T -S ' + @SQLServer

SET @BCPCMD = @BCPCMD1 + @filepath + @BCPCMD2

Print @BCPCMD
-- Print output below:
'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername'

EXEC master..xp_cmdshell @BCPCMD

''BCP' is not recognized as an internal or external command,
operable program or batch file.
NULL
If I copy the PRINT output, as below, and execute it in CMD, it works fine. Could you please suggest what the problem is in the query above?

EXEC master..xp_cmdshell
'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT "D:\Temp\LDH_SQLErrorlog_20130313.log" -c -t , -T -S servername '
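For what it's worth, the failure appears to come from the stray leading quote in @BCPCMD1: because '' inside a T-SQL string literal is an escaped single quote, the command handed to cmd.exe begins with 'BCP rather than BCP, which is exactly why cmd reports ''BCP' is not recognized. A corrected sketch (untested here; object and path taken from the post above):

```sql
DECLARE @BCPCMD nvarchar(4000), @filepath nvarchar(150), @SQLServer varchar(50)

SET @filepath = N'"D:\Temp\LDH_SQLErrorlog_'
    + CAST(YEAR(GETDATE()) as varchar(4))
    + RIGHT('00' + CAST(MONTH(GETDATE()) as varchar(2)), 2)
    + RIGHT('00' + CAST(DAY(GETDATE()) as varchar(2)), 2) + '.log" '

SET @SQLServer = @@SERVERNAME

-- No leading quote before BCP: xp_cmdshell passes the string straight to cmd.exe
SET @BCPCMD = 'BCP "SELECT * FROM [tempdb].[dbo].[wErrorLog] " QUERYOUT '
    + @filepath + '-c -t , -T -S ' + @SQLServer

EXEC master..xp_cmdshell @BCPCMD
```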
Thanks, SAAD. -
I am facing a strange SQL exception:-
The code flow is like this:
.Net 4.0 --> Entity Framework --> SQL 2008 ( StoredProc --> Function {Exception})
In the SQL Table-Valued Function, I am selecting a column (nvarchar(50)) from an existing table and (after some filtration using inner joins and where clauses) inserting the values in a Table Type Object having a column (nvarchar(50))
This flow was working fine in SQL 2008, but now all of a sudden the insert into @TableType is throwing a "string or binary data would be truncated" exception.
Insert Into @ObjTableType
Select * From dbo.Table
The max length of data in the source column is 24 but even then the insert statement into nvarchar temp column is failing.
Moreover, the same issue came up a few weeks back and I was unable to find the root cause; back then it started working properly after a few hours
(the issue was reported at 10 AM EST and was automatically resolved after 8 PM EST). No refresh activity was performed on the database.
This time, however, the issue is still occurring (even after 2 days), but it does not occur in every scenario. The data set for which the error is thrown is valid, and every value in the function is fetched from existing tables.
Due to its sporadic nature, I am unable to recreate it now, and I am still unable to determine why it started or how I can prevent it from happening again.
It is difficult to even explain the weirdness of this bug, but any help or guidance in finding the root cause will be very helpful.
I also tried using nvarchar(max) in the table type object, but it didn't work.
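One quick check that can narrow this down is to measure the widest value actually flowing into the nvarchar(50) columns: if anything exceeds 50 characters, the CONVERT is the culprit; if nothing does, a join mismatch or plan change becomes more likely. A sketch (table and column names are assumed from the repro script):

```sql
SELECT MAX(LEN(psr.ENumber)) AS MaxENumber,
       MAX(LEN(psr.CNumber)) AS MaxCNumber
FROM PSRow psr
WHERE psr.PID = 483;
```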
Here is a code similar to the function which I am using:
BEGIN TRAN

DECLARE @PID int = 483

DECLARE @retExcludables TABLE
(
    PID int NOT NULL,
    ENumber nvarchar(50) NOT NULL,
    CNumber nvarchar(50) NOT NULL,
    AId uniqueidentifier NOT NULL
)

declare @PSCount int;
select @PSCount = count('x')
from tblProjSur ps
where ps.PID = @PID;

if (@PSCount = 0)
begin
    return;
end;

declare @ExcludableTempValue table (
    PID int,
    ENumber nvarchar(max),
    CNumber nvarchar(max),
    AId uniqueidentifier,
    SIds int,
    SCSymb nvarchar(10),
    SurCSymb nvarchar(10)
);

with SurCSymbs as (
    select ps.PID, ps.SIds, csl.CSymb
    from tblProjSur ps
    right outer join tblProjSurCSymb pscs
        on pscs.tblProjSurId = ps.tblProjSurId
    inner join CSymbLookup csl
        on csl.CSymbId = pscs.CSymbId
    where ps.PID = @PID
),
AssignedValues as (
    select psr.PID,
        psr.ENumber,
        psr.CNumber,
        psmd.MetaDataValue as ClaimSymbol,
        psau.UserId as AId,
        psus.SIds
    from PSRow psr
    inner join PSMetadata psmd
        on psmd.PSRowId = psr.SampleRowId
    inner join MetaDataLookup mdl
        on mdl.MetaDataId = psmd.MetaDataId
    inner join PSAUser psau
        on psau.PSRowId = psr.SampleRowId
    inner join PSUserSur psus
        on psus.SampleAssignedUserId = psau.ProjectSampleUserId
    where psr.PID = @PID
        and mdl.MetaDataCommonName = 'CorrectValue'
        and psus.SIds in (select distinct SIds from SurCSymbs)
),
FullDetails as (
    select asurv.PID,
        Convert(NVarchar(50), asurv.ENumber) as ENumber,
        Convert(NVarchar(50), asurv.CNumber) as CNumber,
        asurv.AId,
        asurv.SIds,
        asurv.CSymb as SCSymb,
        scs.CSymb as SurCSymb
    from AssignedValues asurv
    left outer join SurCSymbs scs
        on scs.PID = asurv.PID
        and scs.SIds = asurv.SIds
        and scs.CSymb = asurv.CSymb
)
--Error is thrown at this statement
insert into @ExcludableTempValue
select *
from FullDetails;

with SurHavingSym as (
    select distinct est.PID,
        est.ENumber,
        est.CNumber,
        est.AId
    from @ExcludableTempValue est
    where est.SurCSymb is not null
)
delete @ExcludableTempValue
from @ExcludableTempValue est
inner join SurHavingSym shs
    on shs.PID = est.PID
    and shs.ENumber = est.ENumber
    and shs.CNumber = est.CNumber
    and shs.AId = est.AId;

insert @retExcludables(PID, ENumber, CNumber, AId)
select distinct est.PID,
    Convert(nvarchar(50), est.ENumber) ENumber,
    Convert(nvarchar(50), est.CNumber) CNumber,
    est.AId
from @ExcludableTempValue est

RETURN

ROLLBACK TRAN
I have tried converting the columns and have also validated the input data set for white spaces or special characters.
For the same input data, it was working fine until yesterday, but suddenly it started throwing the exception.

Remember, the CTE isn't executing the SQL exactly in the order you read it as a human (don't get too picky about that statement; it's at least partly true), nor are the line numbers or error messages easy to read: a mismatch in any of the joins leading up to your insert could be the cause too. I would suggest posting the table definition/DDL for:
- PSMetadata, in particular PSRowID, but just post it all
- tblProjectSur, in particular columns CSymbID and TblProjSurSurID
- cSymbLookup, in particular column CSymbID
- PSRow, in particular columns SampleRowID and PID
- PSAUser and PSUserSur, in particular all the UserID and RowID columns
- SurCSymbs, in particular column SIDs
Also, run a diagnostic query along these lines, repeated for each of your tables and each of the columns used in joins leading up to your insert:

Select count(asurv.sid) as ctAll,
       count(case when asurv.sid between 0 and 9999999999 then 1 else null end) as ctIsANumber
from SurvCsymb asurv

The sporadic nature would imply that the optimizer usually chooses one path to the data, but sometimes others; and the fact that it occurs during the insert could be irrelevant: any of the preceding joins could be the cause, not the data targeted to be inserted. -
Best way to migrate SharePoint 2003 data into SQL Server 2008
Hi Experts,
I am planning to migrate data from SharePoint 2003 into SQL Server 2008. After the migration, the SharePoint site will be deleted in a couple of months, and that data will then be fed into a .NET front-end application.
1) What is the best and easy way to do it?
2) Is there any way to automate the migration process? i.e. If a new record gets entered into SharePoint it should be populated into SQL Server 2008.
3) Any other suggestions
Thanks,
John

Dear John,
If it's just a few lists, and you just want to import them "as-is", then it should be possible to do so ... and survive to tell the tale about it ;-)
Generally speaking, you will need to write a small process (program or script) to read/parse each list and check whether the item(s) are in the target table (I am assuming that there is a distinct target table for each list, and that each list has something you can use to distinguish each row); if an item is not there, just add it according to your needs.
Then just rerun the process periodically and it would keep your databases up to date (you could even set it up to update records that have changed, but that would slow your process significantly).
What I just described is doable, and not TOO complicated; it could be done in a lot of different ways, with different programming/scripting languages. You could certainly do it in any flavor of .NET language, or even PowerShell.
As I mentioned, this is speaking in general, the actual implementation would depend on your specific needs and the kind of data that you have/need to keep.
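The "check if present, then add" step described above maps naturally onto a T-SQL MERGE, which is available in SQL Server 2008. A minimal sketch, assuming a staging table loaded from the list export and a ListItemId column that uniquely identifies each list item (both names are assumptions):

```sql
MERGE dbo.TargetTable AS t
USING dbo.ListStaging AS s
    ON t.ListItemId = s.ListItemId
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ListItemId, Title, Modified)
    VALUES (s.ListItemId, s.Title, s.Modified)
WHEN MATCHED AND s.Modified > t.Modified THEN
    -- optional: also pick up changed records, at the cost of extra work
    UPDATE SET t.Title = s.Title, t.Modified = s.Modified;
```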
Best Regards / Saludos, Marianok
Disclaimer: This post, and all included code and information is provided "AS IS" with no warranties or guarantees and confers no rights. Try it at your own risk, I take no responsibilities.
-
Creating Data Source to SQL Server 2008 in SharePoint Designer 2013
Hello,
I have been trying to create a Data Source connection to a SQL Server 2008 database. I use a custom string and I choose the table to display and hit OK. When I try to click on the connection to edit it, I get the following error.

Hi Derek,
According to your description, my understanding is that the error occurred when you edited the Data Source connected to SQL Server.
How did you create the Data Source connected to SQL server using custom string?
I recommend to connect to the database by saving the user name and password to see if the issue still occurs.
More information is provided in the link below:
http://office.microsoft.com/en-us/sharepoint-designer-help/add-a-database-as-a-data-source-HA010355745.aspx
Best regards.
Thanks
Victoria Xia
TechNet Community Support -
Data Collection job collection_set_2_upload failed
We had a standalone server running SQL Server 2008 R2. I had configured the Data Collection job and uploaded the data into our Management Data Warehouse. It worked fine. One week ago, the standalone server was rebuilt, and then I found that all 4 jobs didn't work.
collection_set_2_collection
collection_set_2_upload
collection_set_3_collection
collection_set_3_upload
I re-configured both the Management Data Warehouse and Data Collection. Now collection_set_2_collection and collection_set_3_collection work fine. However, collection_set_2_upload and collection_set_3_upload fail with the error "the step did not generate any output". I cleaned up all DataCollectorCache files on the standalone server, but the error remains.
Any idea? Thank you for any kind of suggestion.
Charlie He

Hi Charlie,
Based on my understanding, you configured the Data Collection job. Then the two upload jobs didn't work, with the error "the step did not generate any output", and the error persisted after cleaning up the Data Collector cache files.
New data cannot be uploaded to the Management Data Warehouse database when one or more Data Collector cache files are corrupted. The corruption may be caused for one of the following reasons:
Data Collector encountered an exception.
The disk runs out of free space while Data Collector is writing to a cache file.
A firmware or a driver problem occurs.
So I suggest you double-check that the Data Collector cache files are cleaned up completely. For more information, please refer to this KB article:
http://support.microsoft.com/kb/2019126.
If the issue persists, please check the agent job history to find the error log around the time the issue occurred. It would be better if you could provide detailed error information for our further analysis. For how to check agent job history, please refer to this link:
http://msdn.microsoft.com/en-us/library/ms181046(v=sql.110).aspx.
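Before cleaning the cache again, it may also be worth confirming where the collector cache actually lives; the location is stored in the collector configuration (a sketch; an empty CacheDirectory means the default %TEMP% location is used):

```sql
SELECT parameter_name, parameter_value
FROM msdb.dbo.syscollector_config_store
WHERE parameter_name = 'CacheDirectory';
```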
Best regards,
Qiuyun Yu -
Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion
First of all...I have a headache!
Found LOTS of Google hits when trying to data pump a .xlsx File into a SQL Server Table. And the whole discussion of the Microsoft ACE 64-Bit Driver or the Microsoft Jet 32-Bit Driver.
Specifically receiving this error...
An OLE DB record is available. Source: "Microsoft Office Access Database Engine" Hresult: 0x80004005 Description: "External table is not in the expected format.".
Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager"
failed with error code 0xC0202009.
Strangely enough, if I simply data pump ONE .xlsx file into a SQL Server table using my SSIS package, it seems to work fine. If instead I try to be proactive and allow for multiple .xlsx files by using a Foreach Loop Container and a variable
@[User::FileName], it errors out...but not really, because it is indeed storing the rows in the SQL Server table. I did check all my Delay
Why does this have to be sooooooo difficult???
Can anyone help me out here in trying to set up an SSIS package in a rather constrictive environment to pump a .xlsx file into a SQL Server table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error so that it stops erroring out?

Hi ITBobbyP,
According to your description, when you import data from a .xlsx file into a SQL Server database, you get the error message.
The error can be caused by the following reasons:
The Excel file is locked by other processes. Please resave the file under a different name to see if the issue is fixed.
The ACE (Access Database Engine) provider is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from the link:
https://www.microsoft.com/en-us/download/details.aspx?id=13255.
The bitness of Office and the server do not match. To solve the problem, please refer to the following document:
http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
If you have any more questions, please feel free to ask.
Thanks,
Wendy Fu
TechNet Community Support -
Hi,
I have to implement the following scenario in SSIS but don't know how to do since I never worked with SSIS before. Please help me.
I have 20 different text files in a single folder and 20 corresponding tables in a SQL Server 2008 R2 database, one per text file. I need to extract the data from each text file and load it into the corresponding table. Please guide me on how many ways I can do this and which is the best way to implement the job; I also have to automate it. A few files share the same format (same column names and datatypes), while the others do not.
1. Do I need to create 20 different projects ?
or
Can I implement this in only one project by having 20 packages?
or
Can I do this in one project with only one package?
Thanks in advance.

As I said, I don't know how to use the Object data type; I just gave it a shot as below. I know the following code has errors; can you please correct it for me?
Public Sub Main()
    ' Collect the DATE values found in each input file.
    ' (Requires "Imports System.IO" at the top of the script.)
    Dim s1 As StreamReader
    Dim date1(2) As String      ' was "Dim date1 As Object"; an array is needed to hold one date per file
    Dim rline As String
    Dim Filelist(1) As String
    Dim FileName As String
    Dim i As Integer

    i = 1
    Filelist(0) = "XYZ"
    Filelist(1) = "123"

    For Each FileName In Filelist
        s1 = File.OpenText(FileName)    ' the separate File.OpenRead stream was unused and has been removed
        rline = s1.ReadLine
        While Not rline Is Nothing
            If Left(rline, 4) = "DATE" Then
                date1(i) = Mid(rline, 7, 8)
                i = i + 1
                Exit While
            End If
            rline = s1.ReadLine
        End While
        s1.Close()
    Next

    Dts.Variables("date").Value = date1(1)
    Dts.Variables("date1").Value = date1(2)
    Dts.TaskResult = ScriptResults.Success
End Sub
-
Data services with SQL Server 2008 and Invalid time format variable
Hi all
Recently we switched from DI on SQL Server 2005 to DS (Data Services) on SQL Server 2008. However, I have faced an odd error in a query that I was running successfully in DI.
I validate my query output using a validation object to fill either the Target table (if it passes) or the Target_Fail table (if it fails). Before sending data to the Target_Fail table, I map the columns using a query to the Target_Fail table. I have a column called 'ETL_Load_Date' in that table, which I fill with a global variable called 'Load_Date'. I set this global variable in the script at the very beginning of the job. It is a date variable type:
$Load_Date = to_char(sysdate(),'YYYY.MM.DD');
When I assign this global variable to a datetime data type column in my table and run the job using Data Services, I get this error:
error message for operation <SQLExecute>: <[Microsoft][ODBC SQL Server Driver]Invalid time format>.
However, I didn't have this problem when running the job on SQL Server 2005 using Data Integrator. The strange thing is that when I debug the job, it runs completely successfully!
Could you please help me to fix this problem?
Thanks for your help in advance.

Thanks for your reply.
The ETL_Date is a datetime column and the global variable is a date data type. I have to use the to_char() function to get just the date part of the current system datetime. Earlier I had tried the date_part function, but it returns an int, which didn't work for me.
I found what the issue was. I don't know why, but there were some little squares next to the name of the global variable that I had mapped to the ETL_Date in the query object! The format and everything else was OK, as I had the same mapping in other tables that had worked successfully.
When I deleted the column in the query object and added it again, my problem was solved. -
Data sources (ODBC) config -MS SQL Server linking
OS -WS2008 R2-64
I realize that the problems I'm experiencing are not everyday activities and fall quite astray of regular Oracle DBA work.
There is no other place to post but here, as the MS SQL Server forums have not been able to provide any solution so far.
I hope somebody could have experienced similar issues and could have solutions or at least to point to the right direction.
The box has a MS SQL Server 2008 that has to be linked with an Oracle 10g database on a different box(SunOS 5.10).
The source for the ODBC drivers supposed to be Oracle Client + ODAC.
The simple scenario of linking SQL Server to Oracle developed into an ugly battle to get the Oracle drivers recognized by SQL Server.
Numerous installations and de-installations of Oracle Client 10g + ODAC and Oracle Client 11 + ODAC led nowhere.
The major points of failure are:
1. The Oracle MTS Recovery Service gets broken and cannot start, so the installation of ODAC fails.
Following the Metalink note "Recreating Oracle MTS Recovery Service [ID 836137.1]"
fails even with a domain admin account: "access denied".
2. In cases where the Oracle Client + ODAC install successfully:
- no effort could make the Oracle drivers recognized by SQL Server.
Even with Data Sources (ODBC) configured and a successful connection test against the Oracle databases, there is no way to get OraOLEDB.Oracle listed in the SQL Server dropdown window of available drivers.
3. Data Sources (ODBC) fails to start, reporting bogus reasons.
As there are 32-bit and 64-bit applications (ODBC managers) and corresponding (or not) drivers, there is reported confusion across the forums about which drivers are for which OS.
The bottom line is that linking fails (something I have done successfully before).
I have the feeling there is something wrong with our environments, but have no way to prove it.
I’m posting with a little hope for help.
Thx,

WS2008 is 64-bit
MS SQL Server 2008 is 64-bit
Solaris is 64-bit
Oracle is 64-bit
Oracle's OLE DB - as OraOLEDB.Oracle
or Microsoft OLE DB Provider for ODBC Drivers (MSDASQL) .
The road block is that I cannot configure a Data Sources (ODBC) System DSN with either OraOLEDB.Oracle or MSDASQL, because they are not listed in the list of available drivers!
The source for OraOLEDB.Oracle has to be the Client 11g + ODAC installation - to no avail.
The source for MSDASQL should be the OS itself, as the driver is built in - to no avail.
(The SQL Server recognizes MSDASQL - it is listed under Linked Servers providers.)
As per information from Microsoft and other forums, the mess stems from 32- vs 64-bit applications and drivers, and from the way the installations were done.
Even the server administrator advises a fresh installation.
Thx, -
JDeveloper 11.1.1.2.0 - Panel Dashboard using Data from SQL Server 2008
Hi All,
I am using JDeveloper 11.1.1.2.0 and am trying to create a cross-application dashboard. This dashboard would show 2 panels of data from an Oracle database (already completed) and 2 panels of data from a SQL Server 2008 database.
I have successfully installed a SQL Server 2008 database locally on my machine. I have downloaded the JDBC drivers from the Microsoft website and I have two JAR Files:
* sqljdbc.jar
* sqljdbc4.jar
I have created a SQL Server connection within the JDeveloper connections panel and I can successfully query my database.
I have created a new Application Module and set its connection to a JDBC connection with the credentials specified within the Connections tab. I then added the two JAR files as libraries to the Model project and ran the Application Module tester. This successfully shows some records.
Within the View project I created a new page and dropped the View Object on it as a table. When I run the page, I get an error relating to the JDBC driver not being found. I have read a few posts about adding it to the WebLogic lib files and modifying the startup scripts, but can anyone give me a definitive working solution? I also tried changing my Application Module to use a DataSource, but I have the same problem within the WebLogic console (it cannot create a data source for SQL Server as it does not have the drivers).
Any help is greatly appreciated.
Thanks

Thanks for the responses.
I have edited C:\oracle\Middleware\wlserver_10.3\common\bin\commEnv.cmd to add the entries as follows...
set WEBLOGIC_CLASSPATH=%JAVA_HOME%\lib\tools.jar;%BEA_HOME%\utils\config\10.3\config-launch.jar;%WL_HOME%\server\lib\weblogic_sp.jar;%WL_HOME%\server\lib\weblogic.jar;%FEATURES_DIR%\weblogic.server.modules_10.3.2.0.jar;%WL_HOME%\server\lib\webservices.jar;%ANT_HOME%/lib/ant-all.jar;%ANT_CONTRIB%/lib/ant-contrib.jar;%WL_HOME%\server\lib\sqljdbc.jar;%WL_HOME%\server\lib\sqljdbc4.jar
You can now see the additional entries at the end of the classpath.
The Weblogic home is set as follows set WL_HOME=C:\oracle\Middleware\wlserver_10.3
I have placed the files in C:\oracle\Middleware\wlserver_10.3\server\lib
I restarted the server, logged into the WebLogic console and tried to create a datasource with the Microsoft drivers, but still no luck. I gave the Oracle driver for Microsoft SQL Server a go and it seemed to work! I'm guessing that there is a driver bundled in. -
Syspolicy_purge_history job getting failed (SQL Server 2012)
Job Name: syspolicy_purge_history
Step Name: Erase Phantom System Health Records
Duration: 00:00:05
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0
Error Message:
A job step received an error at line 1 in a PowerShell script. The corresponding line is 'import-module SQLPS -DisableNameChecking'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'The following error occurred
while loading the extended type data file:
Microsoft.PowerShell, D:\Program Files (x86)\Microsoft SQL Server\110\Tools\PowerShell\Modules\SQLPS\sqlprovider.types.ps1xml : File skipped because of the following validation exception: AuthorizationManager check failed.

Hi nap_bhatt,
Based on your description, we need to verify whether you get the error when starting the syspolicy_purge_history job manually in SQL Server 2012. If so, which account do you use to log on to the SQL Server instance and run the job?
According to the error message "AuthorizationManager check failed", your account has no permission to execute the PowerShell script in the SQL Server Agent job. To troubleshoot this, please make sure that your account has permissions on the SQLPS folder and rights to execute the PowerShell script.
In addition, please use the “Get-ExecutionPolicy” PowerShell command to check the current status of execution policy, if the execution policy is set to “Restricted”, try to set it to “RemoteSigned” or “Unrestricted”. Please note that you need to set this
execution policy in both the Windows PowerShell and the SQL PowerShell consoles.
For more information about syspolicy_purge_history job in SQL Server 2012, please review this similar blog:
SQL job “syspolicy_purge_history” fails on SQL 2012 .
Thanks,
Lydia Zhang -
Failure installing Sql Server 2012 x64
Hello,
I am trying to install Sql Server 2012 x64 with SP1, but I am running into the following error(s). It occurs almost immediately after I run "setup.exe":
<error>
TITLE: SQL Server Setup failure.
SQL Server Setup has encountered the following error:
There was a failure to initialize a setting from type Microsoft.SqlServer.Configuration.SetupExtension.InstallSharedDirSetting.
Error code 0x8564000E.
For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft%20SQL%20Server&EvtSrc=setup.rll&EvtID=50000&EvtType=0x9F9575BA%25400x38AD03A5
BUTTONS:
OK
</error>
Any ideas what else I can do to get past this?

Well... I don't know when I downloaded the original ISO image file. I re-downloaded it, and this time it extracted; when I ran it, I got the expected SQL Server Installation Center.
-
Bug in SQL Server 2008 R2 for Change Data Capture (CDC)
I'm pretty sure I've encountered a bug in SQL Server 2008 R2 that has been fixed in 2012, regarding changing the design of a database using CDC. With CDC disabled on a table via sys.sp_cdc_disable_table, I can add a column or change a column's data type, but when I call sp_rename, on 2008 R2 I get "Cannot alter column 'xxx' because it is 'REPLICATED'.", even though the table's properties show "Table is replicated" as False. In 2012, it works fine.
Even calling sys.sp_cdc_disable_db didn't prevent this error.

Feel free to file a request on http://connect.microsoft.com
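As a quick diagnostic, the catalog views can show whether SQL Server still flags the column as replicated despite the table properties saying otherwise (a sketch; substitute your table name):

```sql
SELECT name, is_replicated, is_merge_published
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.YourTable');
```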
Balmukund Lakhani | Please mark solved if I've answered your question, vote for it as helpful to help other users find a solution quicker
This posting is provided "AS IS" with no warranties, and confers no rights.
My Blog |
Team Blog | @Twitter
Author: SQL Server 2012 AlwaysOn -
Paperback, Kindle -
Data Committed instead of Rollback after Deadlock Error in SQL Server 2008 R2 (SP2)
We're having a strange issue which is occurring only with one Customer having SQL Server 2008 R2 (SP2).
Basically, we have multiple threads uploading data, and when an error occurs (like a deadlock or any other error), the deadlock victim (process/transaction) is rolled back (from .NET). However, the rollback command is not reaching SQL Server, as it doesn't show in the trace (through SQL Profiler).
To make things worse, not only is the transaction not being rolled back, but the INSERTs executed before the error are somehow being committed, leaving the database in an inconsistent state.
This is only produced in one environment.
Any idea what the issue could be?

All statements are executed within a transaction. Under the same scenario this code works perfectly fine for thousands of customers. Only one customer has this issue.
You need to capture a Profiler trace to check the transaction scope.
Balmukund Lakhani