Slow query on remote server

Hi
I am running the query below:
INSERT INTO tblEvents( <fields list> )
SELECT <fields list>
FROM OPENROWSET('SQLNCLI', 'Server=<ip address>;DATABASE=MyDB;Uid=sa;Pwd=MyPassword;', 'SELECT * FROM tblEvents') AS a
WHERE (ID = 68596)
The problem is that the part of the query below runs very slowly (around 3 minutes to complete) even though it only needs to bring back one row.
SELECT <fields list>
FROM OPENROWSET('SQLNCLI', 'Server=<ip address>;DATABASE=MyDB;Uid=sa;Pwd=MyPassword;', 'SELECT * FROM tblEvents') AS a
WHERE (ID = 68596)
What is the problem and how can I speed it up?
Thanks
Regards

Hi,
as Ronen already mentioned, the syntax you are using will not push any of the conditions down the provider stack to the data source; it pulls the whole table across and then filters it on the client side (where you execute the query).
You can either use the syntax that Ronen already pointed out, OR use a linked server together with the existing provider for SQL Server. In most cases the provider is able to push the conditions down to the data source, have them executed there, and bring back
only the relevant rows. This depends on the query you are using and on the ability of the driver to translate the conditions.
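For example, here is a minimal sketch of both approaches (the linked server name REMOTESRV is a placeholder; the field list, IP address and ID value are the ones from your post):
-- Option 1: embed the filter in the remote query text, so only the matching row crosses the wire
SELECT <fields list>
FROM OPENROWSET('SQLNCLI', 'Server=<ip address>;DATABASE=MyDB;Uid=sa;Pwd=MyPassword;',
    'SELECT * FROM tblEvents WHERE ID = 68596') AS a
-- Option 2: create a linked server once (plus a login mapping via sp_addlinkedsrvlogin), then
-- query it with a four-part name, where the provider can usually push the WHERE clause down
-- to the remote server, or with OPENQUERY, which always runs the query text remotely
EXEC sp_addlinkedserver @server = N'REMOTESRV', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = N'<ip address>'
SELECT <fields list>
FROM REMOTESRV.MyDB.dbo.tblEvents
WHERE ID = 68596
SELECT <fields list>
FROM OPENQUERY(REMOTESRV, 'SELECT * FROM tblEvents WHERE ID = 68596')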
See more on creating linked servers here:
http://msdn.microsoft.com/en-us/library/aa560998.aspx
-Jens
Jens K. Suessmeyer
http://blogs.msdn.com/Jenss

Similar Messages

  • Does the local server send the query to the remote server?

    I have 2 servers; the local server connects to the remote server through a database link. If I execute a query at the local server that only accesses tables on the remote server, does the local server send this query to the remote server, or are all the tables on the remote server sent to the local server? If the local server sends the query to the remote server, can it execute another query while it waits for the result from the remote server?
    Any help would be appreciated
    Regards

    Yes, it is executed on the remote server, and the remote server can accept other queries as normal. See my example below:
    SQL> select * from v$database@prod_p;
    Execution Plan
    Plan hash value: 3039639316
    | Id  | Operation               | Name     | Rows | Bytes | Cost (%CPU) | Time     | Inst |
    |   0 | SELECT STATEMENT REMOTE |          |  100 | 64400 |       0 (0) | 00:00:01 |      |
    |   1 |  MERGE JOIN CARTESIAN   |          |  100 | 64400 |       0 (0) | 00:00:01 |      |
    |*  2 |   FIXED TABLE FULL      | X$KCCDI  |    1 |   582 |       0 (0) | 00:00:01 | RPRD |
    |   3 |   BUFFER SORT           |          |  100 |  6200 |       0 (0) | 00:00:01 |      |
    |   4 |    FIXED TABLE FULL     | X$KCCDI2 |  100 |  6200 |       0 (0) | 00:00:01 | RPRD |
    Predicate Information (identified by operation id):
    2 - filter("DI"."INST_ID"=USERENV('INSTANCE'))
    Note
    - fully remote statement
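    If you want to verify this for one of your own queries, here is a minimal sketch (the table name some_table and the link name remote_link are placeholders):
    EXPLAIN PLAN FOR
      SELECT * FROM some_table@remote_link WHERE id = 42;
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    -- "SELECT STATEMENT REMOTE" in the plan and the note "fully remote statement" confirm
    -- that the whole query is shipped to the remote database and executed there.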

  • Bdc performance in "No screen" mode, on slow network, on remote server.

    Hi,
    We have created a bdc program in ABAP.
    Say our servers are installed in the US, and our users execute the BDC by uploading a data file from the UK.
    Consider that the network connectivity between our US office and our UK office is not very good.
    I understand that the upload of the data file will take time, but after the upload, how is the BDC execution performance affected by the network performance?
    If we execute the BDC in "All screens" mode, I am convinced that it will definitely take a long time to execute, because the UK user needs to press "Enter" many times and each of those round trips goes to the server.
    If we execute the BDC in "No screen" mode, does that mean it is a completely background process?
    I mean, in this case is the screen information still passed back and forth between the server (US) and the SAP GUI (UK),
    or,
    once the data upload is over, is the entire activity done on the server in "No screen" mode, with control coming back to the SAP GUI only at the end?
    We have put some logs, and we found that most of the time is taken by the CALL TRANSACTION statement
    CALL TRANSACTION 'PP01' USING it_bdcdata
    Are there any other points to improve the BDC execution on a remote server?
    thanks in advance,
    Madhu_1980

    Hi Sandra ,
    Thanks for your suggestion and the link.
    I had checked the troubleshooting guide for BDC, but the scenario in which I experience the problem is a little different.
    The interface to create vendors was working perfectly in ALL-screen as well as NO-screen mode.
    But a small requirement change seems to have altered its execution.
    The interface had a commented-out code snippet: retrieve the email address from the ADR6 table. According to the new requirement this code snippet now needs to be used, so I uncommented it and duly passed the table entries into the relevant screen (the code for that screen already exists in the recording but was not used while the retrieval was commented out).
    It collects the email address for any vendor from the ADR6 table.
    Moderator message - Please respect the 2,500 character maximum when posting. Post only the relevant portions of code
    Now, for linking the vendor created above, an FK02 BDC is maintained. In this BDC the payment transaction screen is used to link payee vendors (the vendor created above) at company code level to any principal vendors (selected from the LFA1 and LFB1 tables). So there are only two screens for the linking.
    This entire code was working perfectly in both A mode and N mode, but after including the ADR6 code snippet I no longer see the same execution in N mode.
    Could you please guide me on where I might have made a mistake?
    thanks
    Kylie
    Edited by: kylietisha on Jun 6, 2010 5:27 PM
    Edited by: kylietisha on Jun 6, 2010 5:42 PM
    Edited by: Rob Burbank on Jun 6, 2010 4:33 PM

  • Error: Load operation failed for query 'GetAuthenticationInfo'. The remote server returned an error: NotFound.

    Hello,
    I have a lightswitch web-application in development, which I need to copy from one computer to the other. I have tried doing it both through Git and by simply copying the solution and opening the project on another machine. The project builds without errors,
    but when I try to debug it, it opens a web-browser, loads to 100% and pops up an error - Load operation failed for query 'GetAuthenticationInfo'. The remote server returned an error: NotFound.
    Now, I have tried repairing Visual Studio on my machine, reinstalling the .NET Framework, and setting <basicAuthentication enabled="false" /> in web.config, yet it still does not run.
    When using Fiddler, it shows an error while loading the application - "HTTP/1.1 500 Internal Server Error" - and I honestly don't know what it means.
    The application uses ComponentOne and Telerik modules, but they are both installed on both machines. 
    The application does run perfectly on the original machine, but it is not working on any other one.
    Both machines are using Win 8.1 and Visual Studio 2013 Update 4.
    I have tried to look this up online, but most people's problems occur when they are deploying the app, not just debugging it. I would be really happy for any help with this issue.
    Thanks!

    I have the same problem on one of my development machines. Whenever I create a new project, the System.IdentityModel.Tokens.Jwt NuGet package is not referenced properly. The project compiles correctly, but it cannot be debugged and I get the same error as you.
    If you open up your references and there is an error next to any of them, make sure that you correct it. In the case of the Jwt reference error, I have to remove the Jwt reference and then add it back from the packages folder.
    This may not be your problem, but it could point you in the right direction.

  • Best way to submit a query recordset to a remote server

    Hello, everyone.
    I've been tasked with a project that will involve querying a table on one server, then sending the data to a remote server that will take the data and insert it into another database (the two databases cannot communicate directly with each other).
    I've never had to do something like this before. What is the best way to achieve the desired results? Is there a way to "submit" a query (anywhere from 1 to 5 records at any given time) to the remote server without incorporating WDDX?
    CF9 environment, soon to be CF10.
    V/r,
    ^_^

    Hi, @BKBK,
    Not permitted.
    It's kind of tedious, but what I've worked out so far is to have CF1 select the data from DB1, SerializeJSON the data, and use cfhttp to POST the data to CF2.
    CF2 then DeserializeJSONs the data back into a query object and inserts it into DB2.
    Unless you think there's a better way to do it?
    V/r,
    ^_^

  • Power Query online search: Unable to connect to the remote server

    This is the error I get after I click Add to worksheet. I can see the preview of the data. I am in an enterprise behind a proxy, if that helps.
    [DataSource.Error] OData: Request failed (ConnectFailure): Unable to connect to the remote server
    Stephen

    OData is a web service and so long as you can access it from your current network context, it'll work. Home, Internet Café, Hotel, Airplane -- doesn't matter. If the data source is private and is on a restricted network (like some sort of private data analysis
    source internal to your company), then you will need to VPN into that network before you attempt to refresh your data.
    I have noted some basic steps in my blog that might help you set up the basics and verify with a known publicly available OData source.
    http://bariseris.wordpress.com/2014/10/27/using-excel-power-query-to-track-your-citys-progress/

  • Robohelp 7 App is really slow in deploying files to the remote server - Why?

    Hi,
    I am totally new to RoboHelp. So Please forgive my ignorance!
    The RoboHelp project we have was developed by somebody who is no longer with our company. I think he used RoboHelp 5 to develop our Policies and Procedures. The files are deployed on one of our remote servers. (Does the server have to have RoboHelp installed as well? I checked the server and could not find RoboHelp installed on it.)
    One of our analysts has RoboHelp on her PC. She used to have RoboHelp 5 and now she has RoboHelp 7 to deploy the files to the server.
    It was working perfectly fine when she was using RoboHelp 5. Deploying the files to the server took about 2 hours before, and now with RoboHelp 7 on her PC it takes forever and we are unable to deploy the files to the server from the RoboHelp application!
    What should I be doing to make it faster? Please let me know. Thanks in advance!!
    Uma

    Hello again
    Well, ummm, how shall I put this? YIKES! The page you linked to is for RoboHelp 9. Now you may be scratching your head at this point and saying, waitadoggoneminnute there, Mr. Man! What gives?
    Here's the deal. Way, way back when a little company called eHelp Corporation held the reins, RoboHelp was at version 7. Then version 8 came along but wasn't really called version 8. It was called version 2000 because it was released in the year 2000. Then they published version 9. Then came version 10, and that one was called version 2002. Then version 11 shipped but was called X3. Then version 12 shipped and was called X4, and finally "lucky" version 13 arrived and was called X5. Enter another small company known as Adobe. Adobe saw the 5 and decided to name their next version "6".
    This wouldn't really be much of a problem, EXCEPT that we now see material that is grossly outdated but seems to apply to today's versions, and of course it is wrong!
    So here's where it gets fun. Note that the date of the article you pointed to is October, 2000. Nearly ten years old! I'm guessing that the dialogs you are seeing are a wee bit different than what are on that page.
    So just to be complete:
    When you double-click the WebHelp layout you see the first of several dialogs. It usually looks like this:
    Note the highlighted part. This is the location where WebHelp will generate to as well as the name of the Start Page. Oftentimes folks configure this location to be the location where everyone points to open the WebHelp from the connected PCs. And if this is the case, that is likely the reason the process is so slow.
    The location specified here should always point only to the C drive.
    If you click the Next > button three times you should see the screen below.
    THIS is where you would want to configure the publishing destination.
    So you should have two actions. One action creates the files and the second action Publishes (or copies) the files from the C drive to the server.
    Note that none of the options have been enabled in the dialog above. That's typically the preferred way to work. You only need to enable these options under specific conditions. So if you have this configured but have enabled, say, the Republish All option, that could also explain the slow operation.
    Hopefully this helps... Rick
    Helpful and Handy Links
    RoboHelp Wish Form/Bug Reporting Form
    Begin learning RoboHelp HTML 7 or 8 moments from now - $24.95!
    Adobe Certified RoboHelp HTML Training
    SorcererStone Blog
    RoboHelp eBooks

  • Connecting to remote server slows down computer

    We've been connecting to a remote server for years, and only recently have this problem:
    When connected to a remote server, the entire system runs sluggishly. This only happens with one remote server -- other connections don't cause this problem. The system is slow even if we are not reading or writing -- just the connection seems to cause the slowdown.
    What could be causing this?

    You need to enable WinRM on your system. Open powershell on the system you want to connect to remotely and enable it by typing:
    PS C:\> Enable-PSRemoting -Force
    WinRM already is set up to receive requests on this machine.
    WinRM has been updated for remote management.
    Created a WinRM listener on HTTP://* to accept WS-Man requests to any IP on this machine.
    This will configure WinRM for you. Check out this technet powershell training session.
    http://technet.microsoft.com/en-au/events/ee676904.aspx
    ref: http://social.technet.microsoft.com/Forums/en/ITCG/thread/54417b48-68ae-40ac-a303-74aed4e7e81a
    Satyam MCITP, MCPD

  • A remote server runs code terribly slowly

    Hello,
    I'm in an odd development situation. I write code on my local dev server, which gets handed off to someone else's dev server that tests it before putting it up live. I have NO direct access to the remote servers and upload my files by emailing them over individually with instructions on where to place them. (I warned you it was odd.)
    I've written a small search page which does the following: it does a Verity search; if a result is not cached, the file is parsed and the information is cached. It sorts the results, then displays them. It caches by serializing a struct of structs using WDDX.
    On my local server this works almost instantaneously, literally half a second for the largest "all" search, but on the remote server it TIMES OUT! Searches for fewer items (that run near instantly locally) take upwards of 2 full minutes.
    The error that ColdFusion throws is that the cffile write is timing out. (It rewrites the cache if it finds new files in the results.)
    The code is identical on both my local and the remote server.
    I'm a little clueless as to why this is happening, and my background has never been in ColdFusion before. If anyone could shed any light on why this is happening, or where to look to figure out why it's happening, it would be greatly appreciated!
    Thank you so much for any help you can give.

    Kronin, you bring up a good point, and that has been the main focus I've had so far. In my local environment I am searching through roughly 1100 files. The cache file I'm making comes out to about 800 KB. In the remote environment I'm only searching through roughly 2000 files, and I am assuming that the cache is growing at the same rate.
    These aren't terribly big numbers I'm working with; even the oldest hardware should be able to pull this off without a hitch. Even if cffile loads it into memory first, there would surely be 1 MB of memory available, and if not, a swap shouldn't take long enough to cause a timeout.
    Keep the ideas coming though :)

  • Building Flex application with Flex Builder in a Remote Server - Cloud Computing

    Hey Guys
    I have a query, or probably it's just confusion.
    I was wondering whether this kind of solution exists or is possible with, say, cloud computing or anything else.
    This is what we all probably do to build our Flex applications:
    1. There is a remote server hosting our source code
    2. Developers check out the code to their local systems with a configuration management tool like Perforce, CVS, etc.
    3. Developers install the Flex Builder IDE locally, open the IDE, create/build locally, and after testing check the code back into the repository
    All this is fine, but we have some problems here:
    a. We have to get all the source code checked out/downloaded onto each local machine
    b. We have to install Flex Builder on each and every developer machine, with a license.
    A possible modification to the above could be as follows:
    1. Don't download/check out the source code locally in each machine
    2. Create a mapped network drive of the Remote Repository and then work directly with the mapped network drive
    3. Install Flex Builder  locally
    4. Start Flex Builder and create a Flex project with the source code on the mapped network drive
    But this has a potential problem: the build of the Flex application now works directly against the mapped network drive. This is very slow, especially when we keep the "Build Automatically" checkbox checked.
    My question is: can we have a solution like this?
    1. There is Remote Server hosting our source code
    2. There is another 2nd Remote Server which support workspace for each user
    3. Flex Builder installed in the 2nd Remote Server
    4. Each developer connects to the 2nd Remote Server and logs in to their workspace
    5. Each developer checks out code by connecting to the 1st Remote Server. The code that is checked out goes into their workspace on the 2nd Remote Server
    6. Each developer starts a Flex Builder instance running on the 2nd Remote Server
    7. Each developer opens the source code, modifies it, builds the application in their workspace, and checks it in
    I think this is something that I heard cloud computing can do? Does anyone have any idea whether this kind of environment is possible with Flex, or can anyone suggest something close to this?
    Regards
    Biswamit

    Hi
    The concept of cloud computing is not even very clear to me
    I think this is what I understood about it, from this link: http://communication.howstuffworks.com/cloud-computing.htm
    It says
    "Instead of installing a suite of software for each computer, you'd only have to load one application. That application would allow workers to log into a Web-based service which hosts all the programs the user would need for his or her job. Remote machines owned by another company would run everything from e-mail to word processing to complex data analysis programs. It's called cloud computing, and it could change the entire computer industry"
    "There's a good chance you've already used some form of cloud computing. If you have an e-mail account with a Web-based e-mail service like Hotmail, Yahoo! Mail or Gmail, then you've had some experience with cloud computing. Instead of running an e-mail program on your computer, you log in to a Web e-mail account remotely"
    My issue is not with the license. The solution that I am looking for is:
    1. I don't want developers to download the source code to their local machines in any fashion, whether checked out or copied
    2. I want developers to work directly on the server
    3. I'm looking for a solution where developers are not required to install Flex Builder locally; instead, each developer uses the Flex Builder installed on the server, works in their own workspace, and creates/modifies/builds on the server itself
    I hope I have made myself clear this time.
    Regards
    Biswamit

  • SharePoint 2010 Slow query duration when setting metadata on folder

    I'm getting "Slow Query Duration" when I programmatically set a default value for a default field to apply to documents at a specified location on a SP 2010 library.
    It has nothing to do with performance most probably as I'm getting this working with a folder within a library with only a 1 document on a UAT environment. Front-end: AMD Opteron 6174 2.20GHz x 2 + 8gb RAM, Back-end: AMD Opteron 6174 2.20GHz x 2 + 16gb
    RAM.
    The specific line of code causing this is:
    folderMetadata.SetFieldDefault(createdFolder, fieldData.Field.InternalName, thisFieldTextValue);
    What SP says:
    02/17/2014 16:29:03.24 w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa42 Monitorable A large block of literal text was sent to sql. This can result in blocking in sql and excessive memory use on the front end. Verify that no binary parameters are
    being passed as literals, and consider breaking up batches into smaller components. If this request is for a SharePoint list or list item, you may be able to resolve this by reducing the number of fields.
    02/17/2014 16:29:03.24 w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa43 High Slow Query Duration: 254.705556153086
    02/17/2014 16:29:03.26 w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High Slow Query StackTrace-Managed: at Microsoft.SharePoint.Utilities.SqlSession.OnPostExecuteCommand(SqlCommand command, SqlQueryData monitoringData) at Microsoft.SharePoint.Utilities.SqlSession.ExecuteReader(SqlCommand
    command, CommandBehavior behavior, SqlQueryData monitoringData, Boolean retryForDeadLock) at Microsoft.SharePoint.SPSqlClient.ExecuteQueryInternal(Boolean retryfordeadlock) at Microsoft.SharePoint.SPSqlClient.ExecuteQuery(Boolean retryfordeadlock) at Microsoft.SharePoint.Library.SPRequestInternalClass.PutFile(String
    bstrUrl, String bstrWebRelativeUrl, Object punkFile, Int32 cbFile, Object punkFFM, PutFileOpt PutFileOpt, String bstrCreatedBy, String bstrModifiedBy, Int32 iCreatedByID, Int32 iModifiedByID, Object varTimeCreated, Object varTimeLastModified, Obje...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ...ct varProperties, String bstrCheckinComment, Byte partitionToCheck, Int64 fragmentIdToCheck, String bstrCsvPartitionsToDelete, String bstrLockIdMatch, String bstEtagToMatch,
    Int32 lockType, String lockId, Int32 minutes, Int32 fRefreshLock, Int32 bValidateReqFields, Guid gNewDocId, UInt32& pdwVirusCheckStatus, String& pVirusCheckMessage, String& pEtagReturn, Byte& piLevel, Int32& pbIgnoredReqProps) at Microsoft.SharePoint.Library.SPRequest.PutFile(String
    bstrUrl, String bstrWebRelativeUrl, Object punkFile, Int32 cbFile, Object punkFFM, PutFileOpt PutFileOpt, String bstrCreatedBy, String bstrModifiedBy, Int32 iCreatedByID, Int32 iModifiedByID, Object varTimeCreated, Object varTimeLastModified, Object varProperties,
    String bstrCheckinComment, Byte partitionToCheck, Int64 fragmentIdToCheck...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ..., String bstrCsvPartitionsToDelete, String bstrLockIdMatch, String bstEtagToMatch, Int32 lockType, String lockId, Int32 minutes, Int32 fRefreshLock, Int32 bValidateReqFields,
    Guid gNewDocId, UInt32& pdwVirusCheckStatus, String& pVirusCheckMessage, String& pEtagReturn, Byte& piLevel, Int32& pbIgnoredReqProps) at Microsoft.SharePoint.SPFile.SaveBinaryStreamInternal(Stream file, String checkInComment, Boolean checkRequiredFields,
    Boolean autoCheckoutOnInvalidData, Boolean bIsMigrate, Boolean bIsPublish, Boolean bForceCreateVersion, String lockIdMatch, SPUser modifiedBy, DateTime timeLastModified, Object varProperties, SPFileFragmentPartition partitionToCheck, SPFileFragmentId fragmentIdToCheck,
    SPFileFragmentPartition[] partitionsToDelete, Stream formatMetadata, String etagToMatch, Boolea...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ...n bSyncUpdate, SPLockType lockType, String lockId, TimeSpan lockTimeout, Boolean refreshLock, Boolean requireWebFilePermissions, Boolean failIfRequiredCheckout, Boolean
    validateReqFields, Guid newDocId, SPVirusCheckStatus& virusCheckStatus, String& virusCheckMessage, String& etagReturn, Boolean& ignoredRequiredProps) at Microsoft.SharePoint.SPFile.SaveBinary(Stream file, Boolean checkRequiredFields, Boolean
    createVersion, String etagMatch, String lockIdMatch, Stream fileFormatMetaInfo, Boolean requireWebFilePermissions, String& etagNew) at Microsoft.SharePoint.SPFile.SaveBinary(Byte[] file) at Microsoft.Office.DocumentManagement.MetadataDefaults.Update()
    at TWINSWCFAPI.LibraryManager.CreatePathFromFolderCollection(String fullPathUrl, SPListItem item, SPWeb web, Dictionary2...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ... folderToCreate, Boolean setDefaultValues, Boolean mainFolder) at TWINSWCFAPI.LibraryManager.CreatePathFromFolderCollection(String fullPathUrl, List1 resultDataList,
    SPListItem item, SPWeb web, Boolean setDefaultValues, Boolean mainFolder) at TWINSWCFAPI.LibraryManager.CreateExtraFolders(List1
    pathResultDataList, List1 resultDataList, String fullPathUrl, SPWeb web, SPListItem item, Boolean setDefaultValues) at TWINSWCFAPI.LibraryManager.CreateFolders(SPWeb web, List1
    pathResultDataList, SPListItem item, String path, Boolean setDefaultValues) at TWINSWCFAPI.LibraryManager.MoveFileAfterMetaChange(SPListItem item) at TWINSWCFAPI.DocMetadataChangeEventReceiver.DocMetadataChangeEventReceiver.FileDocument(SPWeb web, SPListItem
    listItem) at TWINSWCFAPI.DocMetadataChang...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ...eEventReceiver.DocMetadataChangeEventReceiver.ItemCheckedIn(SPItemEventProperties properties) at Microsoft.SharePoint.SPEventManager.RunItemEventReceiver(SPItemEventReceiver
    receiver, SPUserCodeInfo userCodeInfo, SPItemEventProperties properties, SPEventContext context, String receiverData) at Microsoft.SharePoint.SPEventManager.RunItemEventReceiverHelper(Object receiver, SPUserCodeInfo userCodeInfo, Object properties, SPEventContext
    context, String receiverData) at Microsoft.SharePoint.SPEventManager.<>c__DisplayClassc1.b__6() at Microsoft.SharePoint.SPSecurity.RunAsUser(SPUserToken userToken, Boolean bResetContext, WaitCallback code, Object param) at Microsoft.SharePoint.SPEventManager.InvokeEventReceivers[ReceiverType](SPUserToken
    userToken, Gu...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ...id tranLockerId, RunEventReceiver runEventReceiver, Object receivers, Object properties, Boolean checkCancel) at Microsoft.SharePoint.SPEventManager.InvokeEventReceivers[ReceiverType](Byte[]
    userTokenBytes, Guid tranLockerId, RunEventReceiver runEventReceiver, Object receivers, Object properties, Boolean checkCancel) at Microsoft.SharePoint.SPEventManager.HandleEventCallback[ReceiverType,PropertiesType](Object callbackData) at Microsoft.SharePoint.Utilities.SPThreadPool.WaitCallbackWrapper(Object
    state) at System.Threading.ExecutionContext.runTryCode(Object userData) at System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData) at System.Threading.ExecutionContext.Run(ExecutionContext
    execu...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database fa44 High ...tionContext, ContextCallback callback, Object state) at System.Threading._ThreadPoolWaitCallback.PerformWaitCallbackInternal(_ThreadPoolWaitCallback tpWaitCallBack)
    at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback(Object state)
    02/17/2014 16:29:03.26 w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database tzku High ConnectionString: 'Data Source=PFC-SQLUAT-202;Initial Catalog=TWINSDMS_LondonDivision_Content;Integrated Security=True;Enlist=False;Asynchronous Processing=False;Connect
    Timeout=15' ConnectionState: Open ConnectionTimeout: 15
    02/17/2014 16:29:03.26 w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database tzkv High SqlCommand: 'DECLARE @@iRet int;BEGIN TRAN EXEC @@iRet = proc_WriteChunkToAllDocStreams @wssp0, @wssp1, @wssp2, @wssp3, @wssp4, @wssp5, @wssp6;IF @@iRet <> 0 GOTO
    done; DECLARE @@S uniqueidentifier; DECLARE @@W uniqueidentifier; DECLARE @@DocId uniqueidentifier; DECLARE @@DoclibRowId int; DECLARE @@Level tinyint; DECLARE @@DocUIVersion int;DECLARE @@IsCurrentVersion bit; DECLARE @DN nvarchar(256); DECLARE @LN nvarchar(128);
    DECLARE @FU nvarchar(260); SET @DN=@wssp7;SET @@iRet=0; ;SET @LN=@wssp8;SET @FU=@wssp9;SET @@S=@wssp10;SET @@W=@wssp11;SET @@DocUIVersion = 512;IF @@iRet <> 0 GOTO done; ;SET @@Level =@wssp12; EXEC @@iRet = proc_UpdateDocument @@S, @@W, @DN, @LN, @wssp13,
    @wssp14, @wssp15, @wssp16, @wssp17, @wssp18, @wssp19, @wssp20, @wssp21, @wssp22, @wssp23, @wssp24, @wssp25, @wssp26,...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database tzkv High ... @wssp27, @wssp28, @wssp29, @wssp30, @wssp31, @wssp32, @wssp33, @wssp34, @wssp35, @wssp36, @wssp37, @wssp38, @wssp39, @wssp40, @wssp41, @wssp42, @wssp43, @wssp44, @wssp45,
    @wssp46, @wssp47, @wssp48, @wssp49, @wssp50, @wssp51, @@DocId OUTPUT, @@Level OUTPUT , @@DoclibRowId OUTPUT,@wssp52 OUTPUT,@wssp53 OUTPUT,@wssp54 OUTPUT,@wssp55 OUTPUT ; IF @@iRet <> 0 GOTO done; EXEC @@iRet = proc_TransferStream @@S, @@DocId, @@Level,
    @wssp56, @wssp57, @wssp58; IF @@iRet <> 0 GOTO done; EXEC proc_AL @@S,@DN,@LN,@@Level,0,N'London/Broking/Documents/E/E _ E Foods Corporation',N'2012',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,1,N'London/Broking/Documents/E',N'E _ E Foods Corporation',72,85,83,1,N'';EXEC
    proc_AL @@S,@DN,@LN,@@Level,2,N'London/Broking/Documents/E/E _ E Foods Corporation',N'2013',72,85,...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database tzkv High ...83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,3,N'London/Broking/Documents/E/E _ E Foods Corporation/2013',N'QA11G029601',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,4,N'London/Broking/Documents/K',N'Konig
    _ Reeker',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,5,N'London/Broking/Documents/K/Konig _ Reeker',N'2012',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,6,N'London/Broking/Documents/K/Konig _ Reeker/2012',N'QA12E013201',72,85,83,1,N'';EXEC proc_AL
    @@S,@DN,@LN,@@Level,7,N'London/Broking/Documents/K/Konig _ Reeker/2012',N'A12EL00790',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,8,N'London/Broking/Documents/K/Konig _ Reeker/2012',N'A12DA00720',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,9,N'London/Broking/Documents/K/Konig
    _ Reeker/2012',N'A12DC00800',72,85,83,1,N'';EXEC proc...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database tzkv High ..._AL @@S,@DN,@LN,@@Level,10,N'London/Broking/Documents/A',N'Ace European Group Limited',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,11,N'London/Broking/Documents/A/Ace
    European Group Limited',N'2012',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,12,N'London/Broking/Documents/A/Ace European Group Limited/2012',N'JXB88435',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,13,N'London/Broking/Documents/A/Ace European Group
    Limited/2012/JXB88435/Closings',N'PRM 1',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,14,N'London/Broking/Documents/A/Ace European Group Limited/2012/JXB88435/Closings',N'PRM 2',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,15,N'London/Broking/Documents/C',N'C
    Moore-Gordon',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,16,N'London/Broking/Documents/C/C Moore-Gordo...
    02/17/2014 16:29:03.26* w3wp.exe (0x10D0) 0x0DB0 SharePoint Foundation Database tzkv High ...n',N'2012',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,17,N'London/Broking/Documents/C/C Moore-Gordon/2012',N'QY13P700201',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,18,N'London/Broking/Documents/C/C
    Moore-Gordon',N'2013',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,19,N'London/Broking/Documents/C/C Moore-Gordon/2013',N'Y13PF07010',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,20,N'London/Broking/Documents/A/Ace European Group Limited/2012/JXB88435/Closings',N'ARP
    7',72,85,83,1,N'';EXEC proc_AL @@S,@DN,@LN,@@Level,21,N'London/Broking/Documents/A/Ace European Group Limited/2012/JXB88435/Closings',N'ARP 8',72,85,83,1,N'';EXEC proc_AL . . .
    Thanks in advance A

    Are SharePoint and SQL Server installed on the same server, or what does the setup look like?
    I would start by enabling the Developer Dashboard and analyzing its report.
    You will see if any web part, page, or SQL Server query is taking too much time.
    http://www.sharepoint-journey.com/developer-dashboard-in-sharepoint-2013.html
    Please remember to mark your question as answered and vote helpful if this solves/helps your problem. Thanks - WS, MCITP (SharePoint 2010, 2013). Blog: http://wscheema.com/blog

  • The remote server returned an error: (401) Unauthorized

    We are attempting to upload a file to SharePoint on Office 365 and get the following error:
    The remote server returned an error: (401) Unauthorized.
    We also tried using NetworkCredentials and that got the following error: The remote server returned an error: (404) Not Found.
    The code is shown below.
    Any idea on how to fix this error?
    public void uploadDocument(string siteurl, string username, SecureString securepassword, string domain,
        string filestreamPath = "NewDocument.docx",
        string serverRelativeUrl = "/Shared Documents/NewDocument.docx")
    {
        ClientContext context = new ClientContext(siteurl);
        //context.Credentials = new NetworkCredential(username, securepassword, domain);
        context.Credentials = new SharePointOnlineCredentials(username, securepassword);
        using (FileStream fileStream = new FileStream(filestreamPath, FileMode.Open))
        {
            // Upload the local file to the server-relative URL in the document library.
            ClientOM.File.SaveBinaryDirect(context, serverRelativeUrl, fileStream, true);
        }
    }

    Yes, it has access.  In fact, it has no problem getting the site properties using the code below.  It is just when I am uploading a file that the error occurs.
    public Web getProperties(string siteurl, string username, SecureString securepassword)
    {
        Web web = null;
        // Starting with ClientContext, the constructor requires a URL to the
        // server running SharePoint.
        ClientContext context = new ClientContext(siteurl);
        // The SharePoint web at the URL.
        web = context.Web;
        // We want to retrieve the web's properties.
        context.Load(web);
        // Execute the query to the server.
        context.Credentials = new SharePointOnlineCredentials(username, securepassword);
        context.ExecuteQuery();
        return web;
    }

  • SharePoint List Error :[DataSource.Error] SharePoint: Request failed: The remote server returned an error: (500) Internal Server Error. (An error occurred while processing this request.)

    When I connect to the SharePoint site that contains the lists I need to build my query from, Power Query enumerates the lists and displays them in the tool. I can click on the system tables and view the records, but any table in a list I created returns this error.
    I can connect fine with InfoPath
    [DataSource.Error] SharePoint: Request failed: The remote server returned an error: (500) Internal Server Error. (An error occurred while processing this request.)
    thank you for your help
    Andrew
     

    Hi Andrew. In order for us to diagnose this issue, you'll need to capture some network traces using a tool such as Fiddler and share them by sending a Frown.
    To capture a trace using Fiddler, start Fiddler, enable the Tools > Fiddler Options > HTTPS > Decrypt HTTPS traffic option, start the capture, reproduce your issue, then stop and save the capture. You can find more information here.
    Once you've done that, please send a Frown through the Power Query ribbon and attach the traces.
    Thanks,
    Ehren

  • How-to install JRE/JDK on remote server without X11

    Hi,
    I need to install a JRE on my remote server (e.g. Amazon EC2) from a location with slow / limited bandwidth.
    OK, so I cobbled together a way to get a download of the .bin by pretending I was going to download it to my laptop, then copied and pasted the final URL for the .bin into wget on my Amazon instance. (Why oh why can't Sun just give me a URL directly to the latest JDK/JRE?) e.g.
    wget java.sun.com/downloads/jre-latest.bin
    Ok - so that took 3.3 seconds at 48Mb/s - great.
    chmod u+x j*
    ./j*
    The box goes into the black hole of death (probably waiting for a response from a non-existent X server).
    Ok - so what am I doing wrong? Is this the correct analysis?
    Thanks
    Chris

    Are you sure that's a JavaFX question?
    You might want to ask it in a forum with people more knowledgeable in this domain.

  • Time Capsule - does it work as a remote server or NOT? Not seeing the benefits I was sold by the salesperson.

    Hi!  I'm hoping you can help eliminate my frustrations....
    I purchased a 13-inch MacBook Pro because my other MacBook (2009) was already low on space and I thought it best to upgrade. I needed more space but also wanted the ability to access my files remotely and not worry about the computer being stolen with all the information on it, because I travel A LOT.
    I asked the salesperson: if I purchased the 128GB MBP and the Time Capsule, would I be able to access all my information remotely? The thought process was:
    1. With over 500GB already on my previous MacBook, I would transfer all that information from the old machine to the Time Capsule, where it would be safe at home
    2. Set up the Time Capsule to access all files remotely (including iPhoto, iTunes, videos, etc.)
    Because my information was already over 500GB, the 2TB model was going to be plenty of space. So I went with purchasing the MBP with less storage (128GB) + the 2TB Time Capsule.
    Now I'm trying to set everything up, was on the phone for 2.5 hours, spoke to 5 reps and finally received an answer: CAN'T DO IT.  DOESN'T WORK THAT WAY.
    I kept searching online and I see that:
    1. The Time Capsule can only be used as a Time Machine backup target and can't be used as a remote server, though others say you can set up a shared folder and it works.
    2. I can't place the iPhoto library or iTunes library there to access remotely because I risk it being corrupted (the salesperson said it would run slowly but work well).
    SO, the big question...can SOMEONE please help me determine if this was worth the $350 purchase because right now it feels like i'm better off sticking with my external drive from WD that i have been working with.  I'm seeing ZERO value add for this device....
    PLEASE... if someone knows whether I can really get it to work for what I need, I would truly appreciate your advice, because right now FRUSTRATION is at its max... especially after 2.5 hours on the phone with little to no solutions.

    Some of this depends on your home internet speed.. do you have a fancy 1Gbit link.. with 100Mbps uplink.. well that is faster than the rest of the internet.. so you can well support some sort of home cloud device.
    I would also strongly recommend business level internet accounts with static Public IP address.. not dynamic.
    For most people with limited upload speed.. 1mbit is standard where I live.. and real throughput is 800Kbps. Forget it.
    I would never advise anyone to depend on remote access.. for anything with 1Mbps upload.. and even 5Mbps is going to be a struggle.. Maybe 10 or 25Mbps you can function.
    It is simply not the right way to do things. You are far better off carrying around a 500GB thunderbolt drive.. which weighs next to nothing and is as fast as the internal drive.. or use a standard cheap 2.5" USB hard disk which you can get up to 2TB now.
    Backup to TC but not by Time Machine.. or backup to the cloud.. for sure..
    You can also access some files in the cloud. That is much better than using a TC at home.. where you then have a domestic quality link to the internet.
    Apple specifically say not to use iphoto on network drives.. that means any network setup.
    iTunes you can use from a network drive.. but if you have a massive library and slow upload speed it will never finish loading.
    The internet for the most part is just not there yet.. maybe another 20 years. In Australia that will be 100 years with the current bunch of Philistines running the place. And if you do international travel and depend on networks in hotels or cafes.. well forget it.. they are going to give you enough bandwidth to load your email as long as it is pure text.
    Apple have a 2-week return period.. just take it back if you are unhappy with it.
