REST API and best bets / query rules

Is there a bug in this? I cannot get best bet results regardless of the query string. For example, the following (according to the MSDN documentation) should process the query rules and return a best bet, but it doesn't. Am I doing something wrong in the query? It only seems to return ordinary results. Performing the same search through SharePoint's UI returns the best bet at the top.
$.ajax({
    url: "http://myserver/sites/mysite/_api/search/query/?processbestbets=true&enablequeryrules=true&querytext='" + searchText + "'",
    type: 'GET',
    headers: {
        "accept": "application/json; odata=verbose"
    },
    success: myCallback,
    error: function (data) {
        alert(data.responseXML.text);
    }
});

Hi Scott,
With EnableQueryRules and ProcessBestBets set to true in the search REST API, we can get the promoted results (best bets) in SharePoint 2013.
I recommend typing the search REST API URL directly in the browser and then checking whether the best bets are returned. They are not displayed inside the table of normal results; they come back in a separate section of the response.
Please also make sure that the query text matches the query rule condition.
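If you are consuming the JSON on the client, the promoted results typically come back under SpecialTermResults rather than RelevantResults. A minimal sketch of reading them from the verbose OData response, assuming the SharePoint 2013 search schema (verify the property names against your actual payload):

// Sketch of the success callback wired up in the $.ajax call above.
function myCallback(data) {
    var query = data.d.query;
    // Promoted results (best bets) live in SpecialTermResults, not in the RelevantResults table.
    var special = query.PrimaryQueryResult.SpecialTermResults;
    if (special && special.Results && special.Results.results.length > 0) {
        special.Results.results.forEach(function (bestBet) {
            console.log("Best bet: " + bestBet.Title + " -> " + bestBet.Url);
        });
    } else {
        console.log("No best bets returned for this query.");
    }
}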
Thanks,
Victoria
TechNet Community Support

Similar Messages

  • SharePoint REST API and Asset Library Problem

    Hi dear friends,
    I'm having an issue posting videos to an asset library via REST and I haven't been able to solve it yet.
    My main task is to migrate content (in this scenario, video) from one farm to another. For that I've planned to use the REST API from server-side C#. While all other document types (images, text files) are migrated successfully, videos are giving me a hard time.
    I think there's something special going on when creating videos in an asset library, right?
    When I post the video as a byte array it is created as an .mp4 file, but the content type is displayed as "Image". I followed this guide: https://msdn.microsoft.com/en-us/library/office/dn292553.aspx and couldn't find a way.
    Can someone give me sample code or help me solve this issue? Looking forward to help from SharePointers who have worked with the REST API :)
    Thanks

    You can look at this link to help you set the metadata using REST.
    http://sharepointfieldnotes.blogspot.com/2014/04/uploading-documents-and-setting.html
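    If the only problem is that the uploaded video ends up with the wrong content type, one option is to upload the bytes first and then update the item's content type with a second REST call. A rough jQuery sketch of that follow-up call (the same MERGE request can be sent from server-side C# with HttpClient); the site URL, file path, list item type name and content type ID below are placeholders you would need to replace with your own values:

    var siteUrl = "http://myserver/sites/mysite";   // placeholder site URL

    // Fetch the uploaded file's list item and switch its content type to the library's Video content type.
    $.ajax({
        url: siteUrl + "/_api/web/GetFileByServerRelativeUrl('/sites/mysite/AssetLibrary/myvideo.mp4')/ListItemAllFields",
        type: "POST",
        headers: {
            "accept": "application/json; odata=verbose",
            "content-type": "application/json; odata=verbose",
            "X-RequestDigest": $("#__REQUESTDIGEST").val(),
            "X-HTTP-Method": "MERGE",
            "IF-MATCH": "*"
        },
        data: JSON.stringify({
            "__metadata": { "type": "SP.Data.AssetLibraryItem" },         // list item type name is library-specific
            "ContentTypeId": "<Video content type ID from your library>"  // placeholder value
        })
    });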
    Blog | SharePoint Field Notes Dev Tools |
    SPFastDeploy | SPRemoteAPIExplorer

  • Office 365 Files REST API and chunked upload

    Hello,
    I am currently working on an integration with OneDrive for Business and I need to upload large files in chunks. I need to do something like this but using the O365 Files REST API. As far as I understand, this is currently not supported.
    Are there any plans to support uploading in chunks? If yes, can you please provide a (rough) estimate?
    Thanks
    Andrey

    Thank you for the quick answer!
    We are trying to upload the files using JavaScript and we are currently working with this API:
    https://msdn.microsoft.com/en-us/office/office365/api/files-rest-operations
    We have also noticed that the Office 365 roadmap at http://roadmap.office.com/en-us includes "SharePoint API Partial File Upload Support".
    Does the update mentioned in the roadmap refer to the API we are using? Or, if you are saying we can already do chunked upload with it, can you please point us to a working sample?
    Thanks
    Andrey
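    For what it's worth, SharePoint's file REST endpoints later gained a chunked-upload pattern (StartUpload / ContinueUpload / FinishUpload). That is not the Files API discussed above, and availability depends on your tenant version, so treat the following jQuery sketch as an assumption to verify rather than a confirmed answer:

    // Assumed endpoints: StartUpload / ContinueUpload / FinishUpload on GetFileByServerRelativeUrl.
    // Assumes the (empty) target file has already been created in the library.
    var uploadId = "b7b57d5d-1a2b-4c3d-9e4f-123456789abc";   // placeholder GUID you generate
    var fileUrl = "/sites/mysite/Documents/bigfile.zip";     // placeholder server-relative path

    function startUpload(firstChunk) {
        return $.ajax({
            url: siteUrl + "/_api/web/GetFileByServerRelativeUrl('" + fileUrl + "')" +
                 "/StartUpload(uploadId=guid'" + uploadId + "')",
            type: "POST",
            processData: false,      // send the chunk as a raw binary body
            data: firstChunk,
            headers: { "X-RequestDigest": $("#__REQUESTDIGEST").val() }
        });
    }
    // Subsequent chunks would go to ContinueUpload(uploadId=..., fileOffset=...),
    // and the last chunk to FinishUpload(uploadId=..., fileOffset=...).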

  • Mapping and querying Custom Objects for a Contact with the REST API

    Hello All,
    We are hoping to get some details on managing DataCard Sets through the REST APIs. Our implementation goal is to create Contacts and add a Custom Object for each Contact, or to be precise, a DataCard Set for each Contact.
    At the moment, to associate a DataCard Set (or Custom Object) with an existing contact, we supply the following custom object fields when creating the Custom Object:
    new CustomObjectField
    {
        name = "MappedEntityType",
        dataType = Enum.GetName(typeof(DataType), DataType.numeric),
        type = "CustomObjectField",
        defaultValue = "0"
    },
    new CustomObjectField
    {
        name = "MappedEntityID",
        dataType = Enum.GetName(typeof(DataType), DataType.numeric),
        type = "CustomObjectField",
        defaultValue = "<ContactId>"
    }
    Is this the correct approach? This is Based on the information provided here: http://topliners.eloqua.com/community/code_it/blog/2012/05/31/eloqua-api-how-to-mapping-a-data-card-to-an-entity.
    Would the REST API allow us to query the Custom Objects using the MappedEntityID value for later updates? If so, any pointers on how to approach that?
    Thanks in advance.

    Either the MappedEntityID field is not available or I am doing something wrong: Eloqua is ignoring the field and does not map the custom record to the unique Contact ID.
    {"type":"CustomObjectData","ContactID":"8829509","fieldValues":[{"id":"195","value":"[email protected]"},{"id":"220","value":"a0KJ000000387QvMAI"},{"id":"191","value":"001J000001OrL77IAF"},{"id":"193","value":"NowTV MPP"},{"id":"194","value":"8829509"},{"id":"196","value":"Andreas"},{"id":"197","value":"Wolf"},{"id":"198","value":"003J00000145lkBIAQ"},{"id":"210","value":"777666555"},{"id":"199","value":"gbp"},{"id":"200","value":"0"},{"id":"215","value":"0"},{"id":"201","value":"999111999"},{"id":"214","value":"111111"},{"id":"202","value":"222222"},{"id":"204","value":"now"},{"id":"203","value":"xmas"},{"id":"205","value":"no description"},{"id":"206","value":"test"},{"id":"218","value":"holidays"},{"id":"219","value":"PPV-0878545"},{"id":"213","value":"N"},{"id":"212","value":"myself"},{"id":"209","value":"now tv"},{"id":"192","value":"1417542120"},{"id":"207","value":"1417542120"},{"id":"216","value":"1417542240"},{"id":"217","value":"1417542240"},{"id":"211","value":"1417542240"}]},"MappedEntityID":"003J00000145lkBIAQ"}
    Response
    DEBUG|Response------{"type":"CustomObjectData","id":"81720","fieldValues":[{"id":"195","value":"[email protected]"},{"id":"220","value":"a0KJ000000387QvMAI"},{"id":"191","value":"001J000001OrL77IAF"},{"id":"193","value":"NowTV MPP"},{"id":"194","value":"8829509"},{"id":"196","value":"Andreas"},{"id":"197","value":"Wolf"},{"id":"198","value":"003J00000145lkBIAQ"},{"id":"210","value":"777666555"},{"id":"199","value":"gbp"},{"id":"200","value":"0"},{"id":"215","value":"0"},{"id":"201","value":"999111999"},{"id":"214","value":"111111"},{"id":"202","value":"222222"},{"id":"204","value":"now"},{"id":"203","value":"xmas"},{"id":"205","value":"no description"},{"id":"206","value":"test"},{"id":"218","value":"holidays"},{"id":"219","value":"PPV-0878545"},{"id":"213","value":"N"},{"id":"212","value":"myself"},{"id":"209","value":"now tv"},{"id":"192","value":"1417542120"},{"id":"207","value":"1417542120"},{"id":"216","value":"1417542240"},{"id":"217","value":"1417542240"},{"id":"211","value":"1417542240"}]}
    Eloqua:
    Name: PPV-0878545
    Unique Code: a0KJ000000387QvMAI
    Status Registered
    Created Date 12/22/2014 12:44:49 PM
    Mapped NO
    Any idea how to map this to a contact?
    Entity Type is Contacts
    Entity Field is SFDC Contact ID

  • What is the best way to get storage data for a hard disk using the REST API

    Hello All,
    Given that I have disk info for a virtual machine/role from the Service Management REST API (for example using GetRole), how can I retrieve the container/blob related info for it?
    I have credentials for the Service Management REST API and I have the OSVirtualHardDisk info, but I am not sure how to correctly determine which storage account to connect to and which container to use. Yes, I know there is the OSVirtualHardDisk.MediaLink property, which contains the storage account name and the container, but I am not sure it is good practice to make assumptions about its format. Alternatively, I could retrieve all storage accounts from the Service Management REST API, compare each account's URL with the disk's MediaLink, and use the matching storage account for further data retrieval. But that seems to retrieve far more information than I need.
    So generally I am trying to find the correct way to join the Service Management REST API and the Storage Services REST API for disks.

    Hi,
    From my experience, your first approach is correct. The media link points exactly to the location of the blob. With the link, you can access the blob if you have the storage account key. If you want to extract more information, such as which container and which blob it is, you can parse the link.
    >> From my point of view it is a bad way to retrieve the storage account name and container.
    In addition, you are welcome to post feature requests on
    http://feedback.windowsazure.com/forums/34192--general-feedback
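    If you do go the parsing route, the MediaLink is just a blob URI of the form https://<account>.blob.core.windows.net/<container>/<blob>, so splitting it is straightforward. A minimal sketch (it assumes the standard public-cloud blob endpoint; a custom domain or sovereign cloud would need a different check):

    // Parse a MediaLink such as "https://mystorageacct.blob.core.windows.net/vhds/myvm-os-disk.vhd"
    // into its storage account, container and blob name.
    function parseMediaLink(mediaLink) {
        var url = new URL(mediaLink);
        var account = url.hostname.split(".")[0];             // "mystorageacct"
        var pathParts = url.pathname.replace(/^\//, "").split("/");
        return {
            account: account,
            container: pathParts[0],                          // "vhds"
            blob: pathParts.slice(1).join("/")                // "myvm-os-disk.vhd"
        };
    }

    console.log(parseMediaLink("https://mystorageacct.blob.core.windows.net/vhds/myvm-os-disk.vhd"));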
    Best Regards,
    Ming Xu

  • Use REST API to query contacts with a field updated since certain time?

    Hello,
    I'm using the REST API and trying to figure out how I can query for contacts that have had a field updated since a specified time. I've tried using the "lastUpdatedAt" argument, but it seems I'm getting back contacts that haven't had any data change and were probably just sent an email or something of that sort (so the updated-at value changed).
    Is there a way to query for contacts that have had fields change (like subscriptions, address, title, etc.) but exclude contacts whose updated value only changed because they were sent an email?
    Thanks!

    Hi Chris,
    lastUpdatedAt or "Date Modified" fields don't change when the contact is sent an email, but rather, when ANY field is modified on the contact record. You can certainly query for and export contacts that had their data touched in some way since a specified time, but it won't be on a per-field basis. There is effectively no field level change history or tracking. You can work around this with extra logic. First, you can get a snapshot of what the values were in the specific fields you want to track across your entire database.
    The next time you run an export using Date Modified, it will contain more records than you might care about, but you can filter offline for the ones you do care about by comparing their before and after values for the specific 'tracked' fields.
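    The offline comparison is simple to script. A minimal sketch of that filtering step (the field names and the snapshot shape below are illustrative, not part of the Eloqua API):

    // before / after: maps of contactId -> { trackedField: value } taken from two successive exports.
    function contactsWithTrackedChanges(before, after, trackedFields) {
        var changed = [];
        Object.keys(after).forEach(function (contactId) {
            var prev = before[contactId] || {};
            var curr = after[contactId];
            // A contact counts as changed only if one of the tracked fields differs between snapshots.
            var touched = trackedFields.some(function (field) {
                return prev[field] !== curr[field];
            });
            if (touched) {
                changed.push(contactId);
            }
        });
        return changed;
    }

    // Example: only react to address or title changes, ignore everything else.
    var changedIds = contactsWithTrackedChanges(beforeSnapshot, afterSnapshot, ["address1", "title"]);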
    Regards,
    Bojan

  • How to filter REST API data based on taxonomy columns

    Hi Everyone,
    We are using SharePoint2010 Standard Edition.
    I wanted to get the library details through the REST API. I am using the following:
    https://SiteUrl/_vti_bin/listdata.svc/Documents?$filter=Title eq 'SharePointDoc'
    Here I am able to get the info for "SharePointDoc", but when I try to filter on a taxonomy column it doesn't work.
    Can anyone please tell me how we can filter based on taxonomy fields?
    Thanks in Advance
    Krishnasandeep

    Hi,
    I understand that you want to filter the REST API data based on taxonomy columns.
    To my knowledge, in SharePoint 2010 not all column types are available via REST; most annoyingly, managed metadata columns are among the unsupported types.
    However, in SharePoint 2013 we can filter list items based on taxonomy (managed metadata) columns.
    Taxonomy fields can now be queried via the REST API by passing a CAML query in the REST call.
    Here is a great blog for your reference:
    http://www.cleverworkarounds.com/2013/09/23/how-to-filter-on-a-managed-metadata-column-via-rest-in-sharepoint-2013/comment-page-1/
    You would need to adapt the REST calls and the CAML query and check whether this works in SharePoint 2010.
    More information:
    http://platinumdogs.me/2013/03/14/sharepoint-adventures-with-the-rest-api-part-1/
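    For the SharePoint 2013 case, the usual pattern is to POST a CamlQuery to the list's GetItems endpoint and do the managed metadata filtering inside the CAML. A rough jQuery sketch (the list title, field name and term label are placeholders; verify the exact CAML against the blog post above):

    $.ajax({
        url: siteUrl + "/_api/web/lists/getbytitle('Documents')/GetItems",
        type: "POST",
        headers: {
            "accept": "application/json; odata=verbose",
            "content-type": "application/json; odata=verbose",
            "X-RequestDigest": $("#__REQUESTDIGEST").val()
        },
        data: JSON.stringify({
            "query": {
                "__metadata": { "type": "SP.CamlQuery" },
                // Placeholder field name and term label; 'TaxonomyFieldType' is the value type for managed metadata.
                "ViewXml": "<View><Query><Where><Eq>" +
                           "<FieldRef Name='MyTaxonomyColumn'/>" +
                           "<Value Type='TaxonomyFieldType'>My Term</Value>" +
                           "</Eq></Where></Query></View>"
            }
        }),
        success: function (data) {
            console.log(data.d.results);
        }
    });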
    Best Regards,
    Linda Li
    Linda Li
    TechNet Community Support

  • SharePoint REST API - subtask filtering

    Hi - I am using REST to retrieve the contents of a SharePoint list (via OData).
    However, the results I am getting back include not only tasks but also subtasks. How can I filter out the subtasks, as I'm not interested in them? Thanks!

    Hi,
    According to your description, my understanding is that you want to get only the parent tasks using the REST API.
    In the task list there is a field named "ParentID" that distinguishes tasks from subtasks; if the field is null, the item is a parent task.
    http://<sitecollectionpath>/_api/Web/Lists/getByTitle('TaskListName')/Items/?$filter=ParentID eq null
    Here is a similar thread for your reference:
    Tasks and SubTasks with CSOM
    More information:
    Querying a SharePoint 2013 Task List Rest Api
    Thanks
    Best Regards
    TechNet Community Support

  • How to get permission on a SharePoint list for a user using the REST API

    Hi there,
    I have a requirement where I need to check a user's access permission against a list or library using only the REST API from my remote Salesforce app. [I already have the access token and I am able to view the list, add items, etc.]
    Say, for example, I have to send the list name and user name and get back Read, Write, Contribute (Manage), or None. I need to display what permission that user has for that list.
    How do I achieve this? Please help me.
    Thanks in advance.

    Hi,
    For the High and Low permission bits, we can create a new SP.BasePermissions object and use it like below:
    function success(data) {
        // Build a BasePermissions mask for the permission we want to test for.
        var permissions = new SP.BasePermissions();
        permissions.set(SP.PermissionKind.manageLists);
        // Compare it against the High/Low bits returned by the REST call.
        var hasPermission = permissions.hasPermissions(data.d.EffectiveBasePermissions.High, data.d.EffectiveBasePermissions.Low);
    }
    Here is a detailed article for your reference:
    http://www.lifeonplanetgroove.com/checking-user-permissions-from-the-sharepoint-2013-rest-api/
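    The data passed to that callback would come from a request like the one below. This returns the current user's effective permissions on the list; checking a different user from a remote app generally means calling GetUserEffectivePermissions instead, which you should verify against the article above. The site URL and list title are placeholders:

    $.ajax({
        url: siteUrl + "/_api/web/lists/getbytitle('MyLibrary')/EffectiveBasePermissions",
        type: "GET",
        headers: { "accept": "application/json; odata=verbose" },
        success: success,   // the callback shown above
        error: function (xhr) { console.log(xhr.responseText); }
    });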
    Thanks
    Best Regards
    Jerry Guo
    TechNet Community Support

  • APIM REST API - How to add an API with operations?

    When I issue a PUT to /apis/{aid} with import=true and contentType = application/json, I get a 400 BadRequest if the body contains operations. Note that I'm not using the WADL or Swagger format; I'm building the JSON request body instead, as per the documentation.
    The error returned in the response body is:
    {"error":{"code":"ValidationError","message":"One or more fields contain incorrect values:","details":[{"code":"ValidationError","target":"operations","message":"Invalid
    field 'operations' specified"}]}}
    If I omit the operations field from the request body, the API is created correctly (but with no operations of course).  
    To illustrate, here is the RAW request (slightly modified to remove the authorization and instance names):
    PUT https://testinstance.management.azure-api.net/apis/123456?api-version=2014-02-14-preview&import=true HTTP/1.1
    Authorization: SharedAccessSignature . . .
    Content-Type: application/json; charset=utf-8
    Host: testinstance.management.azure-api.net
    Content-Length: 414
    Expect: 100-continue
    Connection: Keep-Alive
    {"name":"Customer","description":"This API is used to manage customers","serviceUrl":"http://www.somewebsite.com/customers","path":"customers","protocols":["http","https"],"operations":[{"id":"/apis/123456/operations/a04b16da-29b7-44f5-95a3-25603dbc9b6d","name":"Customer
    (Get)","description":"Returns information about the customer with the specified ID.","method":"GET","urlTemplate":"/customers/{customerId}"}]}
    The JSON looks correct to me - can anyone spot what I'm doing wrong?  Can you provide a RAW request that works?
    Note also that I've not included the "path" query string parameter, even though the documentation suggests that it is required when import=true.  Since there is no easy way to update APIs/Operations en masse, I'm having to resort to driving
    the APIM REST API and could use some additional documentation for this specific operation.

    Just updating this thread with the answer...my request body was malformed - I was using an array of objects for the operations field instead of a complex object representing the collection.
    So instead of this:
    "operations": [
    ...I should have been using this:
    "operations": {
        "value": [
    You can see the correctly formatted payload in the sample response body returned from the
    GetApi endpoint.
    Thanks to Vlad for pointing out my mistake!
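    Putting that correction together with the original request body, the payload would look roughly like this (the fields are the same as in the question; only the shape of operations changes, so double-check the rest against the GetApi sample response):

    {
        "name": "Customer",
        "description": "This API is used to manage customers",
        "serviceUrl": "http://www.somewebsite.com/customers",
        "path": "customers",
        "protocols": ["http", "https"],
        "operations": {
            "value": [
                {
                    "id": "/apis/123456/operations/a04b16da-29b7-44f5-95a3-25603dbc9b6d",
                    "name": "Customer (Get)",
                    "description": "Returns information about the customer with the specified ID.",
                    "method": "GET",
                    "urlTemplate": "/customers/{customerId}"
                }
            ]
        }
    }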

  • Enabling Profiling for deployments via the REST API

    We have an application which is being deployed via the REST API. Our logging has captured some errors related to memory consumption that we'd like to examine using profiling. All the examples of profiling in Azure that I can find discuss deployment via Visual Studio, which is not feasible in this instance.
    Is there a way to enable profiling via the REST API, and if so, are there any examples or documentation for this? We're using Linq2Azure to access the APIs, but we can extend it if necessary.
    Thanks
    Colin

    Hi,
    Yes, we can choose to enable profiling when we publish an application via Visual Studio. From my experience, this feature is provided by the Visual Studio profiler, so if we don't use Visual Studio to deploy the application it is hard to enable profiling.
    Refer to
    http://msdn.microsoft.com/en-us/library/windowsazure/hh369930.aspx#BK_ProfilingCloudService for more information.
    Best Regards

  • Search REST API - Usage reports

    Hi,
    We use the Search REST API to execute queries. 
    The Usage reports on the Site Collection under Popularity and Search Reports do not contain any data.
    Is there a way to get usage data for queries executed by the REST API?
    Best regards,
    Christoffer Vig

    You suggest running queries directly against the SP log database? I see it described at http://technet.microsoft.com/en-us/library/hh769359(v=office.14).aspx as well.
    I always thought this was strictly prohibited in a production environment and would void the warranty, but is the logging database an exception to this?
    Thanks,
    Mikael Svenson
    Search Enthusiast - SharePoint MVP/MCT/MCPD - If you find an answer useful, please up-vote it.
    http://techmikael.blogspot.com/
    Author of Working with FAST Search Server 2010 for SharePoint

  • ADF BC: integrating REST API

    Using JDev 11g PS4
    The model of my application has a mix of different sources. Some entities are stored in my database.
    Other entities are coming from a call to an external REST API.
    I am wondering what the best solution is to integrate those REST API objects into my model project, because I don't want to expose them as a REST API to my view. In my view project, I want to hide the complexity of those APIs by exposing data controls.
    Now, what is the best way?
    1) Create POJOs and create data controls out of them. These POJOs will have a signature like this:
    public class Foo {
       public List<FooObject> getFoos();
       public void addFoo(FooObject foo);
       public void removeFoo(FooObject foo);
    }
    This will be exposed to my view so I can easily use it in my view project.
    2) Create view objects to map the entities from my REST API and implement the REST API calls in the custom implementation of the view objects.
    3) ...
    Any other suggestion is more than welcome.

    Hi,
    if the rest of your model is ADF BC then it would feel natural to use programmatic view objects to access the REST service. However, if the data is unrelated, then using a POJO DC also appears to be a solution. For simple REST requests that return XML-formatted data, you could use the URL Data Control, which in 11g R1 allows read access and in 11.1.2 supports the full REST CRUD cycle.
    Frank

  • How to generate the User-Input XML Body for executing workflows via REST APIs: The Solution

    Executing a workflow via the REST APIs requires a lot of work just to prepare the right user-input XML body. Any mistake means major debugging, and the larger the number of user inputs, the bigger the problem. Life is much easier in the WFA GUI, with display names and tooltip help for user inputs that make it easy to read and provide the right values; you have none of those aids when preparing the user-input XML body by hand.
    It has been asked numerous times how to provide user-input values for the Table or Query (Multi-Select) types. These are complex user-input types with a lot of scope for mistakes. The WFA GUI also resolves user-input dependencies for you, whereas in a hand-written XML body you must take care of them yourself.
    An operator is allowed to execute workflows, but the display names that help him pick the right user inputs make it impossible for him to prepare the user-input XML body: display names can't be used in the XML body, and he can't deduce the exact parameter names from them. So he always has to contact the Admins/Architects, and the Architects/Admins can't be expected to keep preparing user-input XML bodies for every operator.
    What if I could enter all the user-input values in the WFA GUI, run a preview that passes to my satisfaction, and then simply get the XML body to use for executing the workflow via the REST APIs from any client? That would be far easier than building the user-input XML body manually, and that is exactly what this post gives you: open WFA in the browser, go to your workflow, start execution, enter the values from the GUI while reading the display names, preview to your satisfaction, and then get the XML body.
    Assume your workflow is called "Workflow to Print a given Message". It is a simple workflow with only one user input, displayed as "Message to Print".
    Prerequisites (one-time):
    1. You need PowerShell 3.0 on your WFA server.
    2. Import the attached Generate_Workflow_User_Input_Body_in_XML.dar into your WFA. It contains the command "Generate Workflow User Input Body in XML".
    3. Add credentials of a WFA Admin/Architect in WFA itself with: Match: Exact, Type: Other, Name/IP: localhost, Username: <WFA Admin/Architect Username>, Password: <User Password>.
    Steps:
    1. Suppose you have a workflow called "Workflow to Print a given Message" that you want to execute from the REST APIs, so you need to prepare the user-input XML body. Select this workflow and clone it. The clone is an exact copy of your original workflow, word for word and input for input, and will open in edit mode with the name "Workflow to Print a given Message - copy".
    2. Add the command "Generate Workflow User Input Body in XML" at the beginning of the cloned workflow. This is a must: it needs to be the first command in the clone.
    3. The command requires no input, so for its parameters just press OK and save the workflow.
    4. Execute the cloned workflow. You'll see all the user inputs available to you; make your choices as you wish, preview to confirm that planning passes with no errors, then execute.
    5. The command "Generate Workflow User Input Body in XML" will fail in the clone's execution. Don't worry, that is by design, and it fails only after producing what we really wanted: the user-input XML body for the real workflow. It is displayed in the GUI and also saved on your WFA server at C:\temp\<workflow_name_dd_MM_yyyy_hh_mm_ss_.xml. The command also deletes all the reservations of this failed job, so no major residue is left to clean up.
    To summarize: clone your workflow and add the command "Generate Workflow User Input Body in XML" as its first command. Start execution, provide your user inputs and preview it; be satisfied and press OK, then execute. After a few seconds the cloned workflow will fail with the error "All done. The Workflow will fail now." Look at the command execution logs for this command: you'll see the user-input XML body, and the XML file has also been saved at C:\temp on your WFA server.
    Have fun.
    sinhaa

    Providing a new version 1.1.0 of the command "WFA Schedular". Changes made:
    - Added a conditional string representation based on the scheduling parameter provided.
    - Added a check for the right number of parameters passed into the command.
    - Added a new parameter "Expiry Date" to automatically stop the recurring execution upon expiry.
    - Added a check for PowerShell 3.0 in the code.
    Have fun!! sinhaa
    Example below: schedule a workflow for recurring execution every alternate day, i.e. once every 2 days at 10:30 PM starting 06-Jul-2015 (today's date is 02-Jul-2015). The recurring workflow execution should expire on 31-Dec-2015 and stop.

  • REST API sync issue

    Hi expert,
    I created an Excel file in a SharePoint 2013 Excel library and pasted the URL into an MS Word IncludePicture quick part. In MS Word I can see the Excel chart, but when I update the chart and publish it to SharePoint again, the Word file does not change.
    Does anybody know how to sync the two?
    The URL looks like this:
    http://www.sharepointsite.com/_vti_bin/excelrest.aspx/Excel%20Library/TeamTasks_data.xlsx/model/charts('Task%20Status')?$format=image
    Thanks
    James Liang

    Hi,
    There’s a setting in your Trusted File Locations (in the configuration of the Excel Service Application) that you have to check, in order to have the REST API update the connections.
    http://www.sharepointblogs.be/blogs/vandest/archive/2014/02/20/excel-rest-api-not-refreshing-data.aspx
    If the issue still exists, please check whether you have selected "Data not stored with document" in the "Field options".
    http://blogs.office.com/2009/11/09/excel-services-in-sharepoint-2010-rest-api-examples/
    Best Regards
    Dennis Guo
    TechNet Community Support
