Paging results from an OpenSearch source

Hi,
I have created a result source and connected it to an ASPX page I built, which takes a search phrase as a parameter and renders the results in the OpenSearch 1.1 protocol.
I'm displaying the results in a search center with the OOTB search web parts.
Basically everything works fine except for two things:
1. Paging of the search results doesn't work. When I click a page in the paging links I can see the request arrive at my ASPX page, but it contains no indication of the requested page.
2. Results from the result source I created are dropped if it takes more than 15 seconds to render them. Is there any way to change this interval? From the log I can see it's a web request timeout, but I couldn't find where to modify it.
Thanks in advance,
Yuval

I'm not sure about paging, but it doesn't look like the Core Results Web Part allows you to set the timeout limit. If you were returning the results via code, you could use the following property:
http://msdn.microsoft.com/en-us/library/microsoft.office.server.search.query.querymanager.timeout(v=office.15).aspx?cs-save-lang=1&cs-lang=csharp#code-snippet-1
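For the paging part, it may be worth checking whether your result source's connection URL template passes the OpenSearch 1.1 paging parameters (for example {startIndex} and {count}) in addition to {searchTerms}; if the template only carries the search phrase, the request reaching your ASPX page will contain no page information.
If you do end up issuing the query from your own code instead of the OOTB web part, a minimal sketch of controlling both the timeout and the paging offset could look like this (assumptions: SharePoint 2013 server object model called from PowerShell, placeholder site URL and query text; the QueryManager class from the link above exposes the equivalent Timeout property):
if(-not (Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue)){Add-PSSnapin "Microsoft.SharePoint.PowerShell"}
$site  = Get-SPSite "http://sharepoint"                                      # placeholder site URL
$query = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($site)
$query.QueryText = "test phrase"                                              # placeholder query
$query.Timeout   = 60000                                                      # query timeout in milliseconds
$query.StartRow  = 10                                                         # paging offset (first row to return)
$query.RowLimit  = 10                                                         # rows per page
$executor = New-Object Microsoft.Office.Server.Search.Query.SearchExecutor
$results  = $executor.ExecuteQuery($query)                                    # ResultTableCollection; inspect the RelevantResults table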
Dimitri Ayrapetov (MCSE: SharePoint)

Similar Messages

  • Can we merge data from multiple sources in Hyperion Interactive Reporting ?

    Hi Experts,
    Can we merge data from multiple sources in Hyperion Interactive Reporting? For example, can we have a report based on DB2,
    Oracle, Informix,
    and multidimensional databases like DB2, MSOLAP and Essbase?
    Thanks,
    V

    Yes. Each source has its own query, sharing some common dimension so the Results sections can be joined together in a final query.
    Look in the help for Creating Local Joins.

  • ADFS SSO and SharePoint 2013 on-premise Hybrid outbound search results from SharePoint Online - does it work?

    Hi, 
    I want to set up outbound hybrid search from SharePoint 2013 on-premise to SharePoint Online,
    but I'm not sure if this works with ADFS SSO.
    Does anybody have experience with this setup?
    Here's my guide which I'm going to use for this installation:
    Introduction
    In this post I'll show you how to get search results from your SharePoint Online in your SharePoint 2013 on-premise search center.
    Requirements
    User synchronisation from Active Directory to Office 365 with DirSync
    DirSync password sync or ADFS SSO
    SharePoint Online
    SharePoint 2013 on-premise
    Enterprise Search service
    SharePoint Online Management Shell
    Instructions
    All configuration will be done either in the Search Administration of the Central Administration or in the PowerShell console of your on-premise SharePoint 2013 server.
    Set up Server to Server Trust
    Export certificates
    To create a server to server trust we need two certificates.
    [certificate name].pfx: In order to replace the STS certificate, the certificate is needed in Personal Information Exchange (PFX) format including the private key.
    [certificate name].cer: In order to set up a trust with Office 365 and Windows Azure ACS, the certificate is needed in CER Base64 format.
    First launch the Internet Information Services (IIS) Manager
    Select your SharePoint web server and double-click Server Certificates
    In the Actions pane, click Create Self-Signed Certificate
    Enter a name for the certificate and save it with OK
    To export the new certificate in the Pfx format select it and click Export in the Actions pane
    Fill in the fields and click OK. Export to: C:\[certificate name].pfx, Password: [password]
    We also need to export the certificate in the CER Base64 format. To do that, right-click the certificate and click View...
    Click the Details tab and then click Copy to File
    On the Welcome to the Certificate Export Wizard page, click Next
    On the Export Private Key page, click Next
    On the Export File Format page, click Base-64 encoded X.509 (.CER), and then click Next.
    As the file name enter C:\[certificate name].cer and then click Next
    Finish the export
    Import the new STS (Security Token Service) certificate
    Let's update the certificate on the STS. Configure and run the PowerShell script below on your SharePoint server.
    if(-not (Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue)){Add-PSSnapin "Microsoft.SharePoint.PowerShell"}
    # set the certificate paths and password
    $PfxCertPath = "c:\[certificate name].pfx"
    $PfxCertPassword = "[password]"
    $X64CertPath = "c:\[certificate name].cer"
    # get the encrypted pfx certificate object
    $PfxCert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $PfxCertPath, $PfxCertPassword, 20
    # import it
    Set-SPSecurityTokenServiceConfig -ImportSigningCertificate $PfxCert
    Type Yes when prompted with the following message.
    "You are about to change the signing certificate for the Security Token Service. Changing the certificate to an invalid, inaccessible or non-existent certificate will cause your SharePoint installation to stop functioning. Refer to the following article for instructions on how to change this certificate: http://go.microsoft.com/fwlink/?LinkID=178475. Are you sure, you want to continue?"
    Restart IIS so STS picks up the new certificate.
    & iisreset
    & net stop SPTimerV4
    & net start SPTimerV4
    Now validate the certificate replacement by running several PowerShell commands and compare their outputs.
    # set the certificate paths and password
    $PfxCertPath = "c:\[certificate name].pfx"
    $PfxCertPassword = "[password]"
    # get the encrypted pfx certificate object
    New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $PfxCertPath, $PfxCertPassword, 20
    # compare the output above with this output
    (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
    Establish the server to server trust
    if(-not (Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue)){Add-PSSnapin "Microsoft.SharePoint.PowerShell"}
    Import-Module MSOnline
    Import-Module MSOnlineExtended
    # set the certificate paths and password
    $PfxCertPath = "c:\[certificate name].pfx"
    $PfxCertPassword = "[password]"
    $X64CertPath = "c:\[certificate name].cer"
    # set the onpremise domain that you added to Office 365
    $SPCN = "sharepoint.domain.com"
    # your onpremise SharePoint site url
    $SPSite="http://sharepoint"
    # don't change this value
    $SPOAppID="00000003-0000-0ff1-ce00-000000000000"
    # get the encrypted pfx certificate object
    $PfxCert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $PfxCertPath, $PfxCertPassword, 20
    # get the raw data
    $PfxCertBin = $PfxCert.GetRawCertData()
    # create a new certificate object
    $X64Cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2
    # import the base 64 encoded certificate
    $X64Cert.Import($X64CertPath)
    # get the raw data
    $X64CertBin = $X64Cert.GetRawCertData()
    # save base 64 string in variable
    $CredValue = [System.Convert]::ToBase64String($X64CertBin)
    # connect to Office 365
    Connect-MsolService
    # register the on-premise STS as service principal in Office 365
    # add a new service principal
    New-MsolServicePrincipalCredential -AppPrincipalId $SPOAppID -Type asymmetric -Usage Verify -Value $CredValue
    $MsolServicePrincipal = Get-MsolServicePrincipal -AppPrincipalId $SPOAppID
    $SPServicePrincipalNames = $MsolServicePrincipal.ServicePrincipalNames
    $SPServicePrincipalNames.Add("$SPOAppID/$SPCN")
    Set-MsolServicePrincipal -AppPrincipalId $SPOAppID -ServicePrincipalNames $SPServicePrincipalNames
    # get the online name identifier
    $MsolCompanyInformationID = (Get-MsolCompanyInformation).ObjectID
    $MsolServicePrincipalID = (Get-MsolServicePrincipal -ServicePrincipalName $SPOAppID).ObjectID
    $MsolNameIdentifier = "$MsolServicePrincipalID@$MsolCompanyInformationID"
    # establish the trust from on-premise with ACS (Azure Access Control Service)
    # add a new authentication realm
    $SPSite = Get-SPSite $SPSite
    $SPAppPrincipal = Register-SPAppPrincipal -site $SPSite.rootweb -nameIdentifier $MsolNameIdentifier -displayName "SharePoint Online"
    Set-SPAuthenticationRealm -realm $MsolServicePrincipalID
    # register the ACS application proxy and token issuer
    New-SPAzureAccessControlServiceApplicationProxy -Name "ACS" -MetadataServiceEndpointUri "https://accounts.accesscontrol.windows.net/metadata/json/1/" -DefaultProxyGroup
    New-SPTrustedSecurityTokenIssuer -MetadataEndpoint "https://accounts.accesscontrol.windows.net/metadata/json/1/" -IsTrustBroker -Name "ACS"
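    (Optional sanity check, not part of the original guide: read back what the script above just registered.)
    # confirm the authentication realm and the ACS token issuer created above
    Get-SPAuthenticationRealm -ServiceContext $SPSite
    Get-SPTrustedSecurityTokenIssuer | Select-Object Name, RegisteredIssuerName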
    Add a new result source
    To get search results from SharePoint Online we have to add a new result source. Run the following script in a PowerShell ISE session on your SharePoint 2013 on-premise server. Don't forget to update the settings region.
    if(-not (Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue)){Add-PSSnapin "Microsoft.SharePoint.PowerShell"}
    # region settings
    $RemoteSharePointUrl = "http://[example].sharepoint.com"
    $ResultSourceName = "SharePoint Online"
    $QueryTransform = "{searchTerms}"
    $Provider = "SharePoint-Remoteanbieter"   # localized display name of the remote SharePoint provider
    # region settings end
    $SPEnterpriseSearchServiceApplication = Get-SPEnterpriseSearchServiceApplication
    $FederationManager = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($SPEnterpriseSearchServiceApplication)
    $SPEnterpriseSearchOwner = Get-SPEnterpriseSearchOwner -Level Ssa
    $ResultSource = $FederationManager.GetSourceByName($ResultSourceName, $SPEnterpriseSearchOwner)
    if(!$ResultSource){
    Write-Host "Result source does not exist. Creating..."
    $ResultSource = $FederationManager.CreateSource($SPEnterpriseSearchOwner)
    $ResultSource.Name = $ResultSourceName
    $ResultSource.ProviderId = $FederationManager.ListProviders()[$Provider].Id
    $ResultSource.ConnectionUrlTemplate = $RemoteSharePointUrl
    $ResultSource.CreateQueryTransform($QueryTransform)
    $ResultSource.Commit()
    }
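    (Side note, not part of the original script: if the provider lookup above returns nothing, remember that the display names are localized - "SharePoint-Remoteanbieter" is the German name of the remote SharePoint provider. Assuming ListProviders() returns the same name-keyed collection the script indexes, you can list what your farm calls them:)
    # list the federation provider display names available on this farm
    $FederationManager.ListProviders().Keys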
    Add a new query rule
    In the Search Administration click on Query Rules
    Select Local SharePoint as Result Source
    Click New Query Rule
    Enter a rule name, e.g. Search results from SharePoint Online
    Expand the Context section
    Under Query is performed on these sources click on Add Source
    Select your SharePoint Online result source
    In the Query Conditions section click on Remove Condition
    In the Actions section click on Add Result Block
    As title enter Results for "{subjectTerms}" from SharePoint Online
    In the Search this Source dropdown select your SharePoint Online result source
    Select 3 in the Items dropdown
    Expand the Settings section and select "More" link goes to the following URL
    In the box below enter this Url https://[example].sharepoint.com/search/pages/results.aspx?k={subjectTerms}
    Select This block is always shown above core results and click the OK button
    Save the new query rule

    Hi  Janik,
    According to your description, my understanding is that you want to display hybrid search results in SharePoint Server 2013.
    For achieving your demand, please have a look at the article:
    http://technet.microsoft.com/en-us/library/dn197173(v=office.15).aspx
    If you are using single sign-on (SSO) authentication, it is important to test hybrid Search functionality by using federated user accounts. Native Office 365 user accounts and Active Directory Domain Services
    (AD DS) accounts that are not federated are not recognized by both directory services. Therefore, they cannot authenticate using SSO, and cannot be granted permissions to resources in both deployments. For more information, see Accounts
    needed for hybrid configuration and testing.
    Best Regards,
    Eric
    Eric Tao
    TechNet Community Support

  • Search has encountered a problem that prevents results from being returned. If the issue persists, please contact your administrator.

    Hello Guys,
    I am creating a result source. If I create it from Central Admin it works fine, but if I create the result source from a PowerShell script it shows me the following error message.
    An exception of type 'Microsoft.Office.Server.Search.Query.InternalQueryErrorException' occurred in Microsoft.Office.Server.Search.dll but was not handled in user code
    Additional information: Search has encountered a problem that prevents results from being returned.  If the issue persists, please contact your administrator.
    Any suggestions?
    Thanks in Advance.

    Hi,
    Please provide more specific information about the issue. What type of content source did you try creating via PowerShell?
    Make sure you are using the appropriate permissions and search service application.
    Here is the reference for creating a content source via script:
    http://technet.microsoft.com/en-us/library/ff607867(v=office.15).aspx
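    Since the original post is about a result source (rather than a content source), one thing worth comparing against your script is the search object owner level: result sources created in Central Administration under the Search Service Application live at the SSA level, so a script has to create them against the matching owner. A minimal sketch, not your actual script (names are placeholders, and the provider display name depends on the farm language):
    if(-not (Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue)){Add-PSSnapin "Microsoft.SharePoint.PowerShell"}
    $ssa   = Get-SPEnterpriseSearchServiceApplication
    $owner = Get-SPEnterpriseSearchOwner -Level Ssa      # use -Level SPSite with -SPWeb for a site-scoped source
    $fm    = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)
    $rs = $fm.CreateSource($owner)
    $rs.Name = "My Result Source"                         # placeholder name
    $rs.ProviderId = $fm.ListProviders()["Local SharePoint Provider"].Id   # display name varies with the farm language
    $rs.CreateQueryTransform("{searchTerms}")
    $rs.Commit()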
    Regards,
    Rebecca Tu
    TechNet Community Support

  • NW 7.3 + iView from remote source wizard + R3 Web Dynpro ABAP, not working

    Hi,
    I think this forum is my last resort.
    I'm running NetWeaver 7.3 Enterprise Portal and also have an existing ECC6 on backend.
    Our ABAP programmer has built some WD4A applications on ECC, and I'd like to make them available
    on EP as ABAP Web Dynpro iViews.
    From the document, I know the way to create ABAP WD iView is different since 7.3. I have to use "iView from remote source" to do it.
    On EP side, I have made an system alias (test ok) for ECC, assigned permission, provide mapping id/pwd, and test. 
    On ABAP side, all webdynpro app are activated (T-code sicf, /sap/bc/webdynpro, /sap/public/bc/webdynpro...). I can even test WAS by bringing up browser and access some sample ABAP webdynpro .
    However, I cannot succeed with the "iView from remote source" wizard; it always shows me "Nothing found. Check your search criteria.", even though I use "*" as the search criterion.
    Has anyone seen a similar situation? I'm even beginning to suspect it is a placeholder for a feature that doesn't work yet. Has anyone made it work on NW 7.3?
    More detail:
    My System Alias for ECC:   
          Application Host: myr3.mydomain.com
          SAP Client: 600
          SAP System ID:   DEV
          SAP System Number: 00
          Server Port:  <leave it blank>
          System Type: SAP_R3
          Logon Method:  UIDPW
          User Mapping Type: admin,user
          ICM Host: myr3.mydomain.com:8000
          ICM Protocol: http
          ICM URL prefix: <leave it blank>
          SAP Netweaver AS Description: <leave it blank>
    All other fields are left at their default values.
    I also assigned a group "read" and "user" permission to this system
    and assigned an alias to the system.
    I also created a user belonging to the group and assigned an ECC user ID & password in his system mapping.
    I tested the system alias and it is working (both the Web AS test and the R3 test).
    Then I tried to create an iView on EP, under a PCD folder: New -> iView -> iView from remote source. A wizard shows up. I can select the ECC alias just created, provide * as the application search criterion, and then select "WEB DYNPRO ABAP" as the application type. Then "GO".
    As I mentioned, it always shows "Nothing found. Check your search criteria".
    But I'm sure the web dynpro applications are activated on R3.
    OK, that's it, please help this one. Thanks
    IF YOU DID MAKE IT WORK ON YOUR NW 7.3, PLEASE DROP A LINE HERE SO I KNOW IT IS MY OWN PROBLEM !!!
    Edited by: Wilson KU on Nov 14, 2011 10:42 AM
    Edited by: Wilson KU on Nov 14, 2011 10:46 AM

    Thanks for the reply,
    [About Backend]
    I feel confident about the backend R3 system because I can use the WAS URL to access the sample WD4A applications; for example, I can bring up the WD4A application UI in my browser. It is purely browser and R3 stuff, no EP involved.
    http://soeprdev.mydomain.com:8000/sap/bc/webdynpro/bobf/demo_sales_order?sap-client=600&sap-language=ZF
    I also already activate all nodes and sub-nodes (in SICF Services) under /sap/bc/webdynpro as well as /sap/public/bc/webdynpro. (and some other nodes to make icon and include work)
    [About the system object in EP]
    I have done all kind of test on the system object (alias):
    1. In EP System Administration -> Landscape Configuration, I select the system object, click "Test Connection", It shows me a Check mark, and the detail message is SOE_DEV_600 connection
    2. Edit the System object, then click on "Connection Tests" page, I see two lines there, check both and click test, here is the result:
    SAP Web AS Connection:
    : Test Details:
    The test consists of the following steps:
    1. Check the validity of the system ID in the system object
    2. Check if the system can be retrieved
    3. Check if the system object has a valid system alias
    4. Check if an SAP system is defined in the system object
    5. Validate the following parameters: Web AS protocol; Web AS hostname
    6. Checks if the host name of the server can be resolved.
    7. Pings the Web AS Ping service (works only if the service is activated on the Web AS, and only on ABAP Web AS)
    8. Check HTTP/S connectivity to the defined back-end application
    Results
    1. The system ID is valid
    2. System retrieved successfully
    3. Retrieval of the default alias was successful
    4. The system object represents an SAP system
    5. The following parameters are valid: ICM Protocol (http) ICM Host Name (soeprdev.mydomain.com:8000)
    6. The host name (soeprdev.mydomain.com) was resolved successfully
    7. The Web AS ping service http://soeprdev.mydomain.com:8000/sap/bc/ping was pinged successfully
    8. An HTTP/S connection to http://soeprdev.mydomain.com:8000 was obtained successfully.
    Connection Test for Connectors:
    : Test Details:
    The test consists of the following steps:
    1. Retrieve the default alias of the system
    2. Check the connection to the back-end application using the connector defined in this system object
    Results
    Default alias retrieved successfully
    Connection successful

  • Any Tutorial / Sample to create Single PDF from multiple source files using PDF assembler in a watched folder process.

    Any tutorial / sample to create a single PDF from multiple source files using the PDF Assembler in a watched folder process? I have a client application which will prepare a number of source files and some metadata (in .XML) which will be used in the header/footer. Is it possible to put a run-time-generated DDX file in the watched folder and use it in the process? If possible, how can I pass the file names in the DDX? Any sample process will be very helpful.

    If possible, make use of the Assembler API in your client application instead of doing this with a watched folder. Here are the Assembler samples: LiveCycle ES2.5 * Programming with LiveCycle ES2.5
    A watched folder can accept zip files (sample: Configuring a watched folder to handle multiple input files and write results to a single folder | Adobe LiveCycle Blog). You can also use an Execute Script step to create the DDX at runtime: LiveCycle ES2 * Application Development Using LiveCycle Workbench ES2
    Thanks
    Wasil

  • Could not read from the source. Please check if it has moved or been deleted

    Hi all,
    when I try to render an AFX project from Media Encoder I get the following message:
    "Could not read from the source. Please check if it has moved or been deleted"
    I've read on the forum that this is due to an old installation of Premiere,
    but I'm using a brand new computer; the only software installed is my CS5 Master Collection and the updates from the Adobe website.
    win7 pro
    i7
    16 gig ram
    cheers
    s

    Hello, this is a terrible problem, which I found in the CS6 software...
    The only working solution I found is to uninstall and reinstall the full package. But that is not all:
    you need to create a BRAND NEW admin account in Windows and install it there.
    That means I could never again export from Encoder in my original account after the repair (!!). This is a really terrible way to repair this issue, because:
    1. By reinstalling the software, the client WASTES HIS TIME.
    2. By having to begin work in another Windows profile you again WASTE YOUR TIME learning and migrating all the profile modifications, which I find really unacceptable. Adobe says this repair solution is OK, and to this day they have not taken any steps towards creating a "clever" solution.
    I ask everybody who meets this issue in the future: guys, please complain about this situation, file a "Bug Report" and write a "feature request" for a repair tool that checks the "broken" connections between Encoder and Premiere (which make Encoder refuse to "take material" from it and encode) and REPAIRS them automatically.
    I am not in IT, but... does it seem so hard to create this? Adobe's developers should know their systems and should be able to create such a utility tool really easily.
    Steps to reproduce bug:
    1. i export anything by button "queue" from premiere to Encoder
    2. Encoder will start encoding
    3. Encoder does not show the window of media encoding (down left )
    Results: a sheep sound occurs,
    and in the encoding error log this is given as the reason for cancelling the encoding:
    01/02/2014 10:10:48 AM : Encoding Failed
    Could not read from the source. Please check if it has moved or been deleted.
    History of this problem and a detailed description of HOW I "repaired" it, wasting roughly 2.5 days of my working time:
    1. After the "error 5" problem, which I had solved by reinstalling the suite from a new admin user profile (profile B),
    I continued my work on my normal working Windows profile (profile A).
    Everything (AE + Pr cooperation, exporting media via "queue" to Encoder) was working fine...
    2. Suddenly it stopped working (for no apparent reason - I did not install anything)
    and showed this in the error export log file:
    "Could not read from the source. Please check if it has moved or been deleted."
    3. Repair via procedure (procedure "a"):
    I did this procedure on profile B (the profile from the last installation, used to repair the error 5 problem).
    I did these steps:
    a - uninstall the Master Collection suite
    b - run the Adobe Cleaner Tool (remove ALL)
    c - remove the leftover directories in these locations
    •C:\Program Files\Adobe
    •C:\Program Files(x86)\Adobe
    •C:\Program Files\Common Files\Adobe
    •C:\Program Files(x86)\Common Files\Adobe
    •C:\ProgramData\Adobe
    d - remove these keys from the registry
    •HKEY_LOCAL_MACHINE\SOFTWARE\Adobe
    •HKEY_CURRENT_USER\Software\Adobe
    •HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Adobe
    •HKEY_CURRENT_USER\Software\Wow6432Node\Adobe
    e - restart the PC
    f - install the Master Collection CS6 fresh
    g - update the software
    Result of repair "3": the problem still exists.
    4. OK, after coordinating with support I found out that ANOTHER NEW admin account should have been created.
    4a: So I did the same (uninstall) procedure in profile B,
    4b: then created a brand new admin profile (profile C) for the INSTALLATION of the software,
    4c: and restarted the PC (without updating it yet).
    Result:
    == exporting any sequence/raw/AE-linked video material from Premiere via "queue" (Encoder) in profile C: export WORKS
    == exporting any sequence/raw/AE-linked video material from Premiere via "queue" (Encoder) in profile B: export WORKS
    == exporting any sequence/raw/AE-linked video material from Premiere via "queue" (Encoder) in profile A: export DOES NOT WORK ! ! !
    (in profile A it is possible to export raw video material that has been imported into Encoder via drag and drop)
    The problem as I see it: I have my basic profile A, in which I want to keep working, because all my directory modifications are there.
    The issue appears to be some "broken" connection between Encoder and Premiere, which makes Encoder refuse to "take material" from it and encode.
    What I expect:
    to get from Adobe a repair tool which automatically checks these connections and repairs them if necessary, without having to create a new profile and reinstall the whole software. This is madness!
    What I do NOT expect from Adobe:
    advice of the kind: "you have to reinstall the full software in a new admin profile; sorry, we do not know the solution, because we do not know how our software behaves."

  • Can't open pdf from single source & 9.3 won't install

    I receive a weekly update from a source. Until Jan. 22 I had no trouble
    accessing it. Now when I click the link it goes to the link's site, but no PDF appears, just a normal site page
    without the Acrobat background. I am on the Vista platform with IE8. I can access this link from my laptop. I also tried to upgrade from 9.0 to 9.3. The download finished, but the installation could not be completed; it said there were files open that were being used. I had no other files open. I am very frustrated and
    I am not experienced enough to know what to do.

    To re-install iPhoto
    1. Put the iPhoto.app in the trash (Drag it from your Applications Folder to the trash)
    2a: On 10.5:  Go to HD/Library/Receipts and remove any pkg file there with iPhoto in the name.
    2b: On 10.6: Those receipts may be found as follows:  In the Finder use the Go menu and select Go To Folder. In the resulting window type
    /var/db/receipts/
    2c: on 10.7 they're at
    /private/var/db/receipts
    A Finder Window will open at that location and you can remove the iPhoto pkg files.
    3. Re-install.
    If you purchased an iLife Disk, then iPhoto is on it.
    If iPhoto was installed on your Mac when you got it, then it's on the System Restore disks that came with your Mac. Insert the first one and opt to 'Install Bundled Applications Only'.
    If you purchased it on the App Store or have a Recent Mac you can find it in your Purchases List.

  • Generating multiple target xmls from one source xml using xslt mappings

    Hi,
    I need to create more than one XML file from one source XML file using XSLT mappings in a file-to-file scenario.
    Can you please let me know how this can be achieved?
    Thanks,
    Rajesh

    Rajesh,
    If you must use the XSL Transformation then you can find a nice simple example here.  It's based on the Xalan XSLT Processor which to my knowledge is incorporated in PI7.1.  I've not actually tried this but it makes for an interesting mapping case so please let us know the results: 
    [XSLT Split for multiple XML file output|http://abbeyworkshop.com/howto/xslt/xslt_split/index.html]
    The XSL file will require a namespace addition:
    <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:redirect="http://xml.apache.org/xalan/redirect" extension-element-prefixes="redirect" version="1.0">
    The redirect prefix is used for the write tags in the XSL file.
    The details cover the transformation of the source file:
    <student_list>
        <student id="1">
            <name>George Washington</name>
            <major>Politics</major>
            <phone>312-123-4567</phone>
            <email>gw_at_example.edu</email>
        </student>
        <student id="2">
            <name>Janet Jones</name>
            <major>Undeclared</major>
            <phone>311-122-2233</phone>
            <email>janetj_at_example.edu</email>
        </student>
        <student id="3">
            <name>Joe Taylor</name>
            <major>Engineering</major>
            <phone>211-111-2333</phone>
            <email>joe_at_example.edu</email>
        </student>
    </student_list>
    Using this transformation:
    <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:redirect="http://xml.apache.org/xalan/redirect"
        extension-element-prefixes="redirect"
        version="1.0"
    >
    <xsl:output method="xml"/>

    <xsl:template match="/">
        <xsl:apply-templates />
    </xsl:template>

    <xsl:template match="student_list">
        <xsl:apply-templates />
    </xsl:template>

    <xsl:template match="student">
        <xsl:variable name="filename" select="concat(@id,'.xml')" />
        <redirect:write select="$filename">
            <student id="{@id}">
                <xsl:apply-templates />
            </student>
        </redirect:write>
    </xsl:template>

    <xsl:template match="name | major | phone | email">
        <xsl:copy-of select="." />
    </xsl:template>

    </xsl:stylesheet>

  • Multiple Output formats from Single Source File? (Like Squeeze)

    I'd really love to be able to batch process encodes in the following manner:
    Drop my source video file into AME CS5, select an MP4 preset, and then have it encode multiple bitrate versions while adding filename extensions [e.g. _High (700 kb/s), _Mid (550 kb/s), _Low (400 kb/s)].
    The simple answer is to drop, or duplicate, my input clip 3 times and just select 3 presets I could set up under the Hi, Mid, Lo parameters. But that's exactly what I'm trying to work around. I encode video ads for major web-video sites, and the volume is only manageable because I can batch process these (drag and drop large quantities, select multiple presets at once, hit Start). Handling each ad we service individually would be far too time consuming, even with how simple AME makes it to duplicate and choose a new preset. (In the end, our ads jump onto our FTP via AME's FTP upload option, one of its smartest features!)
    We have quite a bit of encoding resources here at work, but AME has been giving us the most favorable results. Other options, such as Sorenson Squeeze, let you import your source clips, then apply 2 or more presets to them, before encoding the whole batch. Is there any similar functionality in Media Encoder? (I really don't want to move our workflow into Squeeze, with its inferior MP4 encoding.)
    Does anybody have any experience with this sort of high-volume multiple-outputs from individual source files? Any tips with scripts or Apple's "Automator" that could streamline this type of batch processing?

    Any update on this ability?  We create many in house videos that need to be encoded to 14 different
    bitrates for use with Flash Media Server as dynamic http streams.
    Currently when I am ready to export a finished sequence, I will pick my first preset and queue it in AME.  Then I duplicate that thirteen times, setting each of the new thirteen queued items to their appropriate bitrates.  Then I have to change each of the output names to be "filename_bitrate.flv".
    This process is much slower when queued in AME than if I exported each individually from PP.  I just don't have the time to manually export each version.
    I have also started noticing that some of the last few projects won't render beyond the quality of the first queued item.  Do I need to render the largest bitrate file first?
    Any indication from Adobe on the correct workflow to create multiple bitrate files to be consumed by FMS as dynamic http streams would be appreciated.
    The link above is dead.  Does anyone have an updated link to the document above?

  • Referring to Automator Results from Previous Steps

    Is it possible to refer to the results from previous steps in an Automator Workflow? Specifically, I'm trying to modify my backup Workflow to remove files from a folder before archiving it. I'm writing code and I want to back up my progress every so often. I have a decent workflow for archiving my entire project folder, but my executable is 21 MB whereas the source code is 0.5 MB. I could cut the size of my archive dramatically if I could remove the executable file before archiving (plus it's obviously non-portable and redundant anyhow). I'm running into 2 issues when trying to create a Workflow to handle this:
    1) Find/Filter Finder Items rely on Spotlight meta tags. Since I'm constantly recreating the executable, it appears Spotlight generally doesn't have the executable indexed so these two functions don't work. They simply don't see the executable.
    2) Even if I find the file and delete it, I don't know how to select the folder again to archive it. I'm trying to keep this general so it'll work on any project folder so I don't want to point it directly to the folder.


  • Return results from ADEP in AS3 Arrays, not ArrayCollections

    Hi all,
    Is there any way to force ADEP to return results from data services in simple AS3 Arrays, not ArrayCollections? Here is my situation:
    In my project I use ADEP Data Management Services. To connect to the ADEP services we use an RTMP channel defined in services-config.xml:
    <?xml version="1.0" encoding="UTF-8"?>
    <services-config>
         <services>
              <service-include file-path="remoting-config.xml" />
              <service-include file-path="proxy-config.xml" />
              <service-include file-path="messaging-config.xml" />
              <service-include file-path="data-management-config.xml" />
              <service-include file-path="managed-remoting-config.xml" />
              <service class="fiber.data.services.ModelDeploymentService" id="model-deploy-service" />
              <default-channels>
                   <channel ref="my-rtmp"/>
              </default-channels>
         </services>
         <channel-definition id="my-rtmp" class="mx.messaging.channels.RTMPChannel">
              <endpoint url="rtmp://{server.name}:1000" class="flex.messaging.endpoints.RTMPEndpoint"/>
              <properties>
                   <idle-timeout-minutes>20</idle-timeout-minutes>
                   <block-rtmpt-polling-clients>true</block-rtmpt-polling-clients>
                   <rtmpt-poll-wait-millis-on-client>0</rtmpt-poll-wait-millis-on-client>
              </properties>
         </channel-definition>
    </services-config>
    To manage data in the database we defined data services in data-management-config.xml like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <service id="data-service" class="flex.data.DataService">
        <adapters>
            <adapter-definition id="java-dao" class="flex.data.adapters.JavaAdapter"/>
            <adapter-definition id="mr-java-dao" class="flex.data.adapters.ManagedRemotingDataAdapter" />
            <adapter-definition id="actionscript" class="flex.data.adapters.ASObjectAdapter" default="true"/>
        </adapters>
        <default-channels>
            <channel ref="my-rtmp"/>
        </default-channels>
        <destination id="SomeProduct.SomeItems">
            <adapter ref="java-dao" />
            <properties>
                <source>flex.data.assemblers.SQLAssembler</source>
                <scope>application</scope>
                <metadata>
                    <identity property="ID" />
                </metadata>
                <server>
                    <database>
                        <datasource>java:comp/env/jdbc/SP</datasource>
                    </database>
                    <actionscript-class>com.somecompany.classes.SomeCoolClass</actionscript-class>
                    <create-item>
                        <procedure name="SomeItems_Insert">
                            <procedure-param property-value="#Session_ID#" />
                            <procedure-param property-value="#Division_ID#" />
                            <procedure-param property-value="#Salesrep_ID#" />
                            <procedure-param property-value="#Area_Code#" />
                            <procedure-param property-value="#Item_ID#" />
                        </procedure>
                        <id-query>SELECT IDENT_CURRENT('Work_Area_Item')</id-query>
                    </create-item>
                    <fill>
                        <name>all</name>
                        <procedure name="SomeItems_Get">
                            <procedure-param property-value="#Session_ID#" />
                            <procedure-param property-value="#Last_Sync_Time#" />
                        </procedure>
                    </fill>
                    <update-item>
                        <procedure name="SomeItems_Update">
                            <procedure-param property-value="#Session_ID#" />
                            <procedure-param property-value="#Division_ID#" />
                            <procedure-param property-value="#Salesrep_ID#" />
                            <procedure-param property-value="#Area_Code#" />
                            <procedure-param property-value="#Item_ID#" />
                       </procedure>
                    </update-item>
                    <delete-item>
                        <procedure name="SomeItems_Delete">
                            <procedure-param property-value="#Session_ID#" />
                            <procedure-param property-value="#Item_ID#" />
                        </procedure>
                    </delete-item>
                </server>
            </properties>
        </destination>
    </service>
    By default, ADEP returns results from the SomeProduct.SomeItems destination to the Flex side as an ArrayCollection of SomeCoolClass instances, but I need the data returned in simple AS3 Arrays. Recently, I found that there is a small optional serialization configuration in the channel definition that should resolve my problem. So I updated my channel-definition in services-config.xml to this:
    <channel-definition id="my-rtmp" class="mx.messaging.channels.RTMPChannel">
         <endpoint url="rtmp://{server.name}:1000" class="flex.messaging.endpoints.RTMPEndpoint"/>
         <properties>
              <serialization>
                   <legacy-collection>true</legacy-collection>
              </serialization>
              <idle-timeout-minutes>20</idle-timeout-minutes>
              <block-rtmpt-polling-clients>true</block-rtmpt-polling-clients>
              <rtmpt-poll-wait-millis-on-client>0</rtmpt-poll-wait-millis-on-client>
         </properties>
    </channel-definition>
    However, results are still returned in ArrayCollections.
    Any ideas?
    Thanks in advance

    Thom Parker answered this here: http://forums.adobe.com/message/2614570#2614570
    Answer copied below:
    "The problem is that when the focus is on the text box
    it's in edit mode. It's only displaying the value interactively entered by
    the user, or as a consequence of the change event.  What you need to do is
    force the focus off of the text box in code.  You can do a little trick
    where you bounce it to a tiny transparent field, which then bounds the focus
    back so it doesn't look like the focus changed."
    What I ended up doing was calling up the dialog box, then using setfocus with no parameters to remove focus from the field, as follows:
    this.rawValue = this.dialogBoxFunction(this.rawValue); // passing current value so dialog box defaults to that value
    xfa.host.setFocus();
    Cheers,
    Marty.

  • Saving result from query into CSV file

    Hi folks,
    in our application we're generating page source using the standard packages (like htp, owa_util, ...), and in this area I'm a real beginner.
    I want to modify the source of one of our pages to add the ability to save the result of a query (cursor) into a CSV file, letting the user choose where the generated file is created and change the file name.
    I searched this forum and found a procedure that could be useful:
    procedure p_getcsv is
    cursor cur is
           select 'a1' col1, 'b1' col2, 'c1' col3 from dual
       union  select 'a2' col1, 'b2' col2, 'c2' col3 from dual
       union  select 'a3' col1, 'b3' col2, 'c3' col3 from dual;
       begin
            -- Set the MIME type
            owa_util.mime_header( 'application/octet', FALSE );
            -- Set the name of the file
            htp.p('Content-Disposition: attachment; filename="test.csv"');
            -- Close the HTTP Header
            owa_util.http_header_close;
            -- Loop through all rows in EMP
            for x in cur
            loop
                -- Print out a portion of a row,
                -- separated by commas and ended by a CR
                 htp.prn(x.col1||','|| x.col2||','||x.col3|| chr(13));
            end loop;        
       end;
    What piece of code should I add to the procedure that generates the web page, to call this procedure and trigger the whole saving process?
    Can anybody help me with this?
    Many thanks,
    Tomas
    Message was edited by:
    Tomeo

    Hi Marc,
    thanks for the reply. The problem is that I'm not using an APEX application, I'm just generating the web page code directly using the Oracle packages.
    But I found this solution (maybe some tuning will be needed):
    In the page where I want to display the Download link I have
      begin
             htp.anchor2 (
                           curl  =>  ... .p_getcsv'||'?term=2005&crn=123,
                           ctext => 'Download Class List'
             );
             HTP.br;
          end;
    ...so I'm calling the p_getcsv procedure:
      procedure p_getcsv( term  IN stvterm.stvterm_code%TYPE DEFAULT NULL,
                           crn   IN sirasgn.sirasgn_crn%TYPE DEFAULT NULL) is
       v_length      NUMBER;
       v_file_name   VARCHAR2 (2000);
       temp_blob  blob;
       line RAW(32767);
       begin
             DBMS_LOB.CREATETEMPORARY(temp_blob, TRUE);
             FOR i IN 1..6  LOOP
                line := UTL_RAW.CAST_TO_RAW(i||','||term||','||crn||',AAA,BBB,CCC'||chr(10));
                DBMS_LOB.WRITEAPPEND(temp_blob, LENGTH(UTL_RAW.CAST_TO_VARCHAR2(line)), line);
             END LOOP;
              v_file_name := 'ClassList.csv';
              v_length  := DBMS_LOB.getlength (temp_blob);
              -- set up HTTP header
                 -- use an NVL around the mime type and
                 -- if it is a null set it to application/octect
                 -- application/octect may launch a download window from windows
               OWA_UTIL.mime_header (NVL ('csv', 'application/octet'), FALSE);
               -- set the size so the browser knows how much to download
               HTP.p ('Content-length: ' || v_length);
               -- the filename will be used by the browser if the users does a save as
               HTP.p ('Content-Disposition: attachment; filename="'
                  || REPLACE (REPLACE (SUBSTR (v_file_name,
                                               INSTR (v_file_name, '/') + 1),
                                       CHR (10),
                                       NULL),
                              CHR (13),
                              NULL)
                  || '"');
                 -- close the headers
                 OWA_UTIL.http_header_close;
                -- download the BLOB
                 WPG_DOCLOAD.download_file (temp_blob);
                 -- release temporary blob
                 dbms_lob.freetemporary(temp_blob);  
       end;
    Regards,
    Tomas

  • Copy Transferrules from one source system to another

    Hi everybody,
    we're uploading data from nearly 25 source systems (including separate systems and systems with more than one client to extract from) via flat file. The source system exports data to a shared file directory and the BW system reads it from there. Though this results in only one transfer rule for all source systems, it keeps generating problems every time an upload is due.
    Somebody had the idea to switch to the 'standard' R/3->BW method of data staging, using generic extractors and so on. But this will result in 25 transfer rules which, I'm afraid, have to be set up and maintained separately. As there's quite some coding in the rules (the transfer structure in fact has 241 fields), flaws are inevitable!
    We know the trick of exporting a transfer rule, changing the source-system mapping and re-importing the same transport request. That can be used to copy to one or two other systems, but not for 25.
    Does anybody know another way to copy a transfer rule from one source system to multiple other source systems?
    Looking forward to your answers
    Thanks and Regards
    Robert

    Hi Ralph,
    I don't like that method. Importing the same transport 15 times, each time changing the source-system mapping and making sure that each source system gets the changes, seems to be a huge source of errors that will be hard to find.
    What do you think of the following idea to solve the issue in a totally different way:
    All transfer rules directly connected to the source systems are stripped of all coding, so there is a plain 1:1 transformation including 0LOGSYS.
    The attached InfoSources all lead to one ODS object, whose only purpose is to collect the data from the source systems without any transformation. So some fields are filled by a given system and other fields maybe not.
    The source-system-specific coding is moved to the transfer/update rules from that ODS object to the original ODS objects / cubes, and the "collecting" ODS object is emptied after every load.
    This results in a single point of change. It will be quite a huge program (maybe in a start routine) but it's the one and only place where changes and corrections have to be made.
    Regards
    Robert

  • What is the best alternative to exclude results from a specific site collection

    Hi,
    We want to exclude results from a specific site collection (for example http://server/sites/demo).
    A few ways are being considered:
    1. Copy the Local SharePoint Results Result Source and edit it not to include results from that site collection and then set the new custom result source as default.
    2. Edit the Search Results WP in the "Everything" tab to exclude those results (edit the query there)
    3. Set a query rule on "Local SharePoint Results" to exclude the results from the Demo Site Collection
    What is the best?
    keren tsur

    Hi Keren,
    Modifying the Local Results XSLT will give you the most granularity, but it is also arguably more challenging because of the "editing code" portion.
    On your search page you could append a query string via the search web part, but that may not get granular enough.
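    If you do go the query route (options 1 or 2 in your list), the exclusion is typically written as a -path restriction in the query transform of a custom result source. A rough PowerShell sketch, reusing the FederationManager pattern; the result source name here is a placeholder, and the provider display name depends on the farm language:
    if(-not (Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue)){Add-PSSnapin "Microsoft.SharePoint.PowerShell"}
    $ssa   = Get-SPEnterpriseSearchServiceApplication
    $owner = Get-SPEnterpriseSearchOwner -Level Ssa
    $fm    = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)
    $rs = $fm.CreateSource($owner)
    $rs.Name = "Local SharePoint Results without Demo"    # placeholder name
    $rs.ProviderId = $fm.ListProviders()["Local SharePoint Provider"].Id    # display name varies with the farm language
    $rs.CreateQueryTransform('{searchTerms} -path:"http://server/sites/demo"')
    $rs.Commit()
    Setting that source as the default (or pointing the Search Results web part at it) should then keep the Demo site collection out of the results.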
    In the past I have used search scopes to exclude items from search results. Depending on the specific requirement it may be a quick and easy way to accomplish your task. It really depends on how granular you need to get with excluding items from results.
    A search scope can be setup be going to:
    1. Site Actions (of the site collection you want to apply this to)
    2. Click on Site Settings
    3. Under site Collection Administration ( you have to be a site collection admin or farm admin to be able to see this section)
    4. click on "Search Scopes"
    5. Click on New Scope
    6. Name it, fill out any other options you want and click "ok"
    7. Click on your newly created search scope.
    8. click on "new rule"
    9. Fill out the fields based on your requirements and then click "OK"
    10. I believe you will want to wait for a crawl to run before this scope will return any results.
    Please let me know if you have any other questions.
    Good Luck!
    Alex
