PCD search for delta links

Hi all,
I'm trying to look for delta link pcd objects in my portal.
I have code that works for iViews, pages, roles & worksets, for example:
NamingEnumeration ne = dirCtx.search("", "(com.sap.portal.pcd.gl.ObjectClass=com.sapportals.portal.iview)", pcdSearchControls);
The following example shows how to read the properties of ONE delta link:
http://help.sap.com/saphelp_nw04/helpdata/en/44/703fd42043053ce10000000a1553f6/content.htm
I want to run this code for all delta links in the portal.
How can I search for ALL delta links in code?
I'm using 7.00.17.
Regards,
Omri

Hi
use your existing code to fetch iViews/pages/...,
go through them and check whether each one is a delta link or not:
If the object is not a delta link or inherited via a delta link, getDlModificationState() returns null.
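Something like this rough sketch (untested on 7.00.17): it just reuses the search you already have, loops over the object classes and checks every hit with getDlModificationState(). "IDlObject" is only a placeholder name for whatever delta-link interface the help page you linked casts to, so use the same cast as in that example; and if getObject() comes back null in your environment, do a dirCtx.lookup() on the result name instead:

// reuse the dirCtx and pcdSearchControls from your working iView search
pcdSearchControls.setReturningObjFlag(true); // let the enumeration hand back the bound objects

String[] objectClasses = {
    "com.sapportals.portal.iview",
    "com.sapportals.portal.page"   // add the values you already use for roles and worksets
};

for (int i = 0; i < objectClasses.length; i++) {
    NamingEnumeration ne = dirCtx.search("",
        "(com.sap.portal.pcd.gl.ObjectClass=" + objectClasses[i] + ")",
        pcdSearchControls);
    while (ne.hasMoreElements()) {
        SearchResult sr = (SearchResult) ne.nextElement();
        Object bound = sr.getObject();
        if (bound instanceof IDlObject) { // placeholder cast, see the help page example
            // null = not a delta link; anything else = delta link or inherited via one
            if (((IDlObject) bound).getDlModificationState() != null) {
                System.out.println("Delta link: " + sr.getName());
            }
        }
    }
}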
regards
Johannes

Similar Messages

  • Search for Missing Links in This Folder option not working

    I'm running InDesign CC 9.2.1, though this problem has persisted throughout the last few years and previous versions of InDesign. My operating system is OS 10.9.3. In early versions of InDesign, you could check the "Search for Missing Links in This Folder" option box when re-linking a missing link and the missing image would be highlighted in the finder window of the new image location. I did not keep track of what version of InDesign it was when this function ceased to work properly, but I would like to see if there is something I can do to make it work again.
    The "Search for Missing Links in This Folder" check box is still in the finder window, and is always checked, so in theory it should highlight the missing link/s in the folder it is searching in, but it does not. The example below illustrates my point. The InDesign document that I used to illustrate this was opened up and there were four missing links. When I select the missing link/s and hit the relink button at the bottom of the links palate, I navigate to the correct location of the link but the image doesn't get selected even though the "Search for Missing Links" is checked. The missing link it should have highlighted was 431-11 frontal.psd, which clearly exists at the top of the "Front Street" folder.
    This isn't a big deal when the folder contains four images. But when it has hundreds of links, it is a pain to try to find all of the links that are missing when the file name is often truncated, or it has an odd string of only numbers, etc. Just checking to see if anyone has any insight to this issue.

    I think the yellow alert triangle next to each link name means that you just have to update those links by hitting the refresh button at the bottom of the Links panel once all links are selected (it's the two arrows in a circle).

  • Search for broken links to images

    Is there an efficient way to search for broken links to images?

    Hi there
    I'm not sure if you mean links inside topics that point to images you are popping up or linking to outright, or if instead you mean that some of the images that are supposed to be displayed inside topics may be missing and you wish to know about them.
    One easy way to spot a missing image reference is by using the Project Manager. I'm not sure whether you are using RoboHelp 7, 8 or older here, but on 7 or 8 look in the Project Manager pod; otherwise it will be found on the left panel of the application. Expand the Images pseudo folder and quickly scan the list of images, expanding the sub folders inside. Any missing image should be listed with a red X to indicate its status. You may then examine the image properties to see which topic(s) is/are still referencing it.
    Cheers... Rick

  • "Search for missing links in this folder" doesn´t work anymore

    Just upgraded to CS6 for Mac on Snow Leopard and the Search for Missing links function no longer works.  Does anyone have any idea of how to fix it?

    Is it set in your prefs? It's optional....

  • "Search for Jobs" link not working in E-Recruitment

    Hi,
    Whenever an employee or external candidate clicks on the "Search for Jobs" link on the start page, the list of jobs is not displayed even though data is available.

    Hi,
    Go to SPRO > Personnel Management > Employee Self-Services > Homepage Framework > Resources > Edit Resources,
    select the resource for Search for Jobs and check whether the URL path is given.
    Thanks
    Sunitha

  • TREX search for ArchiveLink OAAD or OAWD documents

    Hi SAP Gurus,
    Can TREX search the originals of documents that are stored via the OAAD or OAWD transactions?
    If so, how do I activate the TREX original-document search for documents stored through OAAD or OAWD?
    Thanks and regards
    J Sari

    Dear Athol,
    We are using the SAP Content Server to store our documents from GOS, and the same Content Server is also used for maintaining our documents through DMS.
    So, as standard, TREX cannot search documents maintained in the Content Server via the GOS method; it can only search documents maintained via the DMS CV01N method.
    Please suggest how to proceed with a development that allows TREX to search all the documents in the Content Server. How big is the development, and is there an SAP-suggested solution to be followed?
    Thanks and Regards
    J Sari

  • Searching for jDeveloper link

    Hi,
    All,
    I am new to OAF. Please can anyone send me the link from which I can download this OAF development kit?
    Thanks
    Surya

    Hi, Gyan,
    I have dropped a mail to the above ID; please do the needful on my request.
    Thanks
    Surya

  • BI reports not opening using remote delta links in the local PCD

    Hi All,
    I have connected BI to the portal and the BI reports work fine in BI, but when I copy the reports into the local PCD using remote delta links, it shows the error "500 Internal Server Error" and "no ESID found".
    Thanks & Regards
    Mary.

    Hi,
    Could you try setting the log level to ALL for com.sap.portal.fpn and its subtree (from the Log Configurator service in the Visual Administrator)?
    Then check the log messages in the default.trc logs and, if possible, post them to this thread.
    Regards
    Dagfinn

  • Searching for links across multiple pdf files

    We have thousands of pdf files that are being moved to a new website. Some of these pdf files have links within them (either as text or as a hyperlink). This number is unknown.
    The issue is how to programmatically search across multiple PDF files (numbering in the thousands) looking for links, using a regular expression or part of a path. The search will have to be able to look behind the display text and find the link URL itself.
    We first need to identify the number of files with links and create a list of the files with links that need modifying. If the number is too great to modify manually, then we would need the ability to programmatically edit these links.
    The pdf files are stored in a database. Also, the pdf files are different versions and some are password protected.
    Is there an Adobe product that will perform this? If not, are there any 3rd party vendor products that will accomplish this?
    Thanks in advance for your help.

    I have no solution, but a thought: the database factor may seem to be a killer, but you could look for a solution designed to read PDF files from a web site (by spidering or from a list), which would presumably load them.
    Or you could do a one-off extraction of the files from the database into a directory and use that for your process. That is probably a very good idea, since extracting all files from the database is likely to be costly and hammer the server (but it can be scheduled at a sensible pace), while the search process will (if it is possible at all) doubtless need to be run countless times.
    Aandi Inston
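    If the files do get extracted to a directory as suggested above, the scan itself could be scripted with a third-party open-source library such as Apache PDFBox (not an Adobe product). A rough, untested sketch: it only reports link annotations, so URLs that appear as plain text would additionally need text extraction, and password-protected files would need PDDocument.load(file, password):

    import java.io.File;
    import java.io.IOException;
    import org.apache.pdfbox.pdmodel.PDDocument;
    import org.apache.pdfbox.pdmodel.PDPage;
    import org.apache.pdfbox.pdmodel.interactive.action.PDActionURI;
    import org.apache.pdfbox.pdmodel.interactive.annotation.PDAnnotation;
    import org.apache.pdfbox.pdmodel.interactive.annotation.PDAnnotationLink;

    public class PdfLinkScanner {
        public static void main(String[] args) {
            File dir = new File(args[0]);      // directory the PDFs were extracted to
            String needle = args[1];           // part of a path / URL to look for
            File[] files = dir.listFiles((d, name) -> name.toLowerCase().endsWith(".pdf"));
            if (files == null) {
                System.err.println("Not a directory: " + dir);
                return;
            }
            for (File f : files) {
                try (PDDocument doc = PDDocument.load(f)) {
                    for (PDPage page : doc.getPages()) {
                        for (PDAnnotation ann : page.getAnnotations()) {
                            if (ann instanceof PDAnnotationLink
                                    && ((PDAnnotationLink) ann).getAction() instanceof PDActionURI) {
                                String uri = ((PDActionURI) ((PDAnnotationLink) ann).getAction()).getURI();
                                if (uri != null && uri.contains(needle)) {
                                    System.out.println(f.getName() + " -> " + uri);
                                }
                            }
                        }
                    }
                } catch (IOException e) {
                    // e.g. a password-protected file opened without its password
                    System.err.println("Could not read " + f.getName() + ": " + e.getMessage());
                }
            }
        }
    }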

  • Bash script for checking link status

    So I'm doing some SEO work; I've been tasked with checking a couple hundred thousand back links for quality. I found myself spending a lot of time navigating to sites that no longer exist, so I figured I would make a bash script that checks whether the links are active first. The problem is my script is slower than evolution. I'm no bash guru or anything, so I figured I'd see if there are some optimizations you folks can think of. Here is what I am working with:
    #!/bin/bash
    while read line
    do
    #pull page source and grep for domain
    curl -s "$line" | grep "example.com"
    if [[ $? -eq 0 ]]
    then
    echo \"$line\",\"link active\" >> csv.csv else
    echo \"$line\",\"REMOVED\" >> csv.csv
    fi
    done < <(cat links.txt)
    Can you guys think of another way of doing this that might be quicker?  I realize the bottleneck is curl (as well as the speed of the remote server/dns servers) and that there isn't really a way around that.  Is there another tool or technique I could use within my script to speed up this process?
    I will still have to go through the active links one by one and analyze by hand but I don't think there is a good way of doing this programmatically that wouldn't consume more time than doing it by hand (if it's even possible).
    Thanks

    I know it's been a while, but I've found myself in need of this yet again. I've modified Xyne's script a little to work more consistently with my data. The problem I'm running into now is that urllib doesn't accept IRIs. No surprise there, I suppose, as it's urllib and not irilib. Does anyone know of any libraries that will convert an IRI to a URI? Here is the code I am working with:
    #!/usr/bin/env python3
    from threading import Thread
    from queue import Queue
    from urllib.request import Request, urlopen
    from urllib.error import URLError
    import csv
    import sys
    import argparse
    parser = argparse.ArgumentParser(description='Check list of URLs for existence of link in html')
    parser.add_argument('-d','--domain', help='The domain you would like to search for a link to', required=True)
    parser.add_argument('-i','--input', help='Text file with list of URLs to check', required=True)
    parser.add_argument('-o','--output', help='Named of csv to output results to', required=True)
    parser.add_argument('-v','--verbose', help='Display URLs and statuses in the terminal', required=False, action='store_true')
    ARGS = vars(parser.parse_args())
    INFILE = ARGS['input']
    OUTFILE = ARGS['output']
    DOMAIN = ARGS['domain']
    REMOVED = 'REMOVED'
    EXISTS = 'EXISTS'
    NUMBER_OF_WORKERS = 50
    #Workers
    def worker(input_queue, output_queue):
        while True:
            url = input_queue.get()
            if url is None:
                input_queue.task_done()
                input_queue.put(None)
                break
            request = Request(url)
            try:
                response = urlopen(request)
                html = str(response.read())
                if DOMAIN in html:
                    status = EXISTS
                else:
                    status = REMOVED
            except URLError:
                status = REMOVED
            output_queue.put((url, status))
            input_queue.task_done()
    #Queues
    input_queue = Queue()
    output_queue = Queue()
    #Create threads
    for i in range(NUMBER_OF_WORKERS):
        t = Thread(target=worker, args=(input_queue, output_queue))
        t.daemon = True
        t.start()
    #Populate input queue
    number_of_urls = 0
    with open(INFILE, 'r') as f:
        for line in f:
            input_queue.put(line.strip())
            number_of_urls += 1
    input_queue.put(None)
    #Write URL and Status to csv file
    with open(OUTFILE, 'a') as f:
        c = csv.writer(f, delimiter=',', quotechar='"')
        for i in range(number_of_urls):
            url, status = output_queue.get()
            if ARGS['verbose']:
                print('{}\n {}'.format(url, status))
            c.writerow((url, status))
            output_queue.task_done()
    input_queue.get()
    input_queue.task_done()
    input_queue.join()
    output_queue.join()
    The problem seems to be when I use urlopen
    response = urlopen(request)
    with a URL like http://www.rafo.co.il/בר-פאלי-p1
    urlopen fails with an error like this:
    Exception in thread Thread-19:
    Traceback (most recent call last):
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/threading.py", line 639, in _bootstrap_inner
    self.run()
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/threading.py", line 596, in run
    self._target(*self._args, **self._kwargs)
    File "./linkcheck.py", line 35, in worker
    response = urlopen(request)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/urllib/request.py", line 160, in urlopen
    return opener.open(url, data, timeout)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/urllib/request.py", line 473, in open
    response = self._open(req, data)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/urllib/request.py", line 491, in _open
    '_open', req)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/urllib/request.py", line 451, in _call_chain
    result = func(*args)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/urllib/request.py", line 1272, in http_open
    return self.do_open(http.client.HTTPConnection, req)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/urllib/request.py", line 1252, in do_open
    h.request(req.get_method(), req.selector, req.data, headers)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/http/client.py", line 1049, in request
    self._send_request(method, url, body, headers)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/http/client.py", line 1077, in _send_request
    self.putrequest(method, url, **skips)
    File "/usr/local/Cellar/python3/3.3.0/Frameworks/Python.framework/Versions/3.3/lib/python3.3/http/client.py", line 941, in putrequest
    self._output(request.encode('ascii'))
    UnicodeEncodeError: 'ascii' codec can't encode characters in position 5-8: ordinal not in range(128)
    I'm not too familiar with how character encoding works so I'm not sure where to start.  What would be a quick and dirty way (if one exists) to get URLs like this to play nicely with python's urlopen?

  • Code to search for URL

    Hi all, I'm using code to load in text created in Notepad, which is:
    loadVariablesNum("j:\\Uni work\\Resume_Website\\loads\\SiteFiles\\TextFiles\\Linker.txt", 0);
    but the finished product will have to be transported to another computer, probably on a hard drive or DVD, and obviously when I transport the Flash file, any published .swf or HTML files will try to load the text file "Linker.txt" using the code above but won't be successful because the .txt file's URL will have changed.
    How can I change the code so that loadVariablesNum will search for Linker.txt wherever it is? Or is there an alternative to the method I'm using that would be more effective?

    Just to explain how to use relative paths: say you have your Flash project in the ABC directory, and in that same directory you have your Linker.txt. In Flash, you would only need "Linker.txt". However, if the Linker.txt file were within a folder (DEF) located within ABC, you would use "DEF/Linker.txt". Basically, you are giving the path from the .swf. And if I recall correctly, you can use ../ to look in the directory above.

  • When I search for a website it takes me to another website. I will click on the link and it takes me to one I've never wanted. Please help, very frustrating! I am not very computer savvy, is it my settings or something? Jen

    When I search for a website it takes me to another website. I will click on a link and it takes me to something completely different, or yellow pages, etc. Please help, I am very frustrated and not very computer savvy! Thanks! Jen

    Hi Jeff, I have uninstalled Muse from my applications and have tried to download the new version, but it displays the message 'file not found'. I have included a screenshot.

  • Sharepoint 2013 - Search for link names on a single page

    Hi
    I am messing around with SP13 and trying to get my search result web part to only search for link names and display all results.
    I have created the search source to point at the sub site. If I search for part of a link name it will list all links in one long row as a single result.
    Example: One single page contains all links. Each link points to a .aspx page. The search source is pointing at this page and it can display results but only as a single result containing all links as one line of text.
    Say I have 1000+ links of which 50 has names such as Link1-Item1-Basket1, Link1-Item1-Basket2 up to Link1-Item1-Basket50.
    When I search for Link1-Item1 I would like my search results to display 50 results as 50 individual links.
    Is this possible?

    Hi,
    Yes, you can format the search results as you expect, but you have to create a custom display template to format your results, for example to extract the links from each result and display them as separate links.
    Once the display template is created, you have to map it to your search results web part.
    Please refer to the following articles:
    SharePoint 2013 Customize Display Template for Content By Search Web Part (CSWP) Part-1
    Introduction to SharePoint 2013 Display Templates
    Display Template Overview
    Please mark it as answered if your problem is resolved.

  • Skipping the search for linked images

    Hi all
    My InDesign document has 250 images. I have moved the InDesign document to a different server, so all the image links are now missing. While the document is opening, InDesign searches for the image links, and this process makes opening the document take much longer.
    I want to skip the search for the linked files and just open the document; I will relink them afterwards.
    Is it possible to skip the search for linked images while opening the InDesign file?
    Thanks in advance
    Regards
    arul

    That preference can be changed. Edit>preferences>file linking (Windows) or InDesign>preferences>file linking (Mac)
    Bob

  • Searching for Links in km Folder

    Hi Gurus,
    I tried to search for links in a KM folder, but TREX didn't find them.
    Is it possible to configure TREX so that links can be found? If yes, how does it work?
    Thank you

    Hi,
    there is a custom property you have to add during index creation, called IndexInternalLinks, and you have to set its value to true.
    In this way the link itself is indexed, not the target.
    Have a look here as well:
    http://help.sap.com/saphelp_nw70/helpdata/en/73/66c090acf611d5993700508b6b8b11/frameset.htm
    Regards,
    Sascha
