Do we have any agent/web crawler

Hi,
Do we have any agent/web crawler kind of tool that runs periodically and sends the results to the user?
Thanks,
hari

Ultra Search is an application built on top of Oracle Text that includes a crawler.
More information at: http://otn.oracle.com/products/ultrasearch/content.html
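    Outside of Ultra Search, the kind of job the question describes (crawl a page periodically, send the result to the user) can be sketched in a few lines. The URL, interval, and notify() target below are placeholders, not any product's API:

```python
# A minimal sketch, outside any particular product, of a job that fetches a
# page periodically and reports the result to a user. notify() is a stand-in
# for a real mail integration.
import time
import urllib.request

def fetch(url):
    """Download a page and return its text (assumes UTF-8)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def notify(user, summary):
    """Stand-in for e-mailing the result to the user."""
    print(f"to {user}: {summary}")

def run_periodically(url, user, interval_seconds, iterations,
                     fetch_fn=fetch, notify_fn=notify):
    """Fetch the page every interval and send the user a short summary."""
    for _ in range(iterations):
        page = fetch_fn(url)
        notify_fn(user, f"{url}: {len(page)} bytes fetched")
        time.sleep(interval_seconds)
```

    In practice a scheduler such as cron would replace the sleep loop.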

Similar Messages

  • Do you have any html5 web app examples (not using javascript)?

    Do you have an HTML5 web app that uses only HTML5 and CSS (and NOT JavaScript)? I'd just like to see what other people are making, because I don't understand how you could make an interactive web app without JavaScript (and I don't want to learn JavaScript). I'd like to see what such a web app looks like.

    HTML on its own can't do anything smart. You need JavaScript to know when someone has interacted with the page, for example clicked on an image or entered text into a form field. You're going to need JavaScript to create any kind of app with HTML5 and CSS, and interactive features such as geolocation also require it. It's not hard to learn. I would suggest starting with a library like jQuery: it uses CSS-style selectors to let you pick out elements and regions on a page so you can manipulate them. It's built on JavaScript but a bit easier to learn, it makes things like loops and chained functions easy, and its syntax looks very CSS-like, so if you come from a design background it may suit you better. You simply download jQuery from http://jquery.com, store it in a folder with the site, and link to it. It's straightforward.

  • Web pages are very slow to load airport is fine and other users do not have any problems

    My MacBook Pro is very slow to load web pages. AirPort is running fine and other users don't have any issues.

    Likely a corrupted cache; do #12 and #13 here, and if that doesn't work, run through Steps 1+:
    Step by Step to fix your Mac
    Why is my computer slow?

  • I downloaded the Web Developer toolbar and after using it my images are gone. I use Google as my home page and the big GOOGLE logo is gone. I uninstalled Firefox and reinstalled it, but I still don't have any images. How can I get them back?

    Two days ago I downloaded the Web Developer toolbar (aus2.mozilla.com). I tried to use it but gave up, so I deleted the toolbar.
    I don't have any images anymore. I use Google.com as my home page (default), and the large GOOGLE logo isn't there anymore. I've also noticed some of the sites I use no longer have the Continue, Next, or Enter buttons. It seems the only thing I have now is the text.

    Are you sure you are looking at the Google page? In Firefox 4.0 the default home page / start page was changed to '''about:home''', which looks a lot like the former www.google.com/firefox home page.

  • I have one problem and I think it is a settings issue. When I google the web I get results. Now, when I visit some website and return to my search results I don't have any markers (a different link colour for visited sites, as in IE).

    Question
    I have one problem and I think it is a settings issue. When I google the web I get results. Now, when I visit some website and then return to my search results to pick another result, there is a problem: I don't have any markers (a different link colour for visited sites, as in IE) to help me distinguish visited from unvisited sites.

    Why start a new thread so similar to your other one, which you have not responded to (have you read the replies?)?
    I suggest that no response is made to this duplicate thread.

  • I am getting a new computer and it will not have any web browser software preloaded. Can I get a CD with Firefox in order to install?

    I have a new computer that will not have any web browser installed when I acquire it, so I cannot download Firefox using, say, Internet Explorer. Can I get a copy of Firefox on a CD and use that instead?

    Easiest is to download the full Firefox installer and copy that file to a USB stick.
    http://www.mozilla.com/firefox/all.html
    Such a USB stick is also useful for making backups of your profile data, so if you do not already have one, this is an opportunity to start making backups.
    http://kb.mozillazine.org/Profile_backup and [[Backing up your information]]
    http://kb.mozillazine.org/Backing_up_and_restoring_bookmarks_-_Firefox
    http://kb.mozillazine.org/Password_Manager#Backing_up_and_restoring_passwords

  • Does OC4J have any web console for an administrator?

    Does OC4J have any web console for an administrator?
    If it does, on which URL is it?

    For versions earlier than 9.0.4, we do not have a console for OC4J standalone.
    Which version are you using? For the 10.0.3 Developer Preview we have one for developers; try http://hostname:8888/adminoc4j/
    Our production release will have a nice management console.
    -Debu

  • Does anyone have CTI and web client integration knowledge?

    Hi all,
    Does anyone have CTI and web client integration knowledge?
    Points will be rewarded. Please mail me: [email protected]
    thanks,
    regards,
    madhukarreddy

    Hi,
    I have sent the relevant document for CTI integration with IC WebClient to [email protected]
    Check it.
    Please do reward points if helpful.
    Dinaker vikas

  • Error while generating Static web pages hierarchy for Web Crawler searching

    Hi All,
    When we try to generate static web pages for Web Crawler searching for our B2C application, it results in an error:
    "Unable to initiate generation of static pages; check logs"
    After tracing the log files we found the detailed information below:
    application [catalogtool] Error in method getResourceAsStream(WEB-INF/cfg/catalog-site-config.xml). The error is: com.sap.engine.services.servlets_jsp.server.exceptions.WebMalformedURLException: A resource path must begin with [/]. The error occurred in [].
    at com.sap.engine.services.servlets_jsp.server.runtime.context.ServletContextImpl.getResource(ServletContextImpl.java:452)
    at com.sap.engine.services.servlets_jsp.server.runtime.context.ServletContextImpl.getResourceAsStream(ServletContextImpl.java:481)
    at com.sap.isa.core.ActionServlet.getResourceAsStream(ActionServlet.java:297)
    at com.sap.isa.catalog.impl.CatalogSite.loadConfigFile(CatalogSite.java:666)
    at com.sap.isa.catalog.impl.CatalogSite.initBackendObject(CatalogSite.java:209)
    at com.sap.isa.core.eai.BackendObjectManagerImpl.createBackendBusinessObject(BackendObjectManagerImpl.java:190)
    at com.sap.isa.catalog.CatalogFactory.getCatalogSite(CatalogFactory.java:261)
    at com.sap.isa.catalog.webcatalog.WebCatInfo.init(WebCatInfo.java:237)
    at com.sap.isa.businessobject.webcatalog.CatalogBusinessObjectManager.createCatalog(CatalogBusinessObjectManager.java:103)
    at com.sap.isa.catadmin.actions.WebCrawlerAction.doPerform(WebCrawlerAction.java:93)
    at com.sap.isa.core.BaseAction.execute(BaseAction.java:212)
    at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:484)
    at com.sap.isa.core.RequestProcessor.processActionPerform(RequestProcessor.java:683)
    at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:274)
    at com.sap.isa.core.RequestProcessor.process(RequestProcessor.java:400)
    at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1482)
    at com.sap.isa.core.ActionServlet.process(ActionServlet.java:243)
    at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:525)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at com.sap.engine.services.servlets_jsp.server.runtime.RequestDispatcherImpl.doWork(RequestDispatcherImpl.java:321)
    at com.sap.engine.services.servlets_jsp.server.runtime.RequestDispatcherImpl.forward(RequestDispatcherImpl.java:377)
    at org.apache.struts.action.RequestProcessor.doForward(RequestProcessor.java:1069)
    at org.apache.struts.action.RequestProcessor.processForwardConfig(RequestProcessor.java:455)
    at com.sap.isa.core.RequestProcessor.processForwardConfig(RequestProcessor.java:276)
    at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:279)
    at com.sap.isa.core.RequestProcessor.process(RequestProcessor.java:400)
    at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1482)
    at com.sap.isa.core.ActionServlet.process(ActionServlet.java:243)
    at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:525)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
    at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:401)
    at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:266)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:387)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:365)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:944)
    at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:266)
    at com.sap.engine.services.httpserver.server.Client.handle(Client.java:95)
    at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:175)
    at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
    at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
    at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
    at java.security.AccessController.doPrivileged(Native Method)
    at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:100)
    at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:170)
    Can anyone assist as to what is going wrong?
    I also have a doubt about specifying the parameters "dumpFolder" and "templateFolder" in the catalogtool XCM settings. We need to pass the full path of the server directory here; please suggest what exactly has to be passed.
    Currently I am passing values like "/catalog/dump" and "/catalog/templates".
    We are running on CRM 7.0.
    Thanks in advance,
    Pankaj.

    Any updates, friends?

  • Linux Monitor Agent Web Console Preferences Won't Save!

    Have recently installed GW Monitor on a Linux box and all seems fine except for the following:
    Within the Monitor Agent web console I browse to 'Preferences' and then select 'Setup' to view details on the 'HTTP Settings', 'SNMP Settings', etc.
    From here all my settings are configured correctly, except that the text boxes under the 'Notify' and 'HTTP Settings' headings refuse to save any details entered into them, even after clicking the 'Submit' button. Bizarre!?
    Any ideas would be really appreciated.
    Cheers, James

    jim-bob,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://forums.novell.com)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://forums.novell.com/faq.php
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • HP Network Printer periodically prints a page from a web crawler

    I support an HP Color LaserJet CP2025dn which occasionally prints a page that says:
    GET http://www.baidu.com/ HTTP/1.1
    Host: www.baidu.com
    Accept: */*
    Pragma: no-cache
    User-Agent:
    Sometimes the GET HTTP/1.1 statement is from http://www.sciencedirect.com.
    I'm guessing that this is caused by some sort of web crawler that hits port 9100 at this printer's network address. Is there any setting on the HP Color LaserJet CP2025dn that I might change to stop these pages from printing?
    [ I know that blocking this port at a router may solve the issue, but I'm looking for a solution that fixes the issue at the printer, thank you. ]
    Thanks for your help!
    -Ken

    Waynely,
    Simple Explanation:
    Internet users, likely Chinese, scan addresses (computers, or other devices that are networked) on the Internet that accept certain connections. They do this to probe for vulnerabilities to exploit, or to route traffic through machines as a “proxy.” Being a proxy means loading and sending information at the request of another. This can be used to bypass local security, such as China’s severe Internet censorship. It can also be used to hide traces back to the source, such as when a criminal doesn’t want to be located when hacking.
    The printers are set up such that they can communicate outside the local network. This allows printing from other networks, but can be located by Internet users doing scans. Printers appear to these scans as a possible proxy. The printed pages are the commands sent as a test of possible proxies for use. The command means “tell this device to load this website and report back.” The printer interprets the request as a print job and prints out the given command.
    This needs a fix, but it is not critical. There is a risk of wasted supplies, and the pages are a nuisance. There is a smaller risk of a printer vulnerability being exploited for more serious use, but that would be a complex and targeted attack rather than a broad scan.
    Possible Solutions:
    1) Block this traffic with a firewall
    2) Block this traffic with printer access controls
    3) Update the firmware on the printer to block this
    4) Change the network configuration to not be accessible from the Internet.
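    The explanation above can be demonstrated locally: a JetDirect-style port (9100 by default) treats whatever raw bytes arrive as print data. As a minimal sketch, this stands in a local socket for the printer and shows the probe's GET request arriving as plain text (the port is ephemeral here, and the request line just mirrors the one on the printed page):

```python
# Demonstrates why a proxy probe prints: a raw-socket listener (standing in
# for the printer's port 9100) receives the HTTP request as plain text.
import socket
import threading

def printer_stub(server_sock, received):
    """Accept one connection and record whatever bytes arrive, the way a
    JetDirect port prints any raw data it is sent."""
    conn, _ = server_sock.accept()
    with conn:
        received.append(conn.recv(4096).decode("ascii", errors="replace"))

def send_probe(host, port):
    """Mimic the scanner: open a TCP connection and send a proxy-style GET."""
    with socket.create_connection((host, port)) as s:
        s.sendall(b"GET http://www.baidu.com/ HTTP/1.1\r\n"
                  b"Host: www.baidu.com\r\n\r\n")

def run_demo():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))          # ephemeral port instead of 9100
    server.listen(1)
    received = []
    t = threading.Thread(target=printer_stub, args=(server, received))
    t.start()
    send_probe("127.0.0.1", server.getsockname()[1])
    t.join()
    server.close()
    return received[0]                     # the text the "printer" would print
```

    The text returned by run_demo() is exactly what the printer renders on paper, which is why blocking the traffic (solutions 1, 2, and 4 above) stops the pages.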

  • Trouble with Web Crawler

    Hi all,
    I am trying to run the crawler application available on Sun's website at the URL: http://java.sun.com/developer/technicalArticles/ThirdParty/WebCrawler/#demo
    The problem is that it does not return any results, and I have no clue why this is happening. I have turned off the firewall, etc. Would toggling the robotSafe feature be of help? Any help with this will be much appreciated.
    On the same note, I want to use the API of an existing web crawler. Does anyone have something good and easy in mind? I don't plan to use any fancy features.
    Thanks,
    Harsh.

    Most likely caused by a corrupt cookie.
    Delete all cookies for that website, and also delete your bookmark for it. Close Safari.
    Re-open Safari and type in the URL for that site. This will create a fresh cookie.
    Any better?
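    On the second question, the core of a simple crawler without fancy features is just a breadth-first loop: fetch a page, extract its links, and queue unseen ones up to a depth limit. A minimal sketch follows; the fetch function is pluggable so the traversal logic can be tried without network access, and the regex link extraction is deliberately crude:

```python
# A minimal breadth-first crawler sketch. fetch_fn returns a page's HTML;
# plugging in a stub lets the traversal run without network access.
import re
from collections import deque

LINK_RE = re.compile(r'href="([^"]+)"', re.IGNORECASE)

def crawl(start_url, fetch_fn, max_depth=2):
    """Return the set of URLs reachable from start_url within max_depth."""
    seen = {start_url}
    queue = deque([(start_url, 0)])
    while queue:
        url, depth = queue.popleft()
        if depth >= max_depth:
            continue
        try:
            html = fetch_fn(url)
        except Exception:
            continue                  # unreachable page: skip, don't abort
        for link in LINK_RE.findall(html):
            if link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return seen
```

    For real pages an HTML parser is more robust than a regex, and a robots.txt check (the robotSafe idea above) belongs in the loop before each fetch.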

  • Web Crawler Proxy settings

    Hi,
    I am trying to do a web crawl of some internet sites and am providing the required proxy properties, as well as crawl.scope.mode set to ANY, in site.xml.
    When I start the crawl it tries to fetch the seed URLs but is not able to fetch any of them, and it completes after retrying the URLs up to the crawl depth. The same thing happens on my development box (UNIX) and locally (Windows). Please help me out as soon as possible; I am stuck with a deadline and am not able to find any information on this in any other forums or even in the archives here.
    Thanks & Regards,
    Karthik V

    Hi Pankaj,
    I have checked it: I am able to reach the site using wget siteURL and get a response.
    It is also not redirecting to https. The major problem is that if I try crawling without the proxy server I am able to crawl, but if I connect through the proxy all the sites are blocked.
    Thanks & Regards,
    Karthik V
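    As a general illustration of what explicit proxy settings look like in code (this is Python's urllib, not the crawler product's own configuration, and the proxy host and port are placeholders), a minimal fetch through the same proxy can help isolate whether the proxy itself is rejecting the crawler's requests:

```python
# General illustration of explicit proxy configuration for HTTP fetches;
# the proxy host/port here are placeholders, not real settings.
import urllib.request

def make_opener(proxy_url):
    """Build an opener that routes http/https traffic through one proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy_url,
                                           "https": proxy_url})
    return urllib.request.build_opener(handler)

opener = make_opener("http://proxy.example.com:8080")
```

    If opener.open(url) fails for the same seed URLs, the problem sits between the proxy and the target sites rather than in the crawl configuration.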

  • NAC Agent Web: Your Login session Failed { status = 5 }

    Hi,
    I have a problem with the NAC web agent; has anyone seen this error before?
    Your Login session Failed  { status = 5 }
    I tested all of the following, and all are OK:
    • Tested with another browser, Firefox for example
    • Tested with another operating system
    • Checked whether there are any restrictions between the user VLAN and the NAC VLANs
    Thanks

    Hi.
    Can you paste all the ACLs on your switch, especially the webauth redirect ACL, which should deny traffic towards the PSN?
    regards
    Zubair

  • Web Crawler development or 3rd Party web crawler integration in portal

    Hi Experts,
    I am working on eCommerce applications based on SAP CRM. The Web Crawler in XCM is not working as expected for things like publishing content to search engines and indexing product or catalog information.
    Is there any way we can integrate a 3rd-party web crawler with the SAP CRM or XCM application?
    Please let me know your inputs at the earliest.
    Thanks,
    Madhan

    Hi Madhan,
    we're facing the same issue; do you have any hints about it?
    many thanks
    BR
    Roberto
