SCMS for file manipulation (Knowledge Provider) documentation

Hi,
Do you have any code sample or documentation on the SCMS calls/API to manipulate files on the content server?
Sincerely,
Olivier Matt

Hi Olivier,
What exactly do you need?
Best regards
Torsten

Similar Messages

  • Name for the edit.template files channel or provider?

    Hi,
    I'd like to modify the default error page that shows the text:
    " A serious error has occured in the Desktop. This may have been caused by a mis-configuration on the server.
    Please report this problem to your administrator."
    I know where the files are located:
    <sunwps>/desktop/default/Error
    But the display profile document that supports these files is not located there. Does anyone know the name of the provider or channel that backs these template files?

    The provider is called ErrorProvider, and a reference to this provider is hard-coded into the DesktopServlet, i.e., there is no definition for this provider in the display profile. This is a special case so that errors can be handled even if there is no display profile.

  • How to Provide search Help for files on Application Server

    Hi Guys,
    Can anyone tell me how to provide a search help for files on the application server? I have a file name field on the selection screen and want to attach a search help that lists the files on the application server.
    Thanks & Regards.
    Harish.

    Hi Harish,
    Use the following code,
    TABLES sxpgcotabe.

    " Selection-screen field for the directory path (adjust the length to your directory)
    PARAMETERS p_filepath(60) TYPE c.

    DATA: lt_execprot LIKE btcxpm OCCURS 0 WITH HEADER LINE,
          w_nolines   TYPE i,
          BEGIN OF t_dirline OCCURS 0,    "one directory-listing line, split at spaces
            data(255) TYPE c,
          END OF t_dirline,
          BEGIN OF t_dirlist OCCURS 0,    "collected file names for the value help
            filename(255) TYPE c,
          END OF t_dirlist.
      SELECT SINGLE *
        FROM sxpgcotabe
             WHERE name = 'LIST_DB2DUMP'
               AND opsystem = sy-opsys.
      IF sy-subrc <> 0.
        SELECT SINGLE *
          FROM sxpgcotabe
               WHERE name = 'LIST_DB2DUMP'
                 AND opsystem = 'UNIX'.
        IF sy-subrc <> 0.
          MESSAGE e000 WITH 'External operating system command '
                            'LIST_DB2DUMP not found'.
        ENDIF.
      ENDIF.
      sxpgcotabe-parameters = p_filepath.    "provide the directory path
      CALL FUNCTION 'SXPG_COMMAND_EXECUTE'
           EXPORTING
                commandname                   = sxpgcotabe-name
                additional_parameters         = sxpgcotabe-parameters
                operatingsystem               = sxpgcotabe-opsystem
           TABLES
                exec_protocol                 = lt_execprot 
           EXCEPTIONS
                no_permission                 = 1
                command_not_found             = 2
                parameters_too_long           = 3
                security_risk                 = 4
                wrong_check_call_interface    = 5
                program_start_error           = 6
                program_termination_error     = 7
                x_error                       = 8
                parameter_expected            = 9
                too_many_parameters           = 10
                illegal_command               = 11
                wrong_asynchronous_parameters = 12
                cant_enq_tbtco_entry          = 13
                jobcount_generation_error     = 14
                OTHERS                        = 15.
      IF sy-subrc <> 0.
        MESSAGE e000 WITH text-e01 p_filepath.  "Directory failed
      ENDIF.
    " Loop over the directory listing: split each line into tokens,
    " take the last token of each line (which should be the file name),
    " and build the dirlist table from it.
      REFRESH t_dirlist.
      LOOP AT lt_execprot.
        REFRESH t_dirline.
        SPLIT lt_execprot-message AT space INTO TABLE t_dirline.
        DESCRIBE TABLE t_dirline LINES w_nolines.
        READ TABLE t_dirline INDEX w_nolines.
        MOVE t_dirline-data TO t_dirlist-filename.
        APPEND t_dirlist.
      ENDLOOP.
    At this point t_dirlist contains all the files in the specified application server directory.
    To display them as a search help, pass the internal table to the FM '/BMC/ZPOPUP_GET_VALUE'.
    Regards,
    Paul.

  • FTPS or SFTP for file scenario. Suggestions

    Hi,
    I have searched the blogs on SDN but have not found good blogs/links.
    For a file scenario, which should be used: FTPS or SFTP?
    How is the configuration done in XI and Visual Admin?
    Full points will be awarded.

    Hi,
    1) SFTP (Secure File Transfer Protocol)
    "SSH File Transfer Protocol" or SFTP is a network protocol that provides file transfer and manipulation functionality over any reliable data stream. It is typically used with the SSH-2 protocol to provide secure file transfer. SFTP encrypts the session, preventing the casual detection of username, password or anything that is being transmitted. One key benefit to SFTP is its ability to handle multiple secure file transfers over a single encrypted pipe. By using a single encrypted pipe, there are fewer holes in the corporate firewall.
    SFTP:
    As of the latest SAP PI/XI support package, SFTP is not supported by the File Adapter.
    So an alternative approach to cover this requirement from XI is to use a UNIX script at OS level to transfer the files from/to the third-party systems.
    Inbound interface, i.e. third-party system -> XI -> SAP:
    The file is transferred to a folder in the SAP XI landscape from the third-party legacy system using a UNIX script with a secured protocol. Once the file is ready in the XI landscape, the File Adapter polls this directory and the file is picked up via the NFS protocol.
    Outbound interface, i.e. SAP -> XI -> third-party system:
    XI is responsible for writing a file into a folder in the XI landscape. These files are transferred to the third-party system by executing UNIX scripts with a secured protocol, i.e. via SFTP.
    Pre-Requisites: 
    Public key should be exchanged between external systems and the PI system.
    UNIX shell script has to be developed and scheduled.
    Advantages: 
    Highly secure.
    Ability to handle multiple secure file transfers over a single encrypted pipe. By using a single encrypted pipe, there are fewer holes in the corporate firewall.
    Disadvantages:
    Two-step process, i.e. XI > temporary folder > external system and vice versa.
    Files have to be temporarily stored on the XI server.
    Multiple failure points, i.e. XI and the UNIX script execution.
    Maintenance of an external UNIX script.
    Difficulty in monitoring the execution of the shell script, as it cannot be monitored through XI.
    Keys need to be generated and installed on the SFTP site as a prerequisite, i.e. SFTP clients must install keys on the server.
    SFTP uses keys rather than certificates. This means that it can't take advantage of the "chains of trust" paradigm facilitated through Certificate Authorities.
    Files on the XI server should be deleted/archived periodically to free disk space and maintain performance.
    Note: The UNIX shell script can be executed as a background job or can be triggered from SAP XI through an OS command at File Adapter level.
    Secure FTP (SSH) with the FTP Adapter
    Secured File Transfer using SAP XI
    Secure FTP in SAP XI
    SFTP (FTP over SSH) in XI
    /people/krishna.moorthyp/blog/2007/07/31/sftp-vs-ftps-in-sap-pi
    encryption adapters or how to secure data
    Regards,
    Phani
    Reward points if Helpful

  • FileDownload from Content-Server (Knowledge Provider)

    Hi,
    I am trying to download a file via the FileDownload UI element from the Content Server. The files are of type Knowledge Provider (KPro). I get the following InputStream (character data on lines 01-12 + binary data on lines 13-17); I need only the binary part, lines 13-17.
    01 --ejjeeffe0
    02 Content-Type: image/gif
    03 Content-Length: 3692
    04 X-compId: marketing.gif
    05 X-Content-Length: 3692
    06 X-compDateC: 2007-04-12
    07 X-compTimeC: 16:05:24
    08 X-compDateM: 2007-04-12
    09 X-compTimeM: 16:05:24
    10 X-compStatus: online
    11 X-pVersion: 0045
    12
    13 GIF89a@ @ ÷ÿ ÎÎΫ««œœœÕÕÕ<;;ƒƒƒzzz
    14 LKLaaauuuää䤤¤ÉÉɹ¹º
    15 ¾¾¾jjjôôôèèèîîîØØØðððøøø••–êêêòòòìì썍ööö
    16 ìêíæçèôùöõòñâÞÝÝÙÞûúüðíîôðõò
    17 £££ÜÛßÝÚÚàáâ™
    18 ejjeeffe0
    How is it possible to separate the two parts when I need only lines 13-17? I cannot get the right InputStream (lines 13-17) back. My main problem is reading the stream and converting it back to an InputStream.
    regards,
    Sharam

    Hello,
    try the LineNumberReader
    http://java.sun.com/j2se/1.4.2/docs/api/java/io/LineNumberReader.html
    It keeps track of line numbers for you, so you could start reading from line 13.
    Jan
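    For reference, here is a rough byte-level alternative: because the payload after the headers is binary (a GIF), reading it through a character Reader with a text encoding can corrupt the bytes, so it may be safer to buffer the raw bytes, cut at the blank line that ends the header block, and stop at the closing boundary. This is only a sketch under assumptions taken from the dump above (boundary string "ejjeeffe0", CRLF line endings); adjust it to what the real stream contains.

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class KproPayloadExtractor {

        // Returns a new InputStream containing only the binary body, i.e. everything
        // between the empty line after the headers and the closing boundary line.
        public static InputStream extractBody(InputStream in, String boundary) throws IOException {
            // Buffer the whole response; the Content-Length in the dump is only ~3.7 KB.
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] chunk = new byte[4096];
            int n;
            while ((n = in.read(chunk)) != -1) {
                buf.write(chunk, 0, n);
            }
            byte[] all = buf.toByteArray();

            // The header block ends at the first empty line (CRLF CRLF, or LF LF as fallback).
            int bodyStart = indexOf(all, "\r\n\r\n".getBytes("ISO-8859-1"), 0);
            int skip = 4;
            if (bodyStart < 0) {
                bodyStart = indexOf(all, "\n\n".getBytes("ISO-8859-1"), 0);
                skip = 2;
            }
            if (bodyStart < 0) {
                throw new IOException("no header/body separator found");
            }
            bodyStart += skip;

            // The body ends where the closing boundary text begins.
            int bodyEnd = indexOf(all, boundary.getBytes("ISO-8859-1"), bodyStart);
            if (bodyEnd < 0) {
                bodyEnd = all.length;
            }
            // Walk back over any leading dashes of the boundary line and the line
            // break that precedes it, so the binary content ends cleanly.
            while (bodyEnd > bodyStart && all[bodyEnd - 1] == '-') {
                bodyEnd--;
            }
            if (bodyEnd - bodyStart >= 2 && all[bodyEnd - 2] == '\r' && all[bodyEnd - 1] == '\n') {
                bodyEnd -= 2;
            } else if (bodyEnd > bodyStart && all[bodyEnd - 1] == '\n') {
                bodyEnd -= 1;
            }

            byte[] body = new byte[bodyEnd - bodyStart];
            System.arraycopy(all, bodyStart, body, 0, body.length);
            return new ByteArrayInputStream(body);
        }

        // Naive byte-array search; sufficient for payloads of a few KB.
        private static int indexOf(byte[] data, byte[] pattern, int from) {
            outer:
            for (int i = from; i <= data.length - pattern.length; i++) {
                for (int j = 0; j < pattern.length; j++) {
                    if (data[i + j] != pattern[j]) {
                        continue outer;
                    }
                }
                return i;
            }
            return -1;
        }
    }

    The result of extractBody(responseStream, "ejjeeffe0") could then be handed to the FileDownload UI element as the binary InputStream.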

  • Building an Admin Console Extension for a Custom Security Provider

    I am looking for an example or a description how to build an Administration Console extension for a custom Authentication Provider.
    Especially the creation page for the provider is interesting because I am not able to create and register the required Authentication Provider MBean.
    The call “mbeanHome.getMBeanServer().createMBean(className,objectname)” always throws the following Exception “javax.management.ReflectionException: The MBean class could not be loaded by the default loader repository”
    Even if I try the class “weblogic.security.providers.authentication.IPlanetAuthenticator”, which is part of the BEA distribution, the same exception is thrown.
    It seems that the problem has something to do with class loaders.
    When I use the standard admin console pages to create and configure my provider everything works fine.
    The only example “kennedy0208.zip” I found in the net does not deal with the creation of the MBean.
    It only customizes the edit pages for the provider and at that point the MBean has already been created by the standard admin pages.
    Maybe the author ran into the same problems and gave up!?
    What makes me wonder is that I have to put my MBean classes into my console extension WAR file to be able to import the packages in my JSP.
    If I do not put the classes into my WAR, the compiler throws an error because it cannot resolve the package.
    Since I moved my provider implementation JAR to the directory “WLHOME\server\lib\mbeantypes” as described in the BEA documentation, it should work without putting the classes into the WAR!?
    I am very surprised that the BEA documentation does not provide any example on this topic.

    Found it. Cut and paste error. I still had one of their example class
    names in my code. Oops.

  • Reg: Knowledge Provider

    Hi Experts
    I'm new to DMS.
    Please explain:
    1) Knowledge Provider
    2) Content Server
    3) Cache Server
    4) repository settings
    and the process flow of DMS.
    Thanks & Regards
    kumar

    Hi,
    1. Knowledge Provider :- the service that provides the link between the SAP server and the content server. Through this service you can also link the content server with the TREX server.
    2. Content Server :- the server where the original files (Word, Excel, ...) are stored. It uses the MaxDB database.
    3. Cache Server :- a server that stores recently accessed documents (similar to the cache memory of your computer).
    4. Repository :- the logical storage location where the original documents are stored.
        Repository Settings :- First create the storage system via transaction OAC0. Then create the storage category via transaction OACT and assign the storage system to the storage category.
    When checking in the document via transaction CV01N, select the storage system.
    Process flow of documents means, for example: in one department several people are working. One person creates documents, his senior checks the documents, and the HOD of the department releases the documents. You can use the document status functionality to model this kind of document flow.
    Regards,
    Sunny

  • Cannot publish Flash updates, "Verification of file signature failed for file" - SCUP 2011, SCCM 2012 R2 and WSUS all on same Windows Server 2012 machine

    I am attempting to distribute Adobe Flash updates using SCUP 2011, SCCM 2012 R2, WSUS ver4 and Windows Server 2012.  Everything installs without error.  I have acquired a certificate for SCUP signing from the internal Enterprise CA.  I have
    verified the signing certificate has a 1024 bit key.  I have imported the certificate into the server's Trusted Publishers and Trusted Root CA stores for the computer.  When I attempt to publish a Flash update with Full content I receive the following
    error:
    2015-02-13 23:00:48.724 UTC Error Scup2011.21 Publisher.PublishPackage PublishPackage(): Operation Failed with Error: Verification of file signature failed for file:
    \\SCCM\UpdateServicesPackages\a2aa8ca4-3b96-4ad2-a508-67a6acbd78a4\3f82680a-9028-4048-ba53-85a4b4acfa12_1.cab
    I have redone the certificates three times with no luck.  I can import metadata, but any attempt to download content results in the verification error.
    TIA

    Hi Joyce,
    This is embarrassing, I used that very post as my guide when deploying my certificate templates, but failed to change the bit length to 2048.  Thank you for being my second set of eyes.
    I changed my certificate key bit length to 2048, deleted the old cert from all certificate stores, acquired a new signing cert, verified the key length was 2048, exported the new cert to pfx and cer files, imported it into my Trusted Publishers
    and Trusted Root Authorities stores, reconfigured SCUP to use the new pfx file, rebooted the server and attempted to re-publish the updates, with the following results:
    2015-02-16 13:35:44.006 UTC Error Scup2011.4 Publisher.PublishPackage PublishPackage(): Operation Failed with Error: Verification of file signature failed for file:
    \\SCCM\UpdateServicesPackages\a2aa8ca4-3b96-4ad2-a508-67a6acbd78a4\3f82680a-9028-4048-ba53-85a4b4acfa12_1.cab.
    Is there a chance this content was already created and signed with the old cert, so installing the new cert has no effect? In ConfigMgr software updates I see 4 Flash updates, all marked Metadata Only (because they were originally published as "Automatic").
    No Flash updates in the ConfigMgr console are marked as downloaded.  I can't find any documentation on how the process of using SCUP for downloading content for an update marked Metadata Only actually works. 
    Comments and suggestions welcome.

  • URGENT! File Upload Utility or a Custom UI for File Upload is needed!

    Hi all,
    I'm trying to find or develop a file upload utility or a custom user interface rather than editing a page and adding a file-type item to it. There is a free portlet for file upload in the Knowledge Exchange resources, but it is for the 3.0.9 release. I'm using Portal 9.0.2.
    I could not find any sample for the new release. Also, APIs such as wwsbr_api that are used to add items to content areas are outdated.
    I created a page with a region for "items". When I edit the page and try to add a "file" type item, the generated URL is something like this:
    "http://host:7779/pls/portal/PORTAL.wwv_additem.selectitemtype****"
    After selecting the item type as a simple file, the autogenerated URL is something like:
    "http://host:7779/pls/portal/PORTAL.wwv_add_wizard.edititem"
    I searched for these APIs but could not find anything (not in the PDK PL/SQL API help either).
    I need this very urgently for a proof-of-concept study for a customer.
    Thanks in advance,
    Kind Regards,
    Yeliz Ucer
    Oracle Sales Consultant

    Answered your post on the General forum.
    Regards,
    Jerry
    PortalPM

  • Java WebDynpro's and IE security settings for file download

    We have an EP 7.0 SP13 environment on which we have deployed a number of our own Java Web Dynpro applications. In some of these Web Dynpros we provide file download functionality. The portal and the Web Dynpros are used by both internal personnel and external customers.
    On the other hand, the default Internet security settings of Internet Explorer disable "Automatic prompting for file downloads".
    When a user with these default security settings active tries to use our Web Dynpro file download functionality, the screen seems to refresh but no file download starts. When the user retries, the session runs for some minutes and then gives the following error message:
    "com.sap.tc.webdynpro.services.session.LockException: Thread SAPEngine_Application_Thread[impl:3]_20 failed to acquire exclusive lock on client session ClientSession".
    This behavior is explained in SAP Note 1234847. Web Dynpro uses a single-threaded model, meaning a user session is blocked for the duration of the request. Because the previous file download has not yet completed, the new attempt cannot start.
    The issue now: although the user's IE settings allow file downloads and don't block pop-ups, the user can't download the file and isn't even made aware of the cause of the failure.
    How can we avoid this issue without having to communicate that the application requires specific browser settings?

    Welcome to the Apple Support Communities
    See > http://support.apple.com/kb/HT5290
    You can install the program in different ways:
    1. Right-click the application installer and choose Open.
    2. Go to System Preferences > Security and Privacy and select Anywhere in Allow applications downloaded from

  • Performance Tuning for File Download / Upload with Enabled SharePoint Audit

    Greetings all, may I ask for your help with some performance issues?
    Background:
    I tried to create an ASP.NET web page to download/upload/list SharePoint files and deployed it to an IIS website on the same application server (we will NOT use a web part, as some users are NOT allowed direct access to the confidential workspace).
    Besides, for audit log recording purposes, the page impersonates (without password) the logged-in user:
    SPUserToken userToken = web.AllUsers[user].UserToken;
    SPSite s = new SPSite(siteStr, userToken);
    For file listing, the web service provides a fast response, and we are using the service account (A/C) for the connection (no auditing is needed for listing, but auditing is required for file download/upload).
    Several implementation options were tested for file download/upload; the issues and findings are listed below:
    Issues
    1) SharePoint Object Model
    When I open the site (using new SPSite), it is too slow to respond (under 1 s for all other operations, but 10~50 s to open the SPSite), e.g.
    using (SPSite s = new SPSite(siteStr)) // 50s
    How can I download/upload a file without opening an SPSite object (using the SharePoint object model), while keeping the user token so that SharePoint can identify the user's actions (e.g. updated by Tom, NOT by the system administrator)?
    2) SharePoint default web service
    For file download, I tried to use the SharePoint web service; it is quick, but how can SharePoint record the audit log entry against the downloading user rather than the service account (e.g. viewed by Tom, NOT the system administrator)?
    With the Windows SSO solution, please note the system should NOT prompt the user for a password in order to impersonate.
    3) HTTP Request API (for file download)
    As mentioned in point 2, if the system cannot get the password from the user, SharePoint again records the service account in the audit log.
    Thank you for your kind attention.
    .NET Beginner 3.5

    Thank you for the prompt response; please find my replies inline (they were underlined in the original post):
    Hi,
    Maybe I'm not quite clear about the architecture you have now.
    Is your asp.net application deployed in a separate IIS site but on the same physical server as SharePoint?
    Yes
    "we are using service A/C for connection", can you please explain the 'A/C'?
    Domain User, Local Admin and also SharePoint Service Admin A/C
    Opening SPSite is relatively slow, but it shouldn't take 50 sec. However, it depends on your server hardware configuration. You should meet the minimum hardware requirements for SharePoint.
    We assigned double the resources based on the minimum hardware requirements. For details: 50 s is the load-test result, while other SharePoint operations take around/under 3 s response time.
    Are you using the SharePoint audit log? If so, then why don't you just put a hyperlink to the documents in your asp.net page? The user may have to log in once to the SharePoint site, but it depends on your security architecture. For example, if both of your sites are in the local intranet and you are using Windows integrated authentication, SSO will work automatically.
    Exactly, we are using the audit log. However, the user is NOT allowed to access the SharePoint site/server directly from the Internet (a separate server is not implemented yet, as performance issues occurred even for a separate site on the same server); the middle server with the web interface was created to handle the user requests.
    From what I understand, you download the file using the HttpWebRequest C# class. However, regarding security it depends on how authentication is set up in the asp.net web site and in SharePoint. If both sites use Windows integrated security
    and they are on the same server, you can use the following code snippet:
    using (WebClient webClient = new WebClient())
    {
        webClient.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
        webClient.DownloadFile("file url in sharepoint", "download directory");
    }
    Thanks, will try and reply later
    But still, as I've mentioned, not quite clear about the whole architecture.
    A) Request Handling
    1) The user uses a browser to request a file listing and/or download a file (hereafter called: File Download Request) from the custom ASP.NET page (let's say on Server 1, IIS 1)
    2) The ASP.NET page or file handler (.ashx) (Server 1, IIS 1) calls a custom web service, the SharePoint default/OOTB web service, or uses the SharePoint object model to access the SharePoint document library (on Server 1, IIS 2)
    3) Both the SharePoint and IIS web sites (Server 1, IIS 1 & IIS 2) use the same service account
    4) The web service / file handler returns the file object to IIS 1
    5) IIS 1 replies to the user's File Download Request
    B) Application Architecture (In testing environment)
    1) no load balancing
    2) 1 DB server (Server 2)
    3) 1 application server (Server 1, with IIS 1 - ASP.NET, IIS 2 - SharePoint web site with the default SharePoint web service, IIS 3 - SharePoint admin site & IIS 4 - custom SharePoint web service)
    4) Separate AD server (Server 3)
    Thanks,
    Sohel Rana
    http://ranaictiu-technicalblog.blogspot.com
    .NET Beginner 3.5

  • N:1 Multimapping For Files

    Hi All,
    I am doing N:1 multimapping for files.
    I have 2 senders and 1 receiver.
    I created 2 sender interfaces and 1 receiver interface.
    I did message mapping in which I added 2 sender message types in source and added the target message type.
    I did Interface mapping with 2 sender message interface and 1 receiver interface.
    But I faced a problem when creating the configuration in ID:
    (a) I am not able to understand how many interface determinations/receiver determinations are required.
    (b) Also, I am not getting the interface mapping in the Interface Determination.
    So can anyone help me to solve this?
    Thanks
    Rabi

    Hi,
    You can use any of the collect-message patterns provided. In your case, either of the following can be used:
    1. BpmPatternCollectMessage
    2. BpmPatternCollectTime
    -> These patterns are in SWC SAP BASIS, namespace http://sap.com/xi/XI/System/Patterns, in the ESR of your SAP PI system.
    -> Please go through this blog, it will help you:
    Correlation - Runtime Behavior of BPM
    http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/10526 [original link is broken]
    Configuration when using BPM:
    -> 2 communication channels: one for the sender and one for the receiver
    -> 2 Receiver Determinations:
              one for Sender to IP
              one for IP to Receiver
    -> 2 Interface Determinations:
              one for Sender to IP
              one for IP to Receiver
    -> 1 Sender Agreement
    -> 1 Receiver Agreement (for IP to Receiver)
    regards,
    ganesh.

  • I do not have a delete option for files or folders in Adobe Creative Cloud

    I do not have a delete option for files or folders in Adobe Creative Cloud
    I'm looking and looking...
    I'm stumped...
    4 weeks now...
    kai

    HOW TO DELETE FILES or FOLDERS or Assets from the Adobe Creative Cloud Browser/Web Portal. By: Kai Buskirk rev:130626
    Adobe has now buried deleting or trashing unwanted items, files or folders in the Archive section of your Creative Cloud browser/web portal.
    Note!! - There is no longer a standard trash-can icon or simple Delete button... it is buried in the Archive section: but why, I ask?
    An archive is an accumulation of historical records, or the physical place they are located.[1] Archives contain primary source documents that have accumulated over the course of an individual or organization's lifetime, and are kept to show the function of that person or organization. Professional archivists and historians generally understand archives to be records that have been naturally and necessarily generated as a product of regular legal, commercial, administrative or social activities.
    In general, archives consist of records that have been selected for permanent or long-term preservation on grounds of their enduring cultural, historical, or evidentiary value. Archival records are normally unpublished and almost always unique, unlike books or magazines for which many identical copies exist. This means that archives (the places) are quite distinct from libraries with regard to their functions and organization, although archival collections can often be found within library buildings
    A person who works in archives is called an archivist. The study and practice of organizing, preserving, and providing access to information and materials in archives is called archival science. The physical place of storage is sometimes referred to as an archive repository.
    To delete files, folders or individual assets in the current incarnation of the Adobe Creative Cloud browser/web portal (rev:130626):
    1 - Check the box on the left to select the files or folders you would like deleted/trashed and move them to the ARCHIVE folder location inside your Adobe Creative Cloud browser/web portal... Then navigate to the ARCHIVE section.
    2 - Once the files or folders you have checked are moved to the ARCHIVE folder location, you can select them for Permanent Deletion (Trash).
    ps: you can also restore them... if you so choose...
    3 - In case you missed this step: after selecting/checking the files or folders in the ARCHIVE folder there is a small triangle selector drop-down that will reveal the Permanently Delete option... clicking that is the point of no return, I think... so do not be misled by the use of the term ARCHIVE... DELETING PERMANENTLY IS DELETING, YO!
    4 - OK DONE NOW YOU GOT IT .....
    Good Luck Happy House Cleaning.....
    Warmest Blessings,
    Kai Buskirk
    Message was edited by: [email protected] rev: 130626

  • URM Adapter for File System issue.

    Hi, I am just starting out on using the URM Adapter for File System and I have a few questions about issues I am facing.
    1.     When I try to create multiple searches and map them to Folders/Retention Categories in URM, it does not work. I am able to map one search via one URM source to one Folder/Retention Category (without my custom attribute from question 1). However in Adapter’s Search Preview I am able to perform a search on the documents successfully. Would different searches require different URM sources in Adapter?
    2.     Does the adapter work with other Custom Attributes? I have added an attribute in addition and in the same way as "URMCrawlTimeGMT" is added in Oracle Secure Enterprise Search (I created a custom Document Service and Pipeline to add a metadata value) and in the URM Adapter’s config.properties file and when I create a search in Adapter based on the custom attribute, it does not map the documents into URM. I am however able to search the documents in Adapter’s Search Preview window with the custom attribute displaying correctly.
    Any help with this topic would be really appreciated. Thank you.
    Regards,
    Amar

    Hi Srinath,
    Thanks for the response, as to your questions,
    1. I am not sure how to enable Records Manager in adapter mode. But I am able to login to the Records Manager web page after starting it up through StartManagedWebLogic.cmd URM_server1.
    2. The contents of the file system should be searchable in Records Manager, and I should be able to apply retention policies to the documents in the file system. I do not need SES, but apparently the adapter requires SES as a prerequisite.
    Upon further investigation I found that in the AGENT_DATA table the values being inserted were "User ID" (UA_KEY) and NULL (UA_VALUE), so I just made the UA_VALUE column nullable and I was able to pass that step. Is this the wrong approach to fixing the issue?
    Could you please let me know about enabling Records Manager in adapter mode? I am not able to find documentation online; I have been through the Adapter installation and administration guides. Thank you once again.
    Regards,
    Amar

  • Scan for file and report back if it exists

    Hello,
    I am new to scripting and trying to find a way to search for a supposed virus file named wsr[any two numbers]zt32.dll on a large group of computers. I'm not sure how or where to use Get-ChildItem -Path C:\Users -Filter ?zt32.dll -Recurse | export-csv C:\scripts\output\test.csv in this script.
    The below script keeps erroring out when I try messing with the Test-Path options, and I'm not sure where to use Get-ChildItem or whatever to get this working. Thanks
    # Edit these variables to fit your environment
    # Set file to be tested for, put everything after c:\
    # "c:\Users\Default" is the example path
    $filetofind = 'wsr*zt32.dll'
    # Hostnames TXT location
    $hostnamestxt = 'C:\scripts\computernames.txt'
    # Destination file for online machines
    $onlinetxt = 'C:\scripts\output\Machines_with_file.txt'
    # Destination file for offline machines
    $offlinetxt = 'C:\scripts\output\Offline_Machines.txt'
    # Begin executing script - do not edit below this line
    $computers = get-content "$hostnamestxt"
    write-host "----------------------------------------------"
    write-host "Scanning hostnames from $hostnamestxt..."
    write-host "----------------------------------------------"
    foreach($computer in $computers)
    {
        ping -n 1 $computer >$null
        if($lastexitcode -eq 0)
        {
            if(test-path "\\$computer\c:\users\* -include $filetofind")
            {
                echo "$computer" | Out-File -Append "$onlinetxt"
                write-host "File FOUND on $computer"
            }
            else
            {
                write-host "File NOT found on $computer"
            }
        }
        else
        {
            echo "$computer" | Out-File -Append "$offlinetxt"
            write-host "$computer is OFFLINE/DID NOT RESPOND TO PING"
        }
    }
    write-host "----------------------------------------------"
    write-host "Script has completed, please check output."
    write-host "Hosts with file output location - $onlinetxt"
    write-host "Hosts that were unpingable output location - $offlinetxt"
    write-host "----------------------------------------------"

    Although this works, it appears to be very slow. Also, the Offline machines are not getting logged. Is there a way to speed this up? I am reading about how gci is slow over UNC, but I'm going to have to research this more. Thanks
    This makes the third time you have demanded someone custom build a solution for you. You need to step back and think about what you are doing. The solution was provided as you asked for it. Your lack of technical experience led you to ask
    for a now-unworkable solution, so you are asking for more free consulting and a new solution.
    As Bill has pointed out, this should be done with AV software, as just finding the file will accomplish nothing. If your system is infected you need to take more aggressive steps, and you should not be trying to write a scripted solution for this kind
    of thing unless you have the technical skills to understand what it is you are doing.
    All remote scan methods are very slow. To do a local scan requires remoting to be installed and that you know how to use it. Once remoting is installed, a single line will get you the file existence, and adding -AsJob will get you concurrent
    scanning. You will need to learn how to use PowerShell and WMF remoting to proceed with this. An AV scanner would be more valuable and it would protect you in the future.
    Also, as I posted before, this is a very weak construct for file scanning on a network:
    $g = get-childitem \\$computer\c$\users\* -Recurse -ErrorAction SilentlyContinue |
         ? {$_.name -match "wsr[0-9][0-9]zt32.dll"}
    The following will be much faster:
    $g = get-childitem \\$computer\c$\users\* -Include wsr*zt32.dll -Recurse -ErrorAction SilentlyContinue
    ¯\_(ツ)_/¯
