Scan for file and report back if it exists

Hello,
I am new to scripting and am trying to find a way to search for a suspected virus file named wsr[any two numbers]zt32.dll on a large group of computers. I'm not sure how or where to use Get-ChildItem -Path C:\Users -Filter ?zt32.dll -Recurse | Export-Csv C:\scripts\output\test.csv in this script.
The script below keeps erroring out when I try messing with the Test-Path options, and I'm not sure where Get-ChildItem (or whatever else) should go to get this working. Thanks
# Edit these variables to fit your environment
# Set the file to be tested for; put everything after C:\
# "C:\Users\Default" is the example path
$filetofind = 'wsr*zt32.dll'
# Hostnames TXT location
$hostnamestxt = 'C:\scripts\computernames.txt'
# Destination file for online machines
$onlinetxt = 'C:\scripts\output\Machines_with_file.txt'
# Destination file for offline machines
$offlinetxt = 'C:\scripts\output\Offline_Machines.txt'

# Begin executing script - do not edit below this line
$computers = Get-Content $hostnamestxt
Write-Host "----------------------------------------------"
Write-Host "Scanning hostnames from $hostnamestxt..."
Write-Host "----------------------------------------------"
foreach ($computer in $computers) {
    ping -n 1 $computer > $null
    if ($LASTEXITCODE -eq 0) {
        if (Test-Path -Path "\\$computer\c$\Users\*" -Include $filetofind) {
            echo "$computer" | Out-File -Append $onlinetxt
            Write-Host "File FOUND on $computer"
        }
        else {
            Write-Host "File NOT found on $computer"
        }
    }
    else {
        echo "$computer" | Out-File -Append $offlinetxt
        Write-Host "$computer is OFFLINE/DID NOT RESPOND TO PING"
    }
}
Write-Host "----------------------------------------------"
Write-Host "Script has completed, please check output."
Write-Host "Hosts with file output location - $onlinetxt"
Write-Host "Hosts that were unpingable output location - $offlinetxt"
Write-Host "----------------------------------------------"

Although this works, it appears to be very slow. Also, the Offline machines are not getting logged. Is there a way to speed this up? I am reading about how gci is slow over UNC, but I'm going to have to research this more. Thanks
This makes the third time you have demanded someone custom-build a solution for you. You need to step back and think about what you are doing. The solution was provided as you asked for it. Your lack of technical experience led you to ask
for a now-unworkable solution, so you are asking for more free consulting and a new solution.
As Bill has pointed out, this should be done with AV software, as just finding the file will accomplish nothing. If your system is infected you need to take more aggressive steps, and you should not be trying to write a scripted solution for this kind
of thing unless you have the technical skills to understand what it is you are doing.
All remote scan methods are very slow. To do a local scan requires remoting to be installed and that you know how to use it. Once remoting is installed, a single line will get you the file's existence. Adding -AsJob will get you concurrent
scanning. You will need to learn how to use PowerShell and WMF remoting to proceed with this. An AV scanner would be more valuable, and it would protect you in the future.
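For what it is worth, a minimal sketch of that remoting approach might look like the lines below. It assumes WinRM remoting is already enabled on the targets and that your account is an admin on them; the computer list and file name pattern are reused from the original post, and the output path is just an example.

# Run the scan locally on every target at once; -AsJob returns a job immediately
$computers = Get-Content 'C:\scripts\computernames.txt'
$job = Invoke-Command -ComputerName $computers -AsJob -ScriptBlock {
    Get-ChildItem -Path C:\Users -Recurse -ErrorAction SilentlyContinue |
        Where-Object { $_.Name -match 'wsr[0-9][0-9]zt32.dll' }
}

# Wait for all the jobs to finish, then note which computer each hit came from
Wait-Job $job | Receive-Job |
    Select-Object PSComputerName, FullName |
    Export-Csv -Path 'C:\scripts\output\found.csv' -NoTypeInformation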
Also, as I posted before, this is a very weak construct for file scanning on a network:
$g = Get-ChildItem \\$computer\c$\Users\* -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $_.Name -match "wsr[0-9][0-9]zt32.dll" }
The following will be much faster.
$g = Get-ChildItem \\$computer\c$\Users\* -Include wsr*zt32.dll -Recurse -ErrorAction SilentlyContinue
¯\_(ツ)_/¯

Similar Messages

  • Security Approach and Plan for single logon for Essbase and Reports.

    Can anyone suggest how I can do the security approach and plan for single logon for Essbase and Reports using MaxL or the Administrator? If anyone has code, please forward it to my email ID: [email protected]

    Once you are logged in to the "Hyperion Portal", as you call it, your user credentials are automatically passed among all the components. Therefore, a lot of the logic you created to pass credentials between BQY files in a desktop environment is no longer needed.

  • Having trouble converting array to spreadsheet string, storing the file and converting back to array with complex numbers

    I am working with a network analyzer. I have arrays made of 5 columns, the first consisting of an integer and the next four consisting of complex numbers. I am converting the array into a spreadsheet string and then saving the file using the Write Characters to a File VI. That seems to work well, as when I open the file in Excel all the data is there. However, when I try to reverse the process, open the file and convert back to an array, I lose some of the data. Specifically, the imaginary parts of my complex numbers all go to zero. I have narrowed down the problem to the conversion from spreadsheet string to array and vice versa. I
    think the problem may be with the 'format' input to the VI. I do not have an adequate resource for this so I am not sure what to put in to accomplish my task. Any takers?

    Hi Biz
    I don't think there is a direct way of converting a complex number to a
    string, so when you convert the array to a spreadsheet string, the
    numbers would be converted to real data.
    However, you could try separating the real and imaginary parts using the
    "Numeric: Complex to Re/Im" function, and then store these - either in
    separate files or in adjacent columns/rows in the same file. Then, when
    you read in the data again, use the "Numeric: Re/Im to Complex" function
    to put the two "halves" together.
    If you actually want Excel to interpret the numbers as imaginary, then
    you'll probably want to create a string for each complex number of the
    form "Re + Im*i" (after separating the Re and Im parts), by using
    "String:Format into String" with 2 numeric inputs and the format string
    "%f+%fi".
    Reading the data back into LabVIEW then would require splitting the
    string into the 2 pieces by using "String: Scan from String" with 2
    numeric outputs (same precision as original numbers specified by the 2
    Default Value inputs) and the same format string "%f+%fi", and then using
    the above-mentioned "Numeric: Re/Im to Complex" function. It worked for
    me, so if you can't follow what I am describing, send me an email and I
    can email you what I did (LV 5.1.1).
    Paul

  • How to divide time capsule for Mac and PC back up

    I want to know if I can use my Time Capsule to do backups for my MacBook and manual backups for my wife's PC.

    How to divide time capsule for Mac and PC back up
    It is not possible to divide or "partition" the Time Capsule disk......unless you pull the hard drive from the Time Capsule (which voids the warranty), place it in a separate enclosure or caddy, connect the enclosure directly to your Mac, and use Disk Utility to partition the disk. Then, you must reinstall the disk back in the Time Capsule.
    This is a lot of work....best done by an experienced technician. The operation will likely cost more than the cost of the Time Capsule.
    Life would be tons easier if you used the Time Capsule with your MacBook and added a separate USB drive to the PC.
    If you do not need to divide or partition the Time Capsule storage space, and you are willing to have the Mac and PC share the same space, take a look at an application like Macrium Reflect to backup the PC to the Time Capsule disk.
    Still, having done this in the past, things would be much faster and easier for you if you use a separate USB drive for the PC.

  • Role Assignment Discovery Issue for Files and Folders through Sharepoint REST services

    To preface, I am a decided SharePoint newbie in every sense. I am trying to use the SharePoint REST services (SharePoint 2013) to walk the folder and file structure of my SharePoint server and determine, as I go, the Role Assignments (and subsequently
    Permissions) on those folders and files. I'm using Administrator credentials and I'm actually able to do it successfully, but I've run into some caveats. All the caveats begin with this: when I'm examining a folder, for example:
    /_api/Web/GetFolderByServerRelativeUrl('/sites/cmisdev/Development')/ListItemAllFields
    I receive either an empty list or an error response doc when following the link supplied for ListItemAllFields.  When following that kind of link for folders, I either get:
    <d:ListItemAllFields
    m:null="true"
    />
    or an error response document that says "The object specified does not belong to a list." When I hit the /ListItemAllFields endpoint for files, I receive a response with a link for Role Assignments which subsequently also works and I get the
    info I need. So, is this a bug? Why does the link returned from Sharepoint work for files and not folders? So, google, google, google, and I discover that there is another possible way to get at the Role Assignments (and that the object does, indeed, belong
    to a list!).
    If I know the Title (or the guid) of the folder in question, I can use the following endpoint:
    /_api/Web/Lists/GetByTitle('Development')
    If I use that endpoint, I get the information I would have expected to get from following /ListItemAllFields and the subsequent Role Assignments links all work and I get what I need. If there's a bug and this is how I have to work around it, that's fine
    but I have yet to discover how to dynamically determine the Title of a given folder nor am I sure if all Titles are supposed to be unique within a given Sharepoint server. I'm assuming that the folder name as represented in the server relative URL and the
    Title may be different and this is where my newbishness may start to shine if I'm misunderstanding what a "List" is supposed to be in Sharepoint. Anyway, I did find that I could use the Properties endpoint to perhaps get the Title, for example:
    /_api/Web/GetFolderByServerRelativeUrl('/sites/cmisdev/Development')/Properties
    gives me:
    <d:vti_x005f_listtitle>Development</d:vti_x005f_listtitle>
    whose value I assume I could then supply to the /GetByTitle endpoint and be golden. However, "vti_x005f_listtitle" just sounds a little too deep to be something I should be relying on but maybe that's kosher. That's part of what I'm trying to
    find out. Also, if there is a way to use the Sharepoint REST API to discover the guid of a given object, then I could look it up in that way.
    So, in summary:
    1. Am I going about getting folder Role Assignment information in the wrong way? Based on the CSOM examples I've seen, I believe I'm doing it correctly and that the answer to #2 below is a resounding "Yes!" :)
    2. Is it a bug if I'm not able to use /ListItemAllFields on folders using the server relative url?
    3. If I'm supposed to use GetByTitle as a workaround, am I discovering that Title correctly through /Properties? Seems quite circuitous and awkward. Are Titles required to be unique throughout a given Sharepoint server?
    4. If I'm supposed to use the guid, how can I use the REST interface to discover an object's guid? Once we get down to the Role Assignments and other links, the guid appears in those links but I don't know how to discover it independently if that's the
    path I should use to get the data I described above.

    Upon further research, I'll answer my own question for the benefit of some other potential future newbie. The answer to question number 1 above is "Not exactly." The server relative URLs I was using corresponded to lists (which are
    returned as a collection through /_api/web/lists). I was treating them mentally like regular folders. That, coupled with the fact that accessing their data as I showed above returns a ListItemAllFields link, made me think that was the way to get
    the Role Assignments, just as I would for files and, as it turns out, "real" folders and sub-folders created under these lists. That was the other problem with thinking of these lists as regular folders. So, ListItemAllFields works on
    all files and folders in a list. However, if you want Role Assignments for the lists themselves, you can keep track of the Titles and/or Guids from /_api/web/lists for the lists you're interested in (in my case, all non-hidden "document library"
    type lists) and then access those Role Assignments as I discussed in questions 3 and 4 above. For example, from the /_api/web/lists collection on my test server, the "Development" document library Role Assignments are accessible via /_api/Web/Lists(guid'cd242eeb-aafa-4efa-aecc-9bbdf8e3d459')/RoleAssignments
    or /_api/Web/Lists/GetByTitle('Development')/RoleAssignments.
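    To make that concrete, here is a rough, untested sketch (in PowerShell, since that is what the rest of this page uses) of calling the two endpoints discussed above with Invoke-RestMethod. The site URL and the credential handling are assumptions; the list title and GUID are the ones quoted above.

    $site = 'https://yourserver/sites/cmisdev'   # hypothetical site URL
    $headers = @{ Accept = 'application/json;odata=verbose' }

    # Role assignments for the list itself, addressed by title
    $byTitle = Invoke-RestMethod -UseDefaultCredentials -Headers $headers `
        -Uri "$site/_api/Web/Lists/GetByTitle('Development')/RoleAssignments"

    # The same list addressed by GUID (taken from the /_api/web/lists collection)
    $byGuid = Invoke-RestMethod -UseDefaultCredentials -Headers $headers `
        -Uri "$site/_api/Web/Lists(guid'cd242eeb-aafa-4efa-aecc-9bbdf8e3d459')/RoleAssignments"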

  • I just downloaded iTunes, and when I go to Music there are two options. You can either go to the iTunes Store or scan for media, and when I click on scan for media it does nothing. Help?

    I just downloaded iTunes, and when I go to Music there are two options. You can either go to the iTunes Store or scan for media, and when I click on scan for media it does nothing. Help? (I tried deleting iTunes and re-installing it, but apparently iTunes was still on my computer, so it asked me if I wanted to just check and see if there were any issues or missing software. I did that but I still had the same problem.)

    Hi jbserious,
    Welcome to the Support Communities!
    It doesn't sound like the iTunes software and its components installed completely.  The article below will walk you through some troubleshooting steps.  Is this your first time using iTunes, or do you have an iTunes library from another computer that you want to move to this one?
    Issues installing iTunes or QuickTime for Windows
    http://support.apple.com/kb/ht1926
    iTunes: How to move your music to a new computer
    http://support.apple.com/kb/HT4527
    Adding music and other content to iTunes
    http://support.apple.com/kb/HT1473
    Cheers,
    - Judy

  • Is it possible that these properties are only available for files and not folders

    Hello All,
    I have created a few properties, like ReqNo (takes an Integer) and ReqDate (accepts a Date).
    I have set the above 2 properties as mandatory. I created a group and added this group in 'allgroups' such that I see a tab wherein I can fill in these properties.
    My question is:
    Is it possible that these properties are only available for files and not for folders? The reason being that even if I create a new folder, I need to fill in some values for the above two mentioned properties.
    These properties need to be mandatory; I cannot make them optional.
    Please help me solve this mystery.
    Awaiting Reply.
    Thanks and Warm Regards,
    Ritu

    Hi Ritu,
    If you only want the property to be available for files and not folders, enter data only into the "Document Validity Patterns" field in the property config.
    Regards
    Paul

  • Advanced permission for files and folders

    Advanced permission for files and folders
    Hi,
    Just wanted to raise a quick query on setting a unique NTFS Security permission.
    My requirement
    A shared folder with the below listed access for users
    A group of users should be able to create, read, rename files and folders inside a shared folder.
    The group of users should not have the right to delete any folder or file from the shared folder.
    This is what I have tried. 
    Gave modify permission to the Security group.
    Under Advanced permissions, denied the "Delete subfolders and files" and "Delete" permissions.
    The effective permissions for a user who is a member of the security group, over a file inside the shared folder, are as shown.
    https://onedrive.live.com/redir?resid=835A81FDD1D9D9FC!109&authkey=!AGQFP11QTFaLHQM&v=3&ithint=photo%2cpng
    But while trying to rename or modify the file, getting the below error message.
    https://onedrive.live.com/redir?resid=835A81FDD1D9D9FC%21110
    Any help to achieve my requirement would be really appreciated.
    Thanks,
    JD

    Hi JD,
    Removing the Delete permission from the user or group brings a limitation: the user will not be able to rename the folder. This is because the "rename" operation is also included within the "Delete" permission.
    Thus, if you want to prevent users from deleting the shared file, renaming it cannot be allowed either.
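    If it helps to see the limitation in script form, the following is a small, hypothetical PowerShell sketch of the construct described above; the share path and group name are placeholders, not values from this thread. With the Deny entry for Delete in place, members of the group can create and change files, but renames fail, exactly as described.

    $path  = 'D:\Shares\TeamShare'      # hypothetical shared folder
    $group = 'CONTOSO\TeamUsers'        # hypothetical security group
    $acl = Get-Acl -Path $path

    # Allow the group to create, read and change files and folders
    $allow = New-Object System.Security.AccessControl.FileSystemAccessRule($group, 'Modify', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
    # Deny "Delete" and "Delete subfolders and files" - this also blocks Rename
    $deny = New-Object System.Security.AccessControl.FileSystemAccessRule($group, 'Delete,DeleteSubdirectoriesAndFiles', 'ContainerInherit,ObjectInherit', 'None', 'Deny')

    $acl.AddAccessRule($allow)
    $acl.AddAccessRule($deny)
    Set-Acl -Path $path -AclObject $acl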
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected]

  • Good source for tables and reports

    Hello!
    Does anyone know a good source for tables and reports in SRM (EBP)? I am really
    looking to report on the Org Structure.
    But any information you can give me would be really helpful.
    Regards
    sas

    Hi,
    See these threads :
    SRM Tables
    SRM Tables
    Re: Availability of Standard Reports in SRM
    SRM standard reports?
    SRM Reports
    For developing custom reports, you can use the standard tables and function modules.
    BR,
    Disha.

  • How to run a silent configuration for Forms and Reports 11g ?

    Hi all,
    I've run a silent installation for Oracle Forms and Reports 11g R2 in Windows 7 64 bit.
    I made it using a response file called install_only.rsp. Now I want to run the configuration also in silent mode.
    How can I do it ?
    I know that I can use a template called configure_only.rsp. However, I don't know the correct syntax for the command to start the configuration.
    Should I use the same setup.exe installer and pass configure_only.rsp as the response file?
    Is the command below the appropriate one for this task:
    setup.exe -silent -response configure_only.rsp
    Or is there another way to be followed?
    Thanks a lot.

    Alright, I eventually found the answer.
    The configuration must be run from the <FORMS_HOME>/bin directory on Windows.
    Command is shown below:
    <FORMS_HOME>/bin/config.bat -silent -response <path-to-response-file>

  • Scanning .txt file and outputting results?

    Greetings Everyone. My employer has charged me with a rather
    confusing task. Basically I need to scan a .txt file and retrieve
    some information from it. Here is a little background on the file
    itself.
    This is a feed file containing the information for employees
    such as name, department, employement status etc...Zeros are used
    in place of spaces in this file. What I am charged with is
    retrieving the employment status and name for every single person
    in the file (60,000+).
    I need to write a coldfusion script that can do the
    following.
    1. Scan a .txt file that is sent to me every night and look
    for the 99th character on each line of the .txt file
    2. If the 99th character is a 'T' I need to pull characters
    '10-50' (which contain the name for that person).
    3. Output the results of the scan to a coldfusion page
    displaying the individuals' names.
    If anyone out there can point me in the right direction I
    would be very grateful. I've been looking for websites on this
    topic but I have been unsuccessful so far. Should I post this in the
    advanced section of the cf forums? Once again, thank you for any
    help you can give me.

    I guarantee you can do this! And it shouldn't be too hard, so
    you can breathe a sigh of relief... :)
    I would use <cffile action="read" file="filepath/name.txt"
    variable="fileContents">
    Then you should be able to do something like <cfset
    fileArray = ListToArray(fileContents, "#CHR(13)##CHR(10)#")>
    Now you have an array so you can loop through and try
    something like the following...
    <cfloop index="i" from="1" to="#ArrayLen(fileArray)#" step="1">
        <cfif fileArray[i] NEQ "">
            <!--- Find the 99th character --->
            <cfset char99 = Mid(fileArray[i], 99, 1)>
            <cfif char99 EQ "T">
                <!--- Get the name (characters 10-50) --->
                <cfset empName = Mid(fileArray[i], 10, 41)>
                <!--- Change 0's to spaces --->
                <cfset empName = Replace(empName, "0", "#CHR(32)#", "ALL")>
                <cfoutput>#empName#</cfoutput><br />
            </cfif>
        </cfif>
    </cfloop>
    That should be close to what you could use... You may have to
    tweak it a bit... now if it is going through 60,000+ records this may
    take a while... lol. You might have to use the cfsetting tag to
    extend the normal request timeout.
    Hope this helps!

  • Cleaning up database files and old back-up files

    I installed LR version 1 and reimported all new photos. I noticed that for some reason I have several folders of backup and libraries in my lightroom file folder some nested within others. I would like to remove the old files and keep only the latest ones. I am not sure I know how to identify the latest files to keep or delete. I then would like to back up latest files in another location. Any help would be appreciated.
    Thanks
    Jim

    OS?
    Lots of ways using a combo of LR and the OS. Select an image and have it reveal where it lives. Note the location.
    Make a change in a file, export the metadata, and if a folder you are looking at doesn't have any recent changes, it isn't the one in LR.
    Another safeguard is to zip the file, instead of trashing it. That way if you later find LR is counting those images as part of its db, you can restore.
    ~~ John McWilliams
    MacBookPro 2 Ghz Intel Core Duo, G-5 Dual 1.8; Canon DSLRs

  • EXPDP generates new dmp file and reports "file already exists" error

    Hello everyone,
    Hope you all had a wonderful holiday. I got some problems with datapump expdp 10.2.0.4. It would be appreciated if you could provide some advice. Thanks in advance.
    I newly created a 10.2.0.4 database. The database can startup and be connected via Toad without problem. I can also use impdp to import some data to the new database. But when I'm trying to use expdp to export a schema from the database, I got the following errors:
    expdp parfile=expdp_scott_mfp1.par
    Export: Release 10.2.0.4.0 - 64bit Production on Monday, 26 December, 2011 22:10:49
    Copyright (c) 2003, 2007, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, Real Application Clusters, Data Mining and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31641: unable to create dump file "/u02/exports/mfp1/expdp_scott_mfp1_12262011.dmp"
    ORA-27038: created file already exists
    Additional information: 1
    Every time I run expdp, it just creates the dmp file (expdp_scott_mfp1_12262011.dmp) specified in the parfile under the EXPORT directory and reports a "file already exists" error.
    Your advice is highly appreciated.
    Thanks.
    Edited by: 904668 on Dec 27, 2011 8:47 AM

    I think I found the problem. I used the same file name for the dump file and the log file. How stupid of me. Sorry for bothering you. Thanks, and happy new year!
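    For anyone who hits the same error, a parfile along these lines keeps the dump file and log file names distinct so they cannot collide; the directory object and schema here are just placeholders based on the names in this thread:

    DIRECTORY=EXPORT
    DUMPFILE=expdp_scott_mfp1_12262011.dmp
    LOGFILE=expdp_scott_mfp1_12262011.log
    SCHEMAS=SCOTT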

  • Question regarding material for Forms and Reports

    Hi, I'm good at SQL, and now I would like to learn Forms and Reports for the database. But I don't know where or how to start. Could someone please help me out with what I need to do to learn them? As I said, I'm very new to these concepts.
    Thanks in advance

    Hello,
    if buying a book is an option for you and you speak German, I would advise you to get
    Perry Pakull, Stefan Jüssen, Walter H. Müller:
    "Praktische Anwendungsentwicklung mit Oracle Forms", HANSER Verlag ISBN-10: 3-446-41098-8. Have a look at
    [http://www.hanser.de/buch.asp?isbn=978-3-446-41098-5&area=Computer|http://www.hanser.de/buch.asp?isbn=978-3-446-41098-5&area=Computer]
    Regards
    Mario

  • I have a 500 GB hard drive partitioned for files and Time Machine, but when I connect it to my Mac only the Time Machine one shows up

    Well, as the title says, I have a 500 GB hard drive with 2 partitions, one for Time Machine and one for files. It worked perfectly until yesterday; now only the Time Machine partition appears in Finder. I checked Disk Utility and both partitions appeared, but the one for files appeared in grey, and when I press Mount an error comes up.

    The repair shop likely replaced a major circuit board on your MacBook Pro, so Time Machine thinks that you have a "new" computer and it wants to make a new complete backup of your Mac.
    You are going to have to make a decision to either add another new Time Capsule....or USB drive to your existing Time Capsule....and in effect start over with a new backup of your Mac and then move forward again.
    For "most" users, I think this is probably the best plan because you preserve all your old backups in case you need them at some point, and you start over again with a new Time Capsule so you have plenty of room for years of new backups.
    Or, as you have mentioned, you have the option of erasing the Time Capsule drive and starting all over again. The upside is that you start over and have plenty of room for new backups. The downside is that you lose years of backups.
    Another option....trying to manually delete old backups individually....is tricky business....and very time consuming. To get an idea of what is involved here, study this FAQ by Pondini, our resident Time Capsule and Time Machine expert on the Community Support area. In particular, study the pink box.
    http://web.me.com/pondini/Time_Machine/12.html
    Once you look through this, I think you may agree that this type of surgery is not for the faint of heart.  I would suggest that you consider this only if one of the other options just cannot work for you.
