Splitting up a job for 140k+ users across multiple servers

Hello, 
I am pretty new to PowerShell, want to learn more about scaling, and have just started working with jobs.
In this particular case I am just doing a mass enable or disable at a per-user level. The other script I need to do this with grabs and checks values on around 6,000 distribution groups and, using the current values and type, builds new commands to add or remove certain users or permissions in bulk with Invoke-Expression. I *think* it would probably be best in my case to run these across servers as well.
Basically what I am looking at is:
Using one large list/array, counting it, splitting it, and using the resources available by running the pieces as jobs.
One of the problems I have had with this (but seem to have mostly figured out) is how to combine or 'foreach' several different values that may need to be applied to separate objects on certain servers, with certain users and certain attributes.
Last night I ran the first script that could do that. It took me a while and looks like a wreck, I am sure - but it worked!
Now to tackle size.
Thank You

Hi Paul,
looking good so far. Did a little rewrite of what you posted:
Function Disable-Stuff
{
    Param (
        [Parameter(Position = 0, Mandatory = $true)]
        [string]
        $file,

        [Parameter(Position = 1)]
        [ValidateSet('CAS', 'MBX', 'ALL')]
        [string]
        $servertype = "CAS"
    )

    # Collect server lists
    $servers = @()
    switch ($servertype)
    {
        "CAS" { $servers += Get-ClientAccessServer | Select -ExpandProperty name }
        "MBX" { $servers += Get-MailboxServer | Select -ExpandProperty name }
        "ALL"
        {
            $servers += Get-ClientAccessServer | Select -ExpandProperty name
            $servers += Get-MailboxServer | Select -ExpandProperty name

            # Remove duplicate names (just in case)
            $servers = $servers | Select -Unique
        }
        default { }
    }

    # Calculate set of operations per server
    $boxes = ($servers).count
    $content = Get-Content $file
    $split = [Math]::Round(($content.count / $boxes)) + 1

    # Create index counter
    $int = 0

    # Split up task
    Get-Content $file -ReadCount $split | ForEach {
        # Store file content in variable
        $List = $_

        # Select Server who does the doing
        $Server = $servers[$int]

        # Increment Index so the next set of objects uses the next Server
        $int++

        # Do something amazing
        # ... <-- Content goes here
    }
}

Disable-Stuff "c:\job\disable.txt" "CAS"
Notable changes:
Moved the test variables out of the function and added them as parameters
Modified the Parameters a bit:
- $file now is mandatory (the function simply will not run without it)
- The first parameter will be interpreted as the file path
- The second parameter will be interpreted as Servertype
- $Servertype can only be CAS, MBX or ALL. No other values accepted
- $Servertype will be set to CAS unless another servertype is specified
Your if/elseif/else construct has been replaced with a switch (I vastly prefer them, but they do the same thing functionally)
I removed the unnecessary temporary storage variables.
Appended a placeholder scriptblock at the end that shows you how to iterate over each set of items and select a new server each time.
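To give an idea of what could go into that placeholder, here is a minimal sketch that hands each chunk to the selected server as a background job and collects the results at the end. It assumes PowerShell remoting is enabled on the servers; Disable-SomethingPerUser is a hypothetical stand-in for whatever per-user cmdlet you actually run:
$jobs = @()
Get-Content $file -ReadCount $split | ForEach {
    # Same chunk/server selection as in the function above
    $List = $_
    $Server = $servers[$int]
    $int++

    # Run this chunk on the selected server as a background job
    $jobs += Invoke-Command -ComputerName $Server -AsJob -ArgumentList (,[string[]]$List) -ScriptBlock {
        Param ([string[]] $users)
        foreach ($user in $users)
        {
            # Hypothetical per-user action - replace with the real cmdlet
            Disable-SomethingPerUser -Identity $user
        }
    }
}

# Wait for every server to finish and gather the output
$jobs | Wait-Job | Receive-Job
Whether Invoke-Command or a plain Start-Job on the local machine makes more sense depends on where your cmdlets are available; the chunk -> job -> wait/receive pattern stays the same.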
I hope this helps you in your quest to conquer PowerShell :)
Cheers,
Fred
There's no place like 127.0.0.1

Similar Messages

  • 'BBPSC11' error in Monitor SC for one User having multiple positions but on

    Hello,
    'BBPSC11' error in Monitor SC for one User - having multiple positions in the org structure - but having one BP code associated with all positions.
    We have one BP ID associated with multiple positions of the same user - in multiple org structures.
    The org unit is referred to as one Project, and likewise we have multiple projects people have worked on.
    Once the Proj is over we move the Users from one Proj (Org unit) to another Proj, with a new Position created by copying the old one, and associate the old BP code with it.
    With this, when we go to the Monitor SC option - enter the User ID in the Created By field - the old SCs are listed, but we get an error if we click on the Detail icon.
    Error: The Internet Transaction Server could not start the transaction "BBPSC11" because of the following error: Attribute for user contains errors. Inform system admin.
    AD

    Hi,
    Please verify the user with txn bbp_attr_check. It could be that the org relationship of the user changed compared with what was captured on the shopping cart. Also use txn users_gen to repair the user.
    Regards,
    Sanjeev

  • How to identify a user across multiple pages

    Hi,
    I'm building a home banking application and I would like to know how to identify a user across multiple pages.
    I have already taken a look at HttpSession, but I didn't understand it.
    Can someone help me?
    I'm sending the servlet Logon.
    import java.io.*;
    import java.sql.*;
    import java.util.Date;
    import java.util.*;
    import javax.servlet.*;
    import javax.servlet.http.*;
    public class Cons_logon extends HttpServlet
    {
        private Connection conexao = null;
        Login1 login1;

        public void init (ServletConfig cfg) throws ServletException
        {
            super.init(cfg);
            try
            {
                Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
                conexao = DriverManager.getConnection("jdbc:odbc:bank");
            }
            catch (Exception e)
            {
                System.out.println(e.getMessage());
            }
        }

        public void doPost (HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException
        {
            String Suser, Spassword;
            PrintWriter out;
            res.setContentType("text/html");
            out = res.getWriter();
            String opcao = req.getParameter("log");
            // ... (the rest of the method was cut off in the original post)
        }
    }
    Thanks

    I would recommend using the authentication mechanism that's guaranteed by the servlet spec. If you do that, you can just call
    request.getRemoteUser()
    to get the user name across multiple pages.
    If you want to use your own login scheme, you can create a new session object and map it to a user name somewhere in your app. Or you can just put the name of the user on the session. But the preferred way is to use the default authentication scheme defined by the spec.

  • Capture performance metrics across multiple servers

    Hello. I'm still very new to PowerShell, but does anyone know of a good PowerShell v3-4 script that can capture performance metrics across multiple servers, with an emphasis on HPC (high-performance computing), and generate a helpful report, perhaps in HTML or Excel format?
    The closest thing I've found and used is this line of PowerShell:
    http://www.microsoftpro.nl/2013/11/21/powershell-performance-monitor-on-multiple-remote-computers/
    Maybe figure out a way to present that in a better format, such as HTML or Excel.
    Also, can someone suggest some performance metrics to look at from an HPC perspective? For example, if a CPU is running at 100% utilization, figure out which cores are running high, see how many threads are queued waiting for CPU time, etc...

    As far as formatting is concerned,
    ConvertTo-HTML is a basic HTML output format, but you can spice it up as much as you like:
    http://technet.microsoft.com/en-us/library/ff730936.aspx
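    A rough sketch of how those pieces could fit together - assuming the remote machines are reachable, you have rights to read their performance counters, and the server names, counter paths, and output path below are only examples:
    # Example server list and counters - adjust to your environment
    $servers  = 'hpcnode01', 'hpcnode02'
    $counters = '\Processor(_Total)\% Processor Time',
                '\System\Processor Queue Length'

    # Take one sample of each counter from every server and flatten the results
    $samples = Get-Counter -ComputerName $servers -Counter $counters |
        Select-Object -ExpandProperty CounterSamples |
        Select-Object Path, CookedValue, Timestamp

    # Turn the samples into a basic HTML report
    $samples | ConvertTo-Html -Title 'Performance snapshot' | Out-File C:\reports\perf.html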
    Out-GridView is very functional and pretty simple:
    http://powertoe.wordpress.com/2011/09/19/out-gridview-now-has-a-passthru-parameter/
    Here's an example with Excel:
    Excel
    Worksheets Example
    This might be a good reference for HPC; I don't have access to an HPC environment, so I can't offer much advice there.
    http://technet.microsoft.com/en-us/library/ff950195.aspx
    It might be better to keep unrelated questions separate, so a thread doesn't focus on one question and you lose time getting an answer to another.
    I hope this post has helped!

  • Using ATMI and Tuxedo for distributed transactions across multiple DBs

    I am creating the framework for a given application that needs to ensure that data integrity is maintained spanning multiple databases, not necessarily within an instance of WebLogic. In other words, I need to basically have two-phase commit "internet transactions" between a given coordinator and n participants without having any real knowledge of their internal systems.
    Originally I was thinking of using WebLogic, but it appears that I may need to have all my particular data stores registered with my WebLogic instance. This cannot be the case, as I will not have access to that information for the other participating systems.
    I next thought I would write my own TP... ouch. Every time I got through another iteration I kept hitting the same issue of falling into an infinite loop trying to ensure that my coordinator and the set of participants were each able to perform the directed action.
    My next attempt has led me to the world of ATMI. Would ATMI be able to help me here? Granted, I am using Java, so I am assuming that I would have to use CORBA to make the calls, but will ATMI enable me to truly manage and create distributed transactions across multiple databases? Please, any advice at all would be greatly appreciated.
    Thanks
    Chris


  • Best practice for SSH access by a user across multiple Xserves?

    Hello.
    I have 3 Xserves and a Mac Mini server I'm working with and I need SSH access to all these machines. I have given myself access via SSH in Server Admin access settings and since all 4 servers are connected to an OD Master (one of the three Xserves), I'm able to SSH into all 4 machines using my username/password combination.
    What I'm unsure of though is, how do I deal with my home folder when accessing these machines? For example, currently, when I SSH into any of the machines, I get an error saying...
    CFPreferences: user home directory at /99 is unavailable. User domains will be volatile.
    It then asks for my password, which I enter, and then I get the following error...
    Could not chdir to home directory 99: No such file or directory
    And then it just dumps me into the root of the server I'm trying to connect to.
    How should I go about dealing with this? Since I don't have a local home directory on any of these servers, it has nowhere to put me. I tried enabling/using a network home folder, but I end up with the same issue. Since the volume/location designated as my home folder isn't mounted on the servers I'm trying to connect to (and since logging in via SSH doesn't auto-mount the share point like AFP would if I was actually logging into OS X via the GUI), it again says it can't find my home directory and dumps me into the root of the server I've logged in to.
    If anyone could lend some advice on how to properly set this up, it would be much appreciated!
    Thanks,
    Kristin.

    Should logging in via SSH auto-mount the share point?
    Yes, of course, but only if you've set it up that way.
    What you need to do is designate one of the servers as being the repository of home directories. You do this by simply setting up an AFP sharepoint on that server (using Server Admin) and checking the 'enable user home directories' option.
    Then you go to Workgroup Manager and select your account. Under the Home tab you'll see the options for where this user's home directory is. It'll currently say 'None' (indicating a local home directory on each server). Just change this to select the recently-created sharepoint from above.
    Save the account and you're done. When you login each server will recognize that your home directory is stored on a network volume and will automatically mount that home directory for you.

  • How to delete the Background job for Deleted user

    Dear experts
    The user RAMESH was deleted a month ago.
    I don't know what jobs he had created and scheduled.
    Where should I find that particular user's background jobs, and how do I delete them?
    regards
    krishna

    From SM37 you can give the user name and find all jobs scheduled by that user.
    Select all jobs >>> Delete.
    *This requires admin access on jobs.
    Regards,
    Nick Loy

  • How to query user across multiple forest with AD powershell

    Hi Guys
      Our situation is like this: we have two forests, let's say forestA.com and forestB.com, and there are many subdomains in forest A.
      I'd like to write a script to get the AD object information via Get-ADObject -Identity xxxx.
      My account belongs to forestA.com, and the computer I logged on to belongs to forestB.com; A & B have a forest trust.
      Now the problem is, if the object I query belongs to forestB.com, Get-ADObject works fine; however, if the object belongs to forestA.com, I get the error "Get-ADObject: Cannot find an object with identity: 'xxxx' under: 'DC=forestB,DC=com'".
      So how can I have a script that can query users in both forests?

    Prepared this some time ago for a PowerShell Chalk & Talk. Just change the forest names and credentials. Each Active Directory cmdlet you are calling works on the current drive, so to switch between the forests you just need to change the drive / location.
    This is also quite nice for migration scenarios.
    $forests = @{
        'forest1.net'   = (New-Object pscredential('forest1\Administrator', ('Password1' | ConvertTo-SecureString -AsPlainText -Force)))
        'forest2.net'   = (New-Object pscredential('forest2\Administrator', ('Password2' | ConvertTo-SecureString -AsPlainText -Force)))
        'forest3.net'   = (New-Object pscredential('forest3\Administrator', ('Password3' | ConvertTo-SecureString -AsPlainText -Force)))
        'a.forest1.net' = (New-Object pscredential('a\Administrator', ('Password1' | ConvertTo-SecureString -AsPlainText -Force)))
        'b.forest1.net' = (New-Object pscredential('b\Administrator', ('Password1' | ConvertTo-SecureString -AsPlainText -Force)))
    }

    Import-Module -Name ActiveDirectory

    # Create one AD PSDrive per forest, rooted at that forest's default naming context
    $drives = $forests.Keys | ForEach-Object {
        $forestShortName = ($_ -split '\.')[0]
        $forestDN = (Get-ADRootDSE -Server $forestShortName).defaultNamingContext
        New-PSDrive -Name $forestShortName -Root $forestDN -PSProvider ActiveDirectory -Credential $forests.$_ -Server $forestShortName
    }

    # Switch to each drive in turn and run the query in that forest
    $result = $drives | ForEach-Object {
        Set-Location -Path "$($_):"
        Get-ADUser -Identity administrator
    }

    $drives | Remove-PSDrive -Force
    $result
    -Raimund
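    A lighter-weight alternative, if you only need an occasional lookup rather than persistent drives: the -Server parameter on the AD cmdlets can be pointed straight at a domain controller or global catalog in the other forest. A minimal sketch, assuming forestA.com resolves from the querying machine and the account below (which is just an example) has read access there:
    # Example credential for the other forest
    $cred = Get-Credential 'forestA\someadmin'

    # Query a domain controller in the other forest directly...
    Get-ADUser -Identity 'xxxx' -Server 'forestA.com' -Credential $cred

    # ...or ask a global catalog (port 3268) for the partial attribute set
    Get-ADObject -Filter "samAccountName -eq 'xxxx'" -Server 'forestA.com:3268' -Credential $cred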

  • Spreading users across multiple AP's

    Hi
    I have a requirement to provide wireless to over 250 people in one big arena. I am thinking of using an LWAPP install, but was wondering how I will force users to associate evenly across all access points. I want to avoid the situation whereby 30 associate with AP1 and 5 associate with AP2.
    Any help would be appreciated

    Hi Martin,
    Hope everything's fine at your end.
    Typically AP1200s or LWAPP APs manage load balancing internally on their own using the Auto-RF features, at least if you are using the 4000 series controllers (I don't know about others). So the Auto-RF features should load balance the clients equally across all APs; if that doesn't happen for some reason, you may change the configuration to do that.
    You may create separate WLANs to be broadcast through different LWAPPs with separate SSIDs.
    For example,
    AP1200-1 broadcasts wlan 1 connecting around 35 clients
    AP1200-2 broadcasts wlan 2 connecting another set of 35 or more.. and so on.
    AP1200-3 broadcasts wlan 1 again but is at a distance of say more than 100 feet or so connecting another set of 35 or more.. and so on.
    Please feel free to contact me if you need any more assistance.
    Thanks & Regards,
    Karthik Narasimhan

  • Using JDBC Persistent Manager for Tomcat across multiple Servers

    Sorry if I word this question incorrectly, but I hope you can get the gist of it. Basically, I have a web application that will be running on 2 different machines. When too many users are on one machine, they are switched over to another machine (load balancing). If I use the JDBC (or file) Persistence Manager to store sessions, what happens when a user is switched between machines? If both machines use the same settings for the Persistence Manager, when a user has just been moved to a new machine, will it be able to load the session from the DB (since it should have its sessionID)? If you cannot do this, is there a way to do it?
    Thanks!

    Session ids are unique and cannot be transferred between different servers. Wise web servers support load balancing and should be able to transfer/store the session without any special code on your part. If you are still keen to write the transfer code yourself, you have to:
    1) Store not only the session id, but also all the content of the session (including some virtual (your own) session id)
    2) Transfer (i.e. create a copy of) the stored session content to the other machine
    Paul

  • Managing Users Across Numerous Servers

    Hey all, got a question.
    I have 6 servers that my company uses to host all of the media we use on a daily basis. We're tightening security around here, so I'm setting them up to allow each of our 80 users to admin their own passwords. Seems simple, yeah?
    Well, I first tried to set up an Open Directory server to do the user and password managing. It is/was a 10.5.2 Server. All of the servers pointing to it are 10.4.11. Worked okay when I first implemented it, but then crashed and burned horribly. Got some really cryptic errors on the 10.5 side, and it basically stopped working. Because we can't afford ANY downtime at my company, I scrapped it, and am doing it on a server level.
    I have the first server up and working. Everyone has a login, and everyone can admin their own passwords. So, this leaves me with five more servers to set up.
    I don't really want everyone to have to go through the same steps to change their passwords on every one of those.
    I know that you can export user lists from the WGM, but it doesn't retain passwords.
    Any thoughts on how I should proceed?
    Thanks in advance!

    The "Password Server" you are looking for is there in the Open Directory Master.
    Set up the user accounts, including user-changeable passwords on the Open Directory Master, and Mac OS X Server software will do all the replication and updating to all the other Open Directory Replicas. You only need enter the User Info once, on the Open Directory Master, and it will be quickly and automatically replicated to all the Open Directory Replicas. The users can change their passwords and any other attributes at will, and all User info on all Servers are updated quickly. (With six servers, you can set it to be essentially instantly.)
    For this to work as expected, the User Accounts should be created with Workgroup Manager in a Network Accessible Shared Directory that has a Network Mount record. Then the Users are Network Users, rather than Local Users, and can log in from any Server or Workstation and use the resources their Username and Group allows them.
    Six servers could easily support login (but probably not the file volumes you are serving) for over a thousand Users. There is no way a person could keep that much info updated if the account info had to be manually replicated. For your network, you would be using features developed for Server networks with thousands of Users, to service your handful of users.
    This Open Directory Master/Replica setting only refers to Open Directory info. There is no need for any thing to be manually replicated on each Server. They can and should have their own unique information in other areas.
    Message was edited by: Grant Bennet-Alder

  • How do I search for common values across multiple columns?

    I am coordinating a schedule with 5 people across hundreds of dates, and have columns A-E filled with many rows of dates. How can I make a new column that displays all the dates (values) that each person (column) has in common with all the others?
    Is there a simple formula for this?
    thanks!

    Scarampella,
    A second table can be used to find your matching dates.
    Here's an example:
    The formula in Matching Dates is:
    =IF(ISERROR(MATCH(A,Table 1 :: A, 0)+MATCH(A,Table 1 :: B, 0)+MATCH(A,Table 1 :: C, 0)+MATCH(A,Table 1 :: D, 0)+MATCH(A,Table 1 :: E, 0)), "", A)
    Basically, I look for matches in each person's list of dates, and if any fail to produce a match with the date being examined, the result is a miss, and if all match, it's a hit. You can sort the result to get a short list of matches without spaces.
    Regards,
    Jerry

  • Searching for one song across multiple playlists in itunes

    I have many playlists that I've created in itunes. Some of them include the same songs. Is there a way or a particular third-party software that I can employ to search for a particular tune amongst all my playlists?
    For example, the song "Feelin' Alright" comes up in at least 4 of my 35 or so playlists, though I'm not sure which ones off hand. Is there a way I can keep track of "Feelin' Alright"; that is, do a search and have itunes give me those results (how many times, which playlists, which position in each playlist, etc)? Any help would be appreciated. Thanks.

    You might already know this, but if you highlight a song in iTunes and then right-click, a window will appear with options. Selecting the Show In Playlist option will show you the names of the playlists in which the song appears ...

  • Server load balancing for application access using multiple servers

    1. What are the methods supported by Cisco switches for load balancing?
    2. I want users to access one particular IP from different locations, but physically have a few servers which handle the application and data.

    Well, some servers allow you to install routing protocols on them. You could OSPF some links together.
    Or you could use NLB if it is a Microsoft server. This uses a heartbeat network, a virtual MAC, and an IP address bound to the vMAC.
    You could use NIC teaming. Broadcom NICs on Dell servers allow you to configure them for load balancing, failover, and a few other options.
    Or, if the servers are mirrored using MSCS or something similar (i.e. configured the same but independent), you could just load balance using DNS.
    Hope this helps. Just some ideas quickly off the top of my head.

  • SFSB Instance Sharing across multiple servers/SFSB failover practices

    Hello:
    My question is in two parts. I have spent a great deal of time searching the forums without a satisfactory answer, so I thought I'd post my question directly.
    1) I have two clients that can potentially talk to two different servers, but they both need to interact with the same instance of a SFSB. For example, the first client calls the SFSB and causes it to save some state in its instance variables. The second client connects to a different server (because of a "network dispatcher" load-balancing architecture), but needs to use the same instance of the SFSB that the first client initialized. The two clients will not always talk to different servers, but the possibility exists that they might.
    2) What is the proper design pattern for "fail-over" for a SFSB. For example, a client establishes a session, tickles a SFSB and causes its instance variables to contain state, and then WHAMO the application server crashes or becomes unavailable for some reason. We have in place a mechanism to reroute further client request to a second application server, but currently it is a problem because even though the failover is transparent to the client, the backup server creates a new instance of the SFSB and therefore it has none of the previous state information.
    I'm thinking that the answer to one of these questions will be the answer to the other.
    A little bit of background: this is not a theoretical application. We are building 250+ cars per day, ramping up quickly to 500-650 cars per day. I'm hoping I can solve this problem from an architectural standpoint, without having to modify each individual bean, because we have on the order of 200-300 SFSBs that would have to be changed--not a pretty thing in a production critical application. We have two AIX servers, but one of them is currently just a hot standby because we cannot run both of them at the same time because of problem 1) from above. We'd like to be able to run both servers at the same time for load-balancing purposes. Furthermore, if one box fails then all SFSB data will be lost because of the problem mentioned in 2) above.
    (As a point of clarification, and only because I don't know if this affects any possible suggestions: the clients do not use remote references to the EJBs. They simply pass "data containers" via HTTP to the server, and this data is passed to the various EJBs and returned back to the client via HTTP Response and in some cases TCP/IP. In either case, we do not use "remote object" references in the most typical sense. The servlet maintains the references to the EJBs.)
    I've read things about "session clustering" but have not pinned down the subject. I have the "Core J2EE Patterns" book on order.
    Any suggestions or pointers to reading materials would be greatly, greatly appreciated. I also welcome the "what you should have done" variety suggestions.
    Thanks!
    Regards,
    Doug Wilkerson

    Doug,
    Here are my thoughts, I hope they help.
    I don't think there is a possibility to have two different clients (different sessions) talk to one and the same SFSB. A SFSB is specific to a user's session and cannot be shared.
    The way I would tackle this problem is probably by using SLSB which might access either an entity bean or the DB directly. This way, you might cause DB overhead, but you can share the data between clustered servers and the users will work with the correct data.
    About the second question: I don't really know of a pattern to provide fail-over for SFSBs. In my opinion that is the weakest side of SFSBs (besides all the load balancing that needs to take place).
    Hope this helps.
