Exchange: Database Leveling Redux

Some time ago I tackled the challenge of constructing a variant of the bin packing algorithm for leveling out Exchange database sizes with the fewest mailbox migrations necessary. Since then, I’ve been approached by a few people in dreadfully large environments looking for help with errors and compatibility issues around the script I released. I’ve finally circled back to this script to do it some justice.

This was one of those scripts I initially put together as an intellectual exercise so I could stop thinking about it. I worked rather hard in late-night hours logically constructing the process of what needed to be done for the algorithm. Once I got working results and a decent write-up done, I breathed a sigh of relief that I could be free of the mental obsession and didn’t even look back at the quality of the script. That simultaneously made this one of the works I’ve been most proud of and most ashamed of.

There were a number of issues with the script that I’ve either always known about or have been made aware of by others over the last year. Some of the notable ones are:

  1. Inability to run in Exchange 2010 environments
  2. Needing to be run directly in an Exchange session (thus possibly over-utilizing resources)
  3. No calculation of disconnected mailboxes in database size
  4. Overall script complexity making it difficult to approach for many who might want to use it in their environment
  5. Some environments exhibited strange errors while processing mailboxes/databases

As with most of my work, I made a mental note to come back to it and re-release it with some fixes should no other kind-hearted PowerShell scripter decide to do so themselves. Of course, no one has, so here I am working on this thing yet again 🙂

To address the complexity (issue #4) I’ve wrapped the entire script with some parameters. You can still fine-tune variables directly in the script, but to keep things light and easy (and enforce a few usage rules) I’m only going to have two flags, -SaveData and -LoadData. Conveniently, this will address points one and two as well. I’ve decided to divorce the information gathering portion from the processing portion of the script. I believe the issues with running the script in an Exchange 2010 environment are largely due to PowerShell version differences.

So for those running into pipeline errors and other such nonsense when running this script, please try running this updated script with the -SaveData flag on your server, copy the ExchangeData.csv file over to your workstation (into the same folder as the script), then run the same script with the -LoadData flag. Optionally, you can use the -Verbose flag to see some more details fly by on the screen.

For issue #5 I was not able to zero in on a specific cause, but in performing a code review I found I was doing a few things that could lead to problems in certain environments. One of them was not properly escaping strings used for regular expression matching. So I was doing this:
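The original snippet didn’t survive here, but a minimal sketch of the anti-pattern described looks roughly like this ($mailboxes and the sample data are hypothetical stand-ins for the real Get-Mailbox results):

```powershell
# Hypothetical sample data; in the real script these come from Get-Mailbox
$IGNORED_MAILBOXES = @('Smith, John (Temp)')
$mailboxes = @(
    [pscustomobject]@{ Name = 'Smith, John (Temp)'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'Jones, Amy';         Database = 'DB02' }
)

# The problem: names go straight into the pattern, so characters like ( and )
# are interpreted as regex metacharacters instead of literal text
$ignoredRegex = $IGNORED_MAILBOXES -join '|'
$filtered = $mailboxes | Where-Object { $_.Name -notmatch $ignoredRegex }
```

With a display name containing parentheses, the pattern never matches the literal name, so the mailbox that should be ignored slips through (or, with other metacharacters, the regex fails outright).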

When I should have been doing this:
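A sketch of the corrected form, escaping each name before building the alternation (same hypothetical sample data):

```powershell
$IGNORED_MAILBOXES = @('Smith, John (Temp)')
$mailboxes = @(
    [pscustomobject]@{ Name = 'Smith, John (Temp)'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'Jones, Amy';         Database = 'DB02' }
)

# Escape each name so characters like ( ) . $ are matched literally
$ignoredRegex = ($IGNORED_MAILBOXES | ForEach-Object { [regex]::Escape($_) }) -join '|'
$filtered = $mailboxes | Where-Object { $_.Name -notmatch $ignoredRegex }
```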

Oh, and if you pay close attention to that last line you can find a pretty big mistake. What if there are no ignored databases? If $IGNORED_MAILBOXES = @() then this line will always match! So to be correct we need to build a regular expression that is anchored as well:
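A sketch of the anchored version, again with hypothetical sample data. With an empty $IGNORED_MAILBOXES the pattern becomes '^()$', which matches nothing but an empty string, instead of '' which matches every name:

```powershell
$IGNORED_MAILBOXES = @()   # nothing ignored -- the edge case that broke the old line
$mailboxes = @(
    [pscustomobject]@{ Name = 'Smith, John (Temp)'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'Jones, Amy';         Database = 'DB02' }
)

# Anchoring the alternation makes the empty-list case safe
$ignoredRegex = '^(' + (($IGNORED_MAILBOXES | ForEach-Object { [regex]::Escape($_) }) -join '|') + ')$'
$filtered = $mailboxes | Where-Object { $_.Name -notmatch $ignoredRegex }
```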

I also found some bizarre constructs I put together that I’d normally never release. For instance here I try to get all the unique databases but first select the property of the objects then try to filter them:
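The original line is gone, but the cumbersome shape described was roughly this (mailbox objects and the $ignoredRegex value are hypothetical):

```powershell
$mailboxes = @(
    [pscustomobject]@{ Name = 'A'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'B'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'C'; Database = 'DB02' }
)
$ignoredRegex = '^(DB03)$'

# Clunky: project the property out first, then filter the bare strings,
# then de-duplicate at the very end
$DBSet = $mailboxes | Select-Object -ExpandProperty Database |
    Where-Object { $_ -notmatch $ignoredRegex } |
    Select-Object -Unique
```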

But it makes more sense and is slightly less cumbersome to filter first and then get the property. Actually, how about I filter at the mailbox information gathering portion and reduce that long line down to this instead?
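Something like the following one-liner, assuming the ignore filtering already happened while gathering $mailboxes (sample data is hypothetical):

```powershell
# Ignored mailboxes were already filtered out when $mailboxes was gathered
$mailboxes = @(
    [pscustomobject]@{ Name = 'A'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'B'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'C'; Database = 'DB02' }
)

# Member enumeration ($mailboxes.Database) flattens the property across the
# whole collection -- but only on PowerShell 3.0 and later
$DBSet = @($mailboxes.Database | Select-Object -Unique)
```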

That works on servers and workstations that default to newer versions of PowerShell, but if you run it on a Windows 2008 R2 server you will likely find that $DBSet ends up with a $null value. So I finally landed on using the following instead (slightly less ugly than the original, but irritatingly long-winded compared to the last line):
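A sketch of the PowerShell 2.0-friendly form, which pipes through Select-Object instead of relying on member enumeration (sample data again hypothetical):

```powershell
$mailboxes = @(
    [pscustomobject]@{ Name = 'A'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'B'; Database = 'DB01' },
    [pscustomobject]@{ Name = 'C'; Database = 'DB02' }
)

# -ExpandProperty works the same on PowerShell 2.0, where $mailboxes.Database
# would quietly return $null
$DBSet = @($mailboxes | Select-Object -ExpandProperty Database | Select-Object -Unique)
```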

These are just a few of the fixes I made in my code review. I also improved processing speeds a bit by reducing the overall number of mailboxes in the total processing set, improved output to page through long results, and cleaned out unused code among other things.


I’ve uploaded the script to the Microsoft TechNet Gallery and have uploaded a copy to my GitHub repo as well.


If you really want to know the general idea and logic of the algorithm behind the script, read my prior article on the matter. I go as far as to use some equations and even some diagrams in the write-up, and I consider it all very well thought out from an academic standpoint (at least no one has said otherwise).

I hope to get some feedback around this from you if you do end up using it. People reaching out to me with suggestions and stories of how they use my work in their environments is part of what keeps me releasing new and useful tools.

Comments (8)

  1. Kurt P (12:43 PM, 08/16/2017)

    Thank you very much for your script. It worked great. My environment is nowhere near as big as others that have commented here, only about 4500 mailboxes. I started with something similar to this:
    MBDB01 600000MB
    MBDB02 500000MB
    MBDB03 150000MB
    MBDB04 125000MB

    And we were getting alerts about database 1 filling the drive. Your script had me move approximately 900 mailboxes and now they look like this and the alerts have gone away:
    MBDB01 346000MB
    MBDB02 350000MB
    MBDB03 331000MB
    MBDB04 343000MB

    I did make a few modifications to the script because we had duplicate display names in our environment, so I changed the script to use MailboxGuid instead. I also ran into some issues with mailbox quotas when moving certain mailboxes. I modified the script to check whether a mailbox was set to use the default quota and, if so, whether its size was greater than or equal to that quota. If it was, the script added a Set-Mailbox command to the ResultsFile to increase the quota by 10MB before the move request. Thanks again for the script.

    • Zachary Loeber (8:30 PM, 08/17/2017)

      Wow! Thanks for the success story. I’m happy to hear two things: 1. that you got some value from the script, and 2. that you took it for what it was meant to be, groundwork to make your own, and customized it to suit your environment and needs. Good job and thanks again!

  2. Craig Thomas (10:36 AM, 11/10/2016)

    Thanks for the great script. I am running the script with the -loaddata switch. Does -loaddata create a file to run with the moves OR does it start the moves?

    • Zachary Loeber (12:29 PM, 11/23/2016)

      This script will not start any moves. All it does is create a script you can run to perform the mailbox moves. LoadData is really meant to be run from your workstation so as not to bog down your server with all the processing it can take to generate the output (I recommend using -Verbose as well so you know it is actually doing something).

  3. Eric W (4:32 PM, 04/07/2015)

    Thanks for the updates! I am one of those readers with a big environment (100k mailboxes across 265 dbs). I have each of my customers with over 250 mailboxes on their own database(s). For example, I have a customer with approximately 3,000 mailboxes spread across 12 databases. So, to use your script, I am populating the ‘Ignored_Databases’ variable with an enormous list of every other db in my environment.

    Get-MailboxDatabase | ?{$_.Name -notlike "NWSALES*"} | %{$str += """$($_.Name)"","}
    $str = $str.TrimEnd(",") # remove extra trailing comma before pasting into the script

    I ran your script against the above customer and got the following results.


    Original database size information:

    Database Size
    ——– —-
    NWSALES-DB01 475342
    NWSALES-DB02 509082
    NWSALES-DB03 516723
    NWSALES-DB04 321023
    NWSALES-DB05 562332
    NWSALES-DB06 609833
    NWSALES-DB07 272923
    NWSALES-DB09 399224
    NWSALES-DB10 223076
    NWSALES-DB11 184377
    NWSALES-DB12 194846

    Future database size information:

    Database Size
    ——– —-
    NWSALES-DB01 388070
    NWSALES-DB02 388069
    NWSALES-DB03 388069
    NWSALES-DB04 388069
    NWSALES-DB05 388070
    NWSALES-DB06 388071
    NWSALES-DB07 388069
    NWSALES-DB09 388070
    NWSALES-DB10 388072
    NWSALES-DB11 388077
    NWSALES-DB12 388075

    Total Moves Required = 950


    That’s excellent, because DB06 is currently reporting that it’s low on log space and we’ll need to move some off there soon. This will free up a lot of white space on that db. 🙂 I am not sure if this client is in cached mode, so I won’t kick off moves until after hours tonight.

    • Zachary Loeber (9:41 PM, 04/07/2015)

      Excellent! I’m stoked to see some good results from the script. If you are going to move all those mailboxes then be certain you clear out any of the soft-deleted mailbox copies that get left behind after the moves.

      • Eric W (2:01 PM, 04/08/2015)

        Pass 1: 172 moves still need to complete tonight after the backup runs and commits the log files.

        Running a mailbox report, I see my DBs are currently at:

        NWSALES-DB01 403567
        NWSALES-DB02 384504
        NWSALES-DB03 471327
        NWSALES-DB04 359593
        NWSALES-DB05 538828
        NWSALES-DB06 535706
        NWSALES-DB07 342349
        NWSALES-DB09 390372
        NWSALES-DB10 315308
        NWSALES-DB11 246327
        NWSALES-DB12 250973
