SBS 2003 Hardware Upgrade

In May of 2008, I wrote the article SBS 2003 Hardware Migration/Upgrade. Since then, I’ve been asked many times about my process for doing an SBS 2003 hardware upgrade. In general, I just have to say “I follow the steps”.

As I noted in my original article, if the Microsoft white paper isn’t detailed enough for you, I recommend SBS MVP Jeff Middleton’s SBS Migration Tools.

However, this weekend I had the opportunity to upgrade the hardware for another client of mine, and I wrote down each step as I did it. This list of steps may encourage you to NOT do this. 🙂 There are many opportunities for error. This is a simple list of steps – if you don’t know what the shorthand means – you probably shouldn’t be doing it! 😛 Some of these one-line steps can consume quite a bit of time (e.g., “move wss” – the process is an entire white paper all on its own).

I execute this process in three phases. Phase I is basically “install the software on the new hardware.” Phase II is “configure the software and prepare for migration.” Phase III is “complete the migration.”

Phase I

-1] basic SBS 2003 RTM install
0] Join to AD domain
1] Dcpromo
2] Install DNS and DHCP
3] Change to AD integrated DNS
4] Update NIC(s)
5] move fsmo roles (see the ntdsutil sketch below this list)
6] update setup.sdb per http://theessentialexchange.com/blogs/michael/archive/2008/05/18/sbs-2003-hardware-migration-upgrade.aspx
7] Complete the SBS 2003 install
8] install Server 2003 sp1 (WindowsServer2003-KB889101-SP1-x86-ENU.exe)
9] Install kb 930045 (WindowsServer2003-KB930045-v5-x86-ENU.exe)
10] install WSS 2.0 sp1 (WSS2003SP1-kb841876-fullfile-enu.exe)
11] install Exchange 2003 sp1 (E3SP1ENG.exe)
12] install Windows XP sp2 for client deployment (SBS2003-KB891193-X86-ENU.EXE)
13] install SBS 2003 sp1 (SBS2003-KB885918-SP1-X86-ENU.EXE)
14] install Server 2003 sp2 (WindowsServer2003-KB914961-SP2-x86-ENU.exe)
15] install Exchange 2003 sp2 (E3SP2ENG.EXE)
16] Install kb 943494 (WindowsServer2003-KB943494-v4-x86-ENU.exe)
17] Install kb 930045 (WindowsServer2003-KB930045-v5-x86-ENU.exe)
18] install Resource Kit Tools (rktools.exe)
19] install Server Support Tools (suptools.msi)
20] install kb 943545 (WindowsServer2003-KB943545-x86-ENU.exe)
21] install Windows-Update/Microsoft-Update patches
22] install OpenManage/Server management tools
23] install FileserverTweaks.reg
24] move fsmo back (only domain fsmo, leave forest alone)
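To give a flavor of the work hiding behind these one-liners, here is a sketch of step 5, transferring the FSMO roles with ntdsutil (NEWSERVER is a placeholder for the name of your new server; this is one way to do it, not the only way):

	C:\> ntdsutil
	ntdsutil: roles
	fsmo maintenance: connections
	server connections: connect to server NEWSERVER
	server connections: quit
	fsmo maintenance: transfer pdc
	fsmo maintenance: transfer rid master
	fsmo maintenance: transfer infrastructure master
	fsmo maintenance: transfer domain naming master
	fsmo maintenance: transfer schema master
	fsmo maintenance: quit
	ntdsutil: quit
	C:\> netdom query fsmo

The netdom query at the end (netdom is part of the Support Tools) verifies that all five roles now point at the new server. Step 24 transfers only pdc, rid master, and infrastructure master back.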

Phase II

1] configure remote access (i.e., VPN)
2] activate server
3] add client licenses
4] configure SBS monitoring
5] configure exchange server
Copy Database Size limit – http://technet.microsoft.com/en-us/library/aa998066.aspx
Verify server property configurations
Change Offline Address List server for all Offline Address Lists
Update Recipient Update Service Config
Copy SMTP Connector / Default SMTP Virtual Server
Replicate public folders
6] install AV Server
7] install AV Client
8] move ssl certs
9] verify RPC/HTTPs config
10] set up new backup
11] move printers
12] move shares
13] initial file copy

On old server:
1] Install kb 943494 (WindowsServer2003-KB943494-v4-x86-ENU.exe)
2] Install kb 930045 (WindowsServer2003-KB930045-v5-x86-ENU.exe)
3] install kb 943545 (WindowsServer2003-KB943545-x86-ENU.exe)

Phase III

1] move fsmo
2] move mailboxes
3] move DHCP
4] move a/v clients
5] move WSS/SharePoint
6] move websites
7] move wsus
8] check/move Scheduled Tasks
9] move files
10] cut-over

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

Exchange 2007 and Windows 2008: Offline Exchange Backup

In my article Getting a List of Stores in a PowerShell Script you learned how to obtain a list of all the files involved for the Exchange database stores on a particular Exchange server. In the preceding article, Getting a List of Storage Groups in a PowerShell Script, you learned how to obtain a list of all the files unique to the Exchange storage groups on a particular Exchange server.

As a part of both of those articles, you learned how to create a list of the volumes used by the files in the storage groups and in the database stores.

Now that we have that information, what can we do with it?

Easy! We can generate a script that can create an offline backup of our Exchange databases. In future articles, you’ll learn how to turn this offline backup into an online backup, using VSS (the Volume Shadow Copy Service).

As a quick reminder, the following global objects are important and were introduced in the earlier articles of this series:

$volumes – a hash array containing the disk volumes used by the Exchange storage groups and database stores

$pathPattern – a hash array containing a list of all the regular-expression patterns required to back up all the files involved for all Exchange storage groups and database stores

getStores – a function populating $volumes and $pathPattern for the files used by Exchange database stores

getStorageGroups – a function populating $volumes and $pathPattern for the files used by Exchange storage groups

validateArrays – a function verifying that the $volumes and $pathPattern arrays are not empty; the function returns zero if the main program should proceed.

A utility that we have not previously discussed is robocopy. Robocopy was introduced as a part of the Windows 2000 Server Resource Kit. Among other features, it copies large files much more quickly than the copy and xcopy commands in cmd.exe. Robocopy is a standard utility in Windows Vista and Windows Server 2008.
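For example, a single copy of the form the generated script emits looks like this (the paths shown are purely illustrative):

    robocopy "C:\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group" ^
             "C:\Backups\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group" ^
             "E00.CHK" /copyall /ZB >nul

The /copyall switch copies the file data along with attributes, timestamps, security, owner, and auditing information; /ZB uses restartable mode, falling back to backup mode if access is denied.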

Two additional global variables need to be introduced:

$destination – the directory below which backups will be stored; in the case of Exchange backups, the directory structure is reproduced identically. For example, if $destination is “C:\Backups” and the file “C:\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group\E00.CHK” is a file to be backed up, then the destination file name will be “C:\Backups\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group\E00.CHK”.

$nl – a DOS newline (a carriage return followed by a line feed)

So, after all the preparation we’ve already done, an offline backup script actually turns out to be quite simple. The PowerShell script below will generate a DOS script to be executed by cmd.exe. Robocopy will copy all relevant files to the backup location specified by $destination. If all copies succeed then the script succeeds. If any copy fails, the script aborts.

A couple of things to be careful of – this script actually executes the backup! To do that, in an offline mode, the Microsoft Exchange Information Store service must be stopped. When that service is stopped, Exchange is basically down. So…don’t run this on your production system without commenting out the line that starts with cmd.exe (unless you are actually doing the offline backup).

Secondly, offline backups do not purge transaction logfiles. We’ll need to learn how to do an online backup before we can make that happen.

Finally, the out-file cmdlet uses an unusual parameter: “-encoding ascii”. This is because cmd.exe does not understand Unicode files (which is the default for out-file). Something to remember for your own scripts!

  
    $destination = "C:\backups"

    $nl = "`r`n"

    function buildRobocopyString($collection)
    {

        $str = ""

        foreach ($filepath in $collection)
        {
            $file = split-path $filepath -leaf
            $path = split-path $filepath -parent
            #
            # the destination path is the source path appended to
            # the backup folder location.
            #
            $destpath = join-path $destination $path.SubString(3, $path.Length - 3)

            $str += "echo Copying " + $file + "..." + $nl
            $str += "robocopy " + '"' + $path + '" "' + $destpath + 
                '" "' + $file + '" /copyall /ZB >nul' + $nl
            $str += "if not errorlevel 0 goto :abort" + $nl
        }

        return $str
    }

    function buildCMD
    {
        $script = "@echo off" + $nl

        $script += 'net stop "Microsoft Exchange Information Store" /y' + $nl

        $script += buildRobocopyString $pathPattern.keys
        $script += $nl
        $script += 'net start "Microsoft Exchange Information Store"' + $nl
        $script += "exit 0" + $nl
        $script += ":abort" + $nl
        $script += "exit 1" + $nl

        $script | out-file (join-path (gc env:temp) "offline-backup.cmd") -encoding ascii
        cmd.exe /c (join-path (gc env:temp) "offline-backup.cmd")
    }

    #
    # Main
    #

    if ((getStorageGroups) -eq 0)
    {
        getStores
        if ((validateArrays) -eq 0)
        {
            buildCMD
        }
    }

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

Getting a List of Stores in a PowerShell Script

In my last post, Getting a List of Storage Groups in a PowerShell Script, you saw how to use the information from the Get-StorageGroup cmdlet to discover the particular disk volumes used by a storage group and to build a list of the files that were used by the storage group.

Now, that list of files did not contain the databases contained within the storage group. Instead, it simply contained the system file (Exx.CHK) for each storage group and the log files (Exx*.LOG) for each storage group. This is because the Get-StorageGroup cmdlet does not return that information. Instead, we use the Get-MailboxDatabase and Get-PublicFolderDatabase cmdlets to find out the names of our databases. It’s unfortunate, but there is not a single Get-ExchangeDatabase cmdlet that combines the functionality of both.

Note that in Exchange Server 2007, each database consists of a single file, with an extension of EDB. In Exchange Server 2003, there was also a second file per database, called the streaming file, with an extension of STM.

To remind you of the global variables being used:

  • $computername is a string that contains the name of the computer on which the script is being executed.
  • $volumes is a hash array that contains a list of all the disk volumes so far detected.
  • $pathpattern is a hash array that contains a list of all the fully-qualified paths to all files so far discovered.

A store is always a single file. However, it’s important to remember that the file and directory names used for stores and storage groups in Exchange Server may contain spaces and parentheses. That can lead to a requirement for special handling (such as quoting) of file names.

When you interrogate the list of stores on a particular server, any stores present in the Recovery Storage Group are also listed; these should be ignored.

Each cmdlet we use will return a collection. Get-MailboxDatabase will return a collection of all mailbox databases present on the given server. This collection could be empty even if storage groups are present. Get-PublicFolderDatabase will return a collection of all public folder databases present on the given server. This collection could also be empty (and, in fact, is more likely to be empty). The PowerShell code needs to be prepared to handle those eventualities.
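One simple way to be prepared (a minimal sketch, not something the earlier articles showed) is to force each result into an array with the @( ) operator, so that an empty result becomes a zero-length array that can safely be counted and enumerated:

	$colMB = @(get-MailboxDatabase -server $computername)
	write-host ($colMB.Count.ToString() + " mailbox databases found on $computername")

The getStores function below uses this wrapper when it fetches each collection.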

In the PowerShell code below, the getStores function obtains all the stores on the server and adds the store filenames to the $pathpattern array and adds the disk volumes used by the stores to the $volumes array. Following getStores is the validateArrays function. If either the $volumes or the $pathpattern array is empty, it returns a value of 1 and displays a message on the PowerShell host. If both have contents, validateArrays returns a value of 0 and displays the contents of those arrays.

Without further ado:

function getStores
{
	## locate the databases, both mailbox and public folder

	## the @() wrapper guarantees a zero-length array (rather than
	## $null) when no databases exist

	$colMB = @(get-MailboxDatabase -server $computername)
	$colPF = @(get-PublicFolderDatabase -server $computername)

	## parse them for volumes too

	foreach ($mdb in $colMB)
	{
		if ($mdb.Recovery)
		{
			write-host ("Skipping RECOVERY MDB " + $mdb.Name)
			continue
		}
		write-host ($mdb.Name + "`t " + $mdb.Guid)
		write-host ("`t" + $mdb.EdbFilePath)
		write-host " "

		$pathPattern.($mdb.EdbFilePath) = 1

		$vol = $mdb.EdbFilePath.ToString().SubString(0, 1)
		$volumes.$vol += 1
	}

	foreach ($mdb in $colPF)
	{
		## a PF db can never be in a recovery storage group
		## which is why the Recovery check isn't done here

		write-host ($mdb.Name + "`t " + $mdb.Guid)
		write-host ("`t" + $mdb.EdbFilePath)
		write-host " "

		$pathPattern.($mdb.EdbFilePath) = 1

		$vol = $mdb.EdbFilePath.ToString().SubString(0, 1)
		$volumes.$vol += 1
	}

	return
}

function validateArrays
{
	$drives = $volumes.keys
	if ($drives.Count -lt 1)
	{
		write-host "No disk volumes were found. Aborting."
		return 1
	}

	write-host ("There were " + $drives.Count.ToString() + " disk volumes for Exchange server $computername. They are:")
	foreach ($drive in $drives)
	{
		write-host "`t$drive"
	}

	write-host " "

	$paths = $pathPattern.keys
	if ($paths.Count -lt 1)
	{
		write-host "No paths were found. Aborting."
		return 1
	}

	write-host ("There are " + $pathPattern.Count.ToString() + " directories to be backed up. They are:")
	foreach ($directory in $pathPattern.keys)
	{
		write-host "`t$directory"
	}
	write-host " "

	return 0
}

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

Getting a List of Storage Groups in a PowerShell Script

In Getting Our Computername in a PowerShell Script, you learned how to do just that – store the running computer name (the short NetBIOS name) into a PowerShell variable. At the same time, you learned WHY and HOW it worked. From this point forward, I’ll presume that you’ve executed this particular PowerShell statement:

$global:computername = (gc env:computername)

Note that this gives the $computername variable global scope. It can be read from any function as just $computername (as long as no variable with the same name exists in a closer scope, such as function or local scope), but updating the global value requires the “global:” specifier.
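A quick illustration of that scoping rule (the function name is made up purely for demonstration):

	$global:computername = (gc env:computername)

	function showScope
	{
		## reads fall through to the global scope
		write-host "inside the function: $computername"

		## without a scope specifier, assignment creates a new local variable
		$computername = "LOCAL"
		write-host "local copy: $computername"
	}

	showScope

	## the global value is untouched by the function's local assignment
	write-host "after the function: $computername"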

Thankfully, when we install the Exchange Tools on a workstation or server, one of the cmdlets we get is get-storagegroup. To see a list of all the storage groups on all the (Exchange 2007+) servers in your Exchange organization, you just enter get-storagegroup at the prompt in an Exchange Management Shell (EMS).

So, are we done? Nope.

While that list is probably all we need if we are working from a command prompt, in a script we are probably interested in more than just the displayed information. For example, we may want to know:

  • where the system files for the storage groups are placed
  • where the log files for the storage groups are placed
  • the log file prefix for the storage groups
  • a list of all the disk volumes used by the storage groups

Using get-storagegroup plus a feature of PowerShell makes all of this pretty easy. The feature we will use is called an associative array – also known as a hash array, for those of you with a Perl background. An associative array maps keys to values. Unlike a normal array, which is indexed by integers starting at zero, an associative array can be indexed by almost anything – a string, a regular expression, an integer – anything you may want to use. Conceptually, it consists of two parallel collections – the keys and the values – and the contents of each may be any object.

Note: in Perl, associative arrays aren’t quite as flexible. In PowerShell, you can literally have an object as a key and an object as a value. This allows you the flexibility of storing arbitrarily complex values into an associative array. Looking up a single key is fast (a PowerShell hash array really is a hash table underneath), but enumerating the keys collection is linear, so keep the arrays reasonably small if you walk through them often.
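Here is the idiom in miniature (the drive letters are made up):

	$volumes = @{}			## create an empty associative array

	$volumes."C" += 1		## a key springs into existence on first use
	$volumes."D" += 1
	$volumes."D" += 1

	foreach ($vol in $volumes.keys)
	{
		write-host "$vol was seen $($volumes.$vol) time(s)"
	}

This is exactly the counting pattern that the getStorageGroups function below uses to build $volumes.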

While in the EMS, you may get the impression that get-storagegroup returns text, that would be incorrect. Instead, get-storagegroup returns a collection (which is a fancy word for an enumerable array) of objects of type Microsoft.Exchange.Data.Directory.SystemConfiguration.StorageGroup. That’s a mouthful. Let’s just say that it returns an array containing all of the storage group objects.

So, to process that array, you would do something like this:

$collection = get-storagegroup
foreach ($entry in $collection)
{
        ... do something with $entry ...
}

However, you are more likely to want to look at the storage groups on a particular server. So, that leads you to a construct like this:

$collection = get-storagegroup -server $computername
foreach ($entry in $collection)
{
        ... do something with $entry ...
}

Note the tie-in back to $computername! This will access the global $computername variable which has the name of the running computer stored within it.

Now, what does a storage group consist of? It really has only two things:

  • log files
  • system files

And each of those two things has a prefix associated with it (which allows multiple storage groups to share a single directory). All log files have a particular format:

<prefix><log-sequence>.LOG

All system files (there is only a single system file per storage group) have a particular format:

<prefix>.CHK

A prefix follows this format:

E<nn> | R00

In words, a prefix is either R00, which is exclusively for the Recovery Storage Group, of which there may be only one per server; or the letter ‘E’ followed by a two-digit number. The number represents the index of the storage group on the server, where the first storage group created is ’00’, the second is ’01’, etc. So the first storage group created on a server will have a prefix of E00.

For that same first storage group, the log files have a format of E00*.log and the system file is named E00.chk.

Note: the number of digits contained in <log-sequence> went from five hexadecimal digits in Exchange 2003 (and all earlier versions) to eight hexadecimal digits in Exchange 2007+. That is 4,096 times as many log file names; even allowing for the change in log file size from 5 MB to 1 MB, you get roughly 800 times as much log data before rollover in Exchange 2007+ as you did in earlier versions of Exchange Server.

Let’s see…what else? Oh yes – you want to ignore recovery storage groups. Except for recoveries from backup, you are not supposed to touch them.

Now, given all those above details, what can we do with them? This!

## $volumes will contain the volume letters used by all named
## files and directories.

$global:volumes = @{}

## any storage group will contain:
## a] a system file directory
## b] a log file directory
## c] a filename for each database within the SG
##
## $pathPattern contains the dos patterns of files in the storage group

$global:pathpattern = @{}		### Exx.chk, Exx*.log, *.edb

function getStorageGroups
{
	$count = 0
	#
	# locate the storage groups and their log files and system files
	#
	## the @() wrapper ensures a single storage group still has a Count
	$colSG = @(get-StorageGroup -server $computername)
	if ($colSG.Count -lt 1)
	{
		write-host "No storage groups found on server $computername"
		return 1
	}

	## parse the pathnames for each SG to determine what
	## volumes it stores data upon and what directories are used

	foreach ($sg in $colSG)
	{
		if ($sg.Recovery)
		{
			write-host ("Skipping RECOVERY STORAGE GROUP " + $sg.Name)
			continue
		}

		$count++

		$prefix  = $sg.LogFilePrefix
		$logpath = $sg.LogFolderPath.ToString()
		$syspath = $sg.SystemFolderPath.ToString()

		write-host $sg.Name.ToString() "`t" $sg.Guid.ToString()
		write-host "`tLog prefix:      $prefix"
		write-host "`tLog file path:   $logpath"
		write-host "`tSystem path:     $syspath"

		## E00*.log
		$pathpattern.(join-path $logpath ($prefix + "*.log")) = 1

		$vol = $logpath.SubString(0, 1)
		$volumes.$vol += 1

		## E00.chk
		$pathpattern.(join-path $syspath ($prefix + ".chk")) = 1

		$vol = $syspath.SubString(0, 1)
		$volumes.$vol += 1

		write-host " "
	}

	if ($count -lt 1)
	{
		write-host "No storage groups found on server $computername"
		return 1
	}

	return 0
}

This routine stores, for each storage group, the files that are contained within that storage group. It also stores away the disk volumes used by that storage group. To minimize the output of the routine, you could surround the write-host blocks with $debug conditional statements (or just remove them entirely).

So, what is contained within storage groups? Databases! In our next post in this series, you’ll learn how to deal with them programmatically too.

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

userAccountControl manipulation

The userAccountControl attribute, which resides on each user and computer object in an Active Directory forest, is responsible for, well, controlling lots of things about those accounts. For example, it controls whether an account is locked out, whether an account is disabled, and whether the password for the account expires.

The feature named User Account Control, introduced in Windows Vista, has nothing to do with the userAccountControl attribute. The naming collision is unfortunate.

The userAccountControl attribute is a bit-field attribute. This means that a single attribute value controls many settings: each bit within the value governs a separate property of the account. For information about all the possible values that the attribute can take, see KB 305144, How to use the UserAccountControl flags to manipulate user account properties.

In a forum I frequent, a poster wanted to disable the “password never expires” flag on all the user accounts contained within an OU. Of course, you can do this manually, but that is subject to error and is very tedious. So, I provided them a PowerShell script to accomplish their objective. See below, and be aware that you can use the same techniques shown in this script to modify any bit-wise value.

You’ll note the use of “-band” and “-bxor” in the PowerShell script. These stand for “bit-wise AND” and “bit-wise XOR”, respectively. The bit-wise operators ensure that each bit of a value is calculated against the corresponding bit in the paired value.
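A worked example with the flag used below: ADS_UF_DONT_EXPIRE_PASSWD is 0x010000 (65536), and a normal, enabled account with “password never expires” set carries a userAccountControl of 0x10200 (66048):

	$ADS_UF_DONT_EXPIRE_PASSWD = 0x010000
	$value = 0x10200	## NORMAL_ACCOUNT (0x200) plus DONT_EXPIRE_PASSWD

	$value -band $ADS_UF_DONT_EXPIRE_PASSWD	## 65536 - the flag is set
	$value -bxor $ADS_UF_DONT_EXPIRE_PASSWD	## 512   - the value with the flag removed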

	$ou = "LDAP://cn=Users,dc=essential,dc=local"

	$ADS_UF_DONT_EXPIRE_PASSWD = 0x010000

	$objDomain = New-Object System.DirectoryServices.DirectoryEntry( $ou )
	$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
	$objSearcher.SearchRoot = $objDomain
	$objSearcher.Filter = "(&(objectCategory=person)(objectClass=user))"
	$results = $objSearcher.FindAll()

	foreach( $result in $results )
	{
		$user = [adsi] $result.Path
		$value = $user.userAccountControl.Item( 0 )

		( $user.Name.item( 0 ) + ' ' + $value.ToString() )

		if( ( $value -band $ADS_UF_DONT_EXPIRE_PASSWD ) -ne 0 )
		{
			$value = $value -bxor $ADS_UF_DONT_EXPIRE_PASSWD
			$user.userAccountControl = $value
			$user.SetInfo()
			( "`t" + $user.name + ' updated to $value' )
		}
	}
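A design note: -bxor toggles a bit, so it is only safe here because the script first verifies with -band that the flag is actually set; run unconditionally, the -bxor would turn the flag on for accounts that didn’t have it. To clear a bit unconditionally, you would use $value -band (-bnot $ADS_UF_DONT_EXPIRE_PASSWD) instead.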

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

The Experts Conference 2009 (TEC’09)

Earlier this year, I wrote about presenting at Connections’08 in this article. That went so well, I’ve also applied (and been accepted) to speak at another conference – The Experts Conference (TEC).

Up through 2008, TEC was known as DEC (the Directory Experts Conference) and it focused exclusively on Active Directory (AD) and Identity Management (IdM). It was definitely the conference to go to for gurus in AD and IdM, and it has always had a huge representation from the Microsoft Directory Services product team.

Beginning in 2009, DEC is adding another track on Exchange Server. Originally, there was some discussion about running two separate conferences side-by-side: DEC and GEEC (Great Experts Exchange Conference), but eventually the decision was made to rename the conference and run two tracks: one on Directory Services and one on Exchange Server.

I am privileged to be one of the Exchange Server speakers at the inauguration of TEC. Originally presented by NetPro (for seven years) and now presented by Quest (who purchased NetPro in the second half of 2008), the conference is still being organized and hosted by Gil Kirkpatrick (a long-time Directory Services MVP).

The American presentation of the conference is in Las Vegas, NV; March 22 – 25, 2009. The European presentation of the conference is in Berlin, Germany; September 14 – 16, 2009.

For the American Exchange Server agenda, please click here.

For the bios of all the Exchange Server speakers (including me!), please click here.

For the abstracts of all the Exchange Server sessions, please click here.

It is the goal of TEC’2009 for the Exchange Server presentations to be of the same high quality, and just as technical, as those the conference has always offered for Directory Services. Thus, if you want in-depth Exchange Server knowledge from some true experts, TEC’2009 is the place to be. In tight economic times, you have to carefully pick and choose which conferences and technical events you want to attend. TEC’2009 should be your #1 choice. As you can see from the American agenda and biographies, there is a very strong representation from the Exchange Server product team as well as from the Directory Services product team.

I hope that you will join me at TEC’2009. It is a worthwhile investment for you and your company.

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

The Final Step to Resolving Reboot Hangs

I’ve reported on a number of occasions about attempted reboots just “hanging” until the server was power-cycled (or until you could execute a “shutdown” command from another computer in the environment – not always easy when connecting remotely!).

If you want to read those articles, you can find them here, here, and here.

It appeared that the Scalable Networking Pack (SNP) had a role to play, and that Small Business Server (SBS) may have gotten the worst end of this stick, but the root cause apparently turns out to have been a race condition in the NTFS driver shutdown code.

Microsoft has released a number of patches over the last year to address this, but I can say that I’m finally happy with the last iteration of the patch. You can find that patch in this KB article: A Windows Server 2003-based computer stops responding when you shut down the computer in a remote console session.

I certainly won’t promise you that it solves all of the issues – but I’ve not seen a hang since I installed the last version of this patch. A version of the hotfix is available for both Windows Server 2003 sp1 and sp2.

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

Getting Our Computername in a PowerShell Script

There are lots of reasons why you may want to get the name of your computer while you are executing a PowerShell script, from a parameter you want to pass to a cmdlet or function or for generating a filename.

Thankfully, PowerShell makes it easy.

Just like our old friend, cmd.exe, PowerShell makes the system-level environment variables available. Unlike cmd.exe, PowerShell doesn’t make them available by surrounding the environment variable name with percent symbols. Instead, PowerShell uses a provider. This provider makes environment variables look like a file system and allows you to access (or set) a variable’s contents by reading (or writing) it.

For example, to examine all environment variables and see their contents in cmd.exe, you enter:

set

To get the equivalent output in PowerShell (including the variables being sorted by name), you enter:

dir env: | sort Name

In cmd.exe, to display the value of COMPUTERNAME, you enter:

echo %COMPUTERNAME%

In PowerShell, you enter:

gc env:computername

Or the long form of:

get-content env:computername

And to store the value of the computername into a variable in PowerShell:

$computer = gc env:computername
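For example (the directory shown is purely illustrative), it works nicely when generating a filename:

	$computer = gc env:computername
	$logfile = "C:\Logs\$computer-backup.log"
	"Backup started $(get-date)" | out-file $logfile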

We’ll use this in my next post.

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch

Exchange Connections – Fall 2008

Next week, in Las Vegas, Nevada, is the semiannual Connections conference. The Connections conference is a technical event covering SQL Connections, Windows Connections, Exchange Connections, etc. There are lots of individual tracks, both for IT Pros and Devs.

I’ll be speaking next week at the conference, delivering three Exchange presentations:

EXC10: Exchange 2007 and Windows 2008: Backups the Easy Way (75 minutes)
In this presentation I’ll show you how to use the native Windows tools present in Windows 2008 to make Exchange 2007 backups AND to restore them. I’ll cover some theory, some philosophy, and lots of PowerShell.

EXC11: SMB Exchange Operations (60 minutes)
In this presentation I’ll discuss some key factors of day-to-day Exchange operations that affect the Small Business market.

EXC12: Building an Exchange Test Environment in a Hurry (75 minutes)
In this presentation I’ll discuss some of the ways in which you can quickly generate a virtualized Exchange test environment. After all, time you spend building is time you can’t spend testing.

You can see the Event Schedule here and general conference information here.

Please come say “hi”. Even better – attend my presentations!

Until next time…

As always, if there are items you would like me to talk about, please drop me a line and let me know!


Follow me on twitter: @EssentialExch