Posts By Subject

Friday, April 17, 2015

Gotchas!: Running Sage 50 Accounting from Hyper-V VM

Good day to all. Recently, while working on the infrastructure of one of my clients, I was in the process of migrating their platform to Hyper-V. I needed to convert the old server to a VM, and this was easy to do using MVMC 3.1. Once the P2V was done, and after dealing with the boot issues, the VM loaded up fine and the path was clear to repurpose the old server as a virtual host. However, later testing revealed that Sage 50 was running incredibly slowly, to the point where it was practically unusable. After tearing my hair out over this for about an hour, it turned out the Broadcom NIC VM queue issue was rearing its ugly head at me once again, which I found especially irritating since the driver version I had installed on the virtual host supposedly fixed this. Disable this setting in the Hyper-V console (or do the "proper" VMQ configuration if you have to use it) and all should be well.
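If you'd rather do this from PowerShell than the GUI, something like the following should work on Server 2012/2012 R2. The adapter name below is a placeholder, not from my environment, so check yours with Get-NetAdapterVmq first:

```powershell
# List adapters with VMQ enabled so you can confirm which NIC is the culprit
Get-NetAdapterVmq | Where-Object { $_.Enabled }

# Disable VMQ on the physical NIC bound to the virtual switch
# ("Ethernet 2" is a placeholder -- substitute your adapter's name)
Disable-NetAdapterVmq -Name "Ethernet 2"
```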

Wednesday, March 4, 2015

Get model information for Dell PCs via PowerShell

I was recently tasked with getting inventory information for all of the project staff's computers. Since there wasn't any inventory software running in the desktop environment, I had to use good ol' sneakernet to collect all of the service tags. Once I had them all, I promptly decided there was no way I was manually typing all of this in to get the required info from Dell. This led me to stumble on a script posted here. The script is a modified version of an older one and it works very well; best of all, it makes exporting to CSV a cinch. However, the script doesn't let you get the model information. Unfortunately, I didn't write that down when I collected the service tags, so modifying the script was my only choice. On inspecting the script, it turns out that it pipes to Select-Object -ExpandProperty to get the warranty information, but Dell's server doesn't put the model in that particular property. The correct property to expand is AssetHeaderData, which will give you the model and ship date, among other things. After finishing my task, I figured I'd make it a whole lot easier for anyone else looking to do the same thing who already has the required service tags. If you are trying to look up PCs by computer name, there are other scripts out there that do so via WMI or AD. This script only works on your PC or any service tags you provide it:

[CmdletBinding()]
param(
    [parameter(Mandatory=$false)]
    [string]$ServiceTag,
    [parameter(Mandatory=$false)]
    [string]$ComputerName,
    [parameter(Mandatory=$false)]
    [switch]$ExportCSV,
    [parameter(Mandatory=$false)]
    [string]$ImportFile
)

function Get-DellWarrantyInfo {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$False,Position=0,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [alias("SerialNumber")]
        [string[]]$GetServiceTag
    )
    Process {
        if ($ServiceTag) {
            if ($ServiceTag.Length -ne 7) {
                Write-Warning "The specified service tag wasn't entered correctly"
                break
            }
        }
        $WebProxy = New-WebServiceProxy -Uri "http://xserv.dell.com/services/AssetService.asmx?WSDL" -UseDefaultCredential
        $WebProxy.Url = "http://xserv.dell.com/services/AssetService.asmx"
        $WarrantyInformation = $WebProxy.GetAssetInformation(([guid]::NewGuid()).Guid, "Dell Warranty", $GetServiceTag)
        # AssetHeaderData is where Dell puts SystemModel and SystemShipDate
        $WarrantyInformation | Select-Object -ExpandProperty AssetHeaderData
        return $WarrantyInformation
    }
}

if ($ServiceTag) {
    if (($ComputerName) -OR ($ExportCSV) -OR ($ImportFile)) {
        Write-Warning "You can't combine the ServiceTag parameter with other parameters"
    }
    else {
        $WarrantyObject = Get-DellWarrantyInfo -GetServiceTag $ServiceTag | Select-Object @{Label="ServiceTag";Expression={$ServiceTag}},SystemModel,SystemShipDate
        $WarrantyObject[0,1] #Remove [0,1] to get everything
    }
}

if ($ComputerName) {
    if (($ServiceTag) -OR ($ExportCSV) -OR ($ImportFile)) {
        Write-Warning "You can't combine the ComputerName parameter with other parameters"
    }
    else {
        [string]$SerialNumber = (Get-WmiObject -Namespace "root\cimv2" -Class Win32_SystemEnclosure -ComputerName $ComputerName).SerialNumber
        $WarrantyObject = Get-DellWarrantyInfo -GetServiceTag $SerialNumber | Select-Object @{Label="ComputerName";Expression={$ComputerName}},SystemModel,SystemShipDate
        $WarrantyObject[0,1] #Remove [0,1] to get everything
    }
}

if (($ImportFile)) {
    if (($ServiceTag) -OR ($ComputerName)) {
        Write-Warning "You can't combine the ImportFile parameter with ServiceTag or ComputerName"
    }
    else {
        if (!(Test-Path -Path $ImportFile)) {
            Write-Warning "File not found"
            break
        }
        elseif (!$ImportFile.EndsWith(".txt")) {
            Write-Warning "You can only specify a .txt file"
            break
        }
        else {
            if (!$ExportCSV) {
                $GetServiceTagFromFile = Get-Content -Path $ImportFile
                foreach ($ServiceTags in $GetServiceTagFromFile) {
                    $WarrantyObject = Get-DellWarrantyInfo -GetServiceTag $ServiceTags | Select-Object ServiceTag,SystemModel,SystemShipDate
                    $WarrantyObject[0,1] #Remove [0,1] to get everything
                }
            }
            elseif ($ExportCSV) {
                $GetServiceTagFromFile = Get-Content -Path $ImportFile
                $ExportPath = Read-Host "Enter a path to export the results"
                $ExportFileName = "WarrantyInfo.csv"
                foreach ($ServiceTags in $GetServiceTagFromFile) {
                    $WarrantyObject = Get-DellWarrantyInfo -GetServiceTag $ServiceTags | Select-Object ServiceTag,SystemModel,SystemShipDate
                    if (!(Test-Path -Path $ExportPath)) {
                        Write-Warning "Path not found"
                        break
                    }
                    else {
                        $FullExportPath = Join-Path -Path $ExportPath -ChildPath $ExportFileName
                        $WarrantyObject[0,1] | Export-Csv -Path $FullExportPath -Delimiter "," -NoTypeInformation -Append #Remove [0,1] to get everything
                    }
                }
            (Get-Content $FullExportPath) | ForEach-Object { $_ -replace '"', "" } | Out-File $FullExportPath
            Write-Output "File successfully exported to $FullExportPath"
            }
        }
    }
}
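For reference, assuming you save the script as Get-DellModelInfo.ps1 (the filename, tag, and computer name below are my own placeholders), usage looks like this:

```powershell
# Look up a single service tag
.\Get-DellModelInfo.ps1 -ServiceTag ABC1234

# Look up a PC by name -- pulls the service tag via WMI first
.\Get-DellModelInfo.ps1 -ComputerName PC-01

# Batch mode: one service tag per line in a .txt file, exported to CSV
.\Get-DellModelInfo.ps1 -ImportFile .\tags.txt -ExportCSV
```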

Again, all I did was modify the expanded property in the function and change the output to include the model and ship date. All credit goes to the original authors of the script and its revisions. Later down the road I might play around with ForEach-Object and get it to do both warranty and model. Until next time.

Wednesday, September 4, 2013

Exchange 2010 - In-Depth Mailbox Migration Workflow Analysis (Optimal and post-ADMT)

Hello world,

     At the time of this writing, I'm 23 years old, but I've been working with PCs since I was 6. The majority of my background is help desk, desktop support, and break-fix work in general, with light security and networking day contracts on the side. So, my first foray into real systems administration came in the form of a project that involved consolidating multiple forests at multiple sites into a single Active Directory forest. This is more simply known as an inter-forest migration. We were using ADMT and built-in Microsoft tools to accomplish the various migration tasks. The purpose of this article is just to share an optimal workflow for migrating mailboxes cross-forest, not necessarily to walk you through migrating a mailbox. I will link to TechNet articles as necessary so that you can explore and learn the full extent of each command. IMHO, that's the best way to learn.


     One of the very first things I was tasked with was learning the process of effectively performing Exchange remote mailbox move requests, i.e. inter-forest mailbox migrations. This is best done via PowerShell, and the process changes depending on which versions of Exchange you're migrating from and to. At an overview level, the process is fairly straightforward and uses the following scripts/commands:


Prepare-MoveRequest > ADMT User Migration > New-MoveRequest


Prepare-MoveRequest is a PowerShell script included in Exchange 2010 (cd to the C:\Program Files\Microsoft\Exchange Server\V14\Scripts directory to use it). It runs against a global catalog domain controller in the source forest (source refers to the forest we're migrating from; the old forest), queries a user object, recreates that object in the target forest, mail-enables it, and carries over some important AD attributes. It will carry over aliases and proxy addresses, and it will recreate the LegacyExchangeDN attribute as an X500 address (more on this later). It also retains certain permissions on the mailbox. When Prepare-MoveRequest creates the user account, it will be disabled. It probably does more things that I'm not aware of.
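As a rough sketch of a typical invocation (the server names, user, and OU below are placeholders; see the TechNet page for the full parameter list):

```powershell
cd "C:\Program Files\Microsoft\Exchange Server\V14\Scripts"

# Prompt for credentials in each forest
$LocalCredentials  = Get-Credential
$RemoteCredentials = Get-Credential

# Prepare a single user from the source (remote) forest
.\Prepare-MoveRequest.ps1 -Identity "jdoe@olddomain.com" `
    -RemoteForestDomainController "dc01.olddomain.com" `
    -RemoteForestCredential $RemoteCredentials `
    -LocalForestDomainController "dc01.newdomain.com" `
    -LocalForestCredential $LocalCredentials `
    -TargetMailUserOU "OU=MigratedUsers,DC=newdomain,DC=com" `
    -Verbose
```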

For those who don't know, ADMT stands for the Active Directory Migration Tool, a GUI-based tool used to move objects between forests, or even between child and parent domains within the same forest. ADMT is required to carry over the user account's SID history, password, and security/distribution group membership, among other things, and merge them with the object that Prepare-MoveRequest creates. It will also enable the account and set the password to require a change at next logon.


New-MoveRequest is a PowerShell command used to initiate the move request. Its syntax differs depending on the Exchange versions that the source and target forests (target refers to the forest we're migrating to; the new forest) are running. Remote move requests from Exchange 2003/2007 require the -RemoteLegacy switch, which in turn affects the other switches needed to complete the command; Exchange 2010 source forests use the -Remote switch. Additionally, when migrating from Exchange 2003/2007 to Exchange 2010, you need to have the standard set of Active Directory ports open between both points. When migrating between two Exchange 2010 forests, however, a service called MRSProxy must be enabled on both sides. This is much simpler, as only two or three ports need to be open to get the move request queued and moved.
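To illustrate the two variants (host names, database, and domains below are placeholders; verify the parameters against TechNet for your Exchange versions):

```powershell
$RemoteCredentials = Get-Credential

# Exchange 2010 -> Exchange 2010: -Remote, pulling through MRSProxy
# on the source forest's CAS
New-MoveRequest -Identity "jdoe@newdomain.com" -Remote `
    -RemoteHostName "cas01.olddomain.com" `
    -RemoteCredential $RemoteCredentials `
    -TargetDatabase "MBXDB01" `
    -TargetDeliveryDomain "newdomain.com"

# Exchange 2003/2007 source: -RemoteLegacy points at a source GC
# instead of MRSProxy
New-MoveRequest -Identity "jdoe@newdomain.com" -RemoteLegacy `
    -RemoteGlobalCatalog "dc01.olddomain.com" `
    -RemoteCredential $RemoteCredentials `
    -TargetDatabase "MBXDB01" `
    -TargetDeliveryDomain "newdomain.com"
```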


When running these commands, it's best to create a credential variable in PowerShell for each of the respective credential switches (e.g. $SourceCred = Get-Credential). Also, each of the commands can be used in conjunction with CSVs to make batch processing easier, although I just turn on logging and paste them all in from Excel. Refer to the TechNet articles linked above to see parameters, syntax, etc.


So again, the optimal workflow is:


Prepare-MoveRequest > ADMT User Migration > New-MoveRequest


The reason this is optimal is that it ensures the user's Exchange properties are copied over properly, and it requires the least amount of manual intervention, which is where errors are born. However, there may be scenarios where you can't follow this optimal workflow. That was the issue with my project initially: the target Exchange forest was not fully operational at the time, and the deadline for migrating the users/computers was closing in. The decision was made to move forward with the ADMT migrations and revisit the Exchange half afterwards. The rationalization was that existing Microsoft documentation indicated it could be done, some of the project staff had done it before, and if all else failed, we could just re-migrate the user accounts to achieve the optimal workflow. In hindsight, this caused many more issues than I could have known at the time.

One of the reasons the workflow above is optimal is that it begins with Prepare-MoveRequest, which automatically generates a lot of different attributes from the source forest. However, if you remember, it also creates a MEU (mail-enabled user) object. If you are forced to migrate a user account first, there will already be an object in the target forest, and running the script at that point will just create a duplicate user with the same username plus random characters appended. To work around this, use the -UseLocalObject switch with Prepare-MoveRequest so that it instead references the user account that was previously migrated with ADMT. You don't need to type in any parameter to locate the object; the script will find it. You can confirm that the script used the existing object rather than creating a new MEU by looking for the "Local recipient info merged" message, or by checking that no duplicate MEU appeared in the default Users container in AD.


However, it will not work right away, and an error will likely be generated. Because Prepare-MoveRequest wasn't run first, it will fail when run against the migrated user account: you must first manually mail-enable the account. This makes the migrated account targetable by Prepare-MoveRequest with the -UseLocalObject switch. I prefer to use the Enable-MailUser command to accomplish this. When you mail-enable a migrated user account, make sure the email address matches the user's primary SMTP address on the source side. This ensures everything matches up, which matters if you have to rename users as part of your migration project (more on that another time), as renaming users affects how the script works.
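Sketched out, the two steps look something like this (user, domains, and DC are placeholders):

```powershell
# Mail-enable the ADMT-migrated account; the external address should match
# the user's primary SMTP address on the source side
Enable-MailUser -Identity "jdoe" -ExternalEmailAddress "jdoe@olddomain.com"

# Now Prepare-MoveRequest can merge onto the existing object
# instead of creating a duplicate MEU
.\Prepare-MoveRequest.ps1 -Identity "jdoe@olddomain.com" `
    -RemoteForestDomainController "dc01.olddomain.com" `
    -RemoteForestCredential (Get-Credential) `
    -UseLocalObject -Verbose
```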


So you've mail-enabled the migrated user account and successfully prepared it. At this point you can try New-MoveRequest, but it will still fail, this time with an error about the LegacyExchangeDN attribute not being found. The simple solution is to run the Update-Recipient command against the account before moving the mailbox, which generates the required attribute. Be aware that a successful execution of the command provides no feedback in non-verbose mode.
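The command itself is a one-liner (the address is a placeholder):

```powershell
# Stamp the prepared mail user with the attributes New-MoveRequest expects,
# including LegacyExchangeDN; add -Verbose since success is otherwise silent
Update-Recipient -Identity "jdoe@newdomain.com" -Verbose
```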

At this point, you can finally perform a successful move request and the mailbox will move. However, because of the breaks in the process, proxy addresses, aliases, the LegacyExchangeDN-to-X500 addition, and other attributes, as well as permissions, will not have been carried over and will need to be regenerated manually. I don't know all of them yet; there are probably more.

While the importance of carrying over permissions, proxy addresses, and aliases will be obvious to most, what isn't obvious is the LegacyExchangeDN-to-X500-address addition. When users email each other internally, Outlook builds autocomplete cache entries based on the LegacyExchangeDN in the source forest. This attribute is derived from the Exchange directory structure and/or Exchange version on the source side. Because we're moving from one Exchange forest/version to another, these LegacyExchangeDN values are no longer valid, so when a user emails another internal user via the autocomplete cache, the message will bounce. There are two ways to fix this:

- Add the LegacyExchangeDN attribute of the source account as an x500 address to the target account

- Have the user clear their Outlook autocomplete cache, either by deleting the single name from the popup list, or by clearing the entire cache via .NK2 file deletion (or in the Options menu for Outlook 2010 and up).

Getting the LegacyExchangeDN attribute and appending it as an x500 address is definitely a tedious task. Fortunately, this can be scripted rather easily using Powershell.

Here some sample Powershell scripts you can use to query the source forest and get the data you need:

Get-ADUser -Filter * -SearchBase "DC=domain,DC=com" -Properties mail,LegacyExchangeDN,proxyAddresses -Server domaincontroller.domain.com | Select mail,LegacyExchangeDN, @{n='proxyAddresses';e={$_.proxyAddresses -join '; '}} | Export-CSV "C:\exchangeaddresses.csv"

This will get all of the information required to repopulate proxy addresses as well as LegacyExchangeDN-to-x500. You may have to clean up the CSV in Excel first, especially when it comes to the proxy addresses.


Once you have a clean data set, you can use the following command to regenerate the x500:



Set-Mailbox emailaddress@domain.com -EmailAddresses @{Add='x500:/o=Domain/ou=Site/cn=Recipients/cn=emailaddress'}

I normally use the Excel CONCATENATE/autofill method for this, as I find it easier than scripting and debugging a PowerShell command while under pressure. I put the LegacyExchangeDN values in one column and the email addresses in another, use a CONCATENATE function to build the full command with the proper input, autofill it down, and copy/paste all of those commands into a PowerShell window. This works.
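If you'd rather script it, a sketch like the following should do the same job. It assumes the CSV exported above, with its mail and LegacyExchangeDN columns; I haven't run this against a live Exchange org, so test on one mailbox first:

```powershell
# Read the CSV exported from the source forest and append each
# LegacyExchangeDN as an X500 address on the matching target mailbox
Import-Csv "C:\exchangeaddresses.csv" | ForEach-Object {
    if ($_.mail -and $_.LegacyExchangeDN) {
        Set-Mailbox -Identity $_.mail `
            -EmailAddresses @{Add = "x500:$($_.LegacyExchangeDN)"}
    }
}
```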

Additionally, you will have to do this no matter what when it comes to mail contacts:

Get-ADObject -Filter {(mail -like "*") -and (ObjectClass -eq "contact")} -SearchBase "DC=domain,DC=com" -Properties mail,LegacyExchangeDN,proxyAddresses -Server domaincontroller.domain.com | Select mail,LegacyExchangeDN, @{n='proxyAddresses';e={$_.proxyAddresses -join '; '}} | Export-CSV "C:\exchangeaddresses.csv"

This will get the same attributes as above for mail-contacts in the source forest.

Set-MailContact emailaddress@domain.com -EmailAddresses @{Add='x500:/o=Domain/ou=Site/cn=Recipients/cn=emailaddress'}

This is the mail-contact equivalent of the Set-Mailbox command above.

I'll let you figure out how to convert the existing commands for getting alias data as well as setting the proxy addresses.

In summation, here are the two migration workflows for an inter-forest Exchange migration:

Optimal workflow:

Prepare-MoveRequest > ADMT User Migration > New-MoveRequest

Post-ADMT workflow:

ADMT User Migration > Enable-MailUser > Prepare-MoveRequest > Update-Recipient > New-MoveRequest 

At this point, it should be clear that unless you're forced to, the optimal workflow is the better approach.

Inaugural post

Hello world.

I'm not usually one to keep a blog. Frankly, I've always viewed most bloggers as self-absorbed. However, as one grows older, one changes, and one comes to see value in things one once didn't, including blogs, in my case. So, I decided to make this blog for a number of reasons:

1.) To practice my dreadful writing skills.

2.) To give something back to the IT community in exchange for all the help I've gotten through various forums, websites, and blogs. Whenever I encounter something that I think could help others, I'll be happy to share it here.

3.) To just share my thoughts on things in general.

I look forward to sharing all I know and exchanging knowledge with others.