Administrators, AADConnect and AdminSDHolder Issues (or why some accounts have permission issues)


AdminSDHolder is something I come across a lot, but find a lot of admins are unaware of it. In brief, any user who is a member of a protected group (for example Domain Admins) will find that the permission inheritance and access control list on their AD object are reset every hour. Michael B. Smith did a nice write-up on this subject here.

AdminSDHolder is an AD object that determines what the permissions on all protected group members need to be. Why this matters with AADConnect and your sync to Azure Active Directory (i.e. the directory used by Office 365) is that any object the AADConnect service cannot read cannot be synced, and any object the AADConnect service cannot write to will fail when writeback is attempted.

For the read permissions this is less of an issue, as the default read permissions for every object are part of a standard Active Directory deployment, and so you will find that AdminSDHolder contains this permission and therefore protected objects can be read by AADConnect. This works in reality because Authenticated Users have read permission to lots of attributes on the AdminSDHolder object under the hidden System container in the domain. Unless your AD permissions are very locked down, or the AdminSDHolder permissions have been changed to remove Authenticated Users, you should have no issue in syncing admin accounts, who of course might have dependencies on mailboxes and SharePoint sites etc. and so need to be synced to the cloud.

Writeback though is a different ball game. Unless you have run AADConnect with Express settings you will find that protected accounts fail during the last stage of the AADConnect sync process. You will often see errors in the Export profile for your Active Directory that list your admin accounts. Often the easiest way to fix this is to enable the inheritance check box on the user account and sync again. The changes are now successfully written, but within the hour the inheritance check box will be cleared again and the default permissions as set on AdminSDHolder reapplied to these user accounts. Later changes that need to be written back from the cloud will then fail once more, and again permissions will be to blame.
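
If you prefer to apply that temporary fix from PowerShell rather than the GUI, a minimal sketch (assuming the ActiveDirectory module is installed and using an example DN – substitute your own account) looks like this:

# Temporary fix only: re-enable inheritance on one protected account so the next export succeeds.
# AdminSDHolder will re-protect the object within the hour, so the permanent fix below is still needed.
Import-Module ActiveDirectory
$dn = "AD:\CN=Jane Admin,OU=Admins,DC=contoso,DC=com"   # example DN
$acl = Get-Acl -Path $dn
$acl.SetAccessRuleProtection($false, $true)   # $false = allow inherited permissions
Set-Acl -Path $dn -AclObject $acl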

To fix this we just need to ensure that the AdminSDHolder object has the correct permissions on it. This is nothing more than doing what the AADConnect Express wizard would do for you anyway, but if you don't use the Express wizard I don't think I have seen what you should do documented anywhere – so this is possibly the first place it is documented.

Often if you don't run Express settings you are interested in the principle of least privilege, and so the rest of this blog post will outline what you will see in your Active Directory and what to do to ensure protected accounts will always sync and write back in the Azure Active Directory sync engine. I covered the permissions needed to enable the various types of writeback in a different blog post, but the scripts in that post never added the correct write permissions to AdminSDHolder, so this post covers what to do for your protected accounts.

First, take a look at any protected account (i.e. one that is a member of Domain Admins):
[Screenshot: Advanced Security Settings dialog for a protected account]

You will see in the Advanced permissions dialog that there is an "Enable Inheritance" button (or an unchecked check box in older versions of Active Directory). You will also notice that all the permissions under the "Inherited From" column read "None" – that is, no permissions are inherited. You will also see, as shown in the above dialog, that if Express settings have been run for your AADConnect sync service, an access control entry for the AADConnect service account will be listed – here this is MSOL_924f68d9ff1f (yours will be different if it exists) and it has read/write for everything. This is not least privilege! If you have run the sync engine previously on different servers and later removed them (as the sync engine can only run on one server per AAD tenant, excluding staging servers) then you might see more than one MSOL account. The description field of the account will show which server it was created on, for your information.

If you compare the above admin account to a non-protected account you will see that inheritance is enabled (the button reads "Disable Inheritance") and that the Inherited From column lists the source of each inherited permission.

Compare the access control entries (ACEs) on your protected account to the list of ACEs on the AdminSDHolder object. AdminSDHolder can be found at CN=AdminSDHolder,CN=System,DC=domain,DC=local. You should find that the protected accounts match AdminSDHolder – or at least they will within the hour, as someone could have just changed something.
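
A quick way to dump both ACLs for a side-by-side comparison is dsacls (the DNs below are examples – substitute your own domain and account):

dsacls "CN=AdminSDHolder,CN=System,DC=contoso,DC=com"
dsacls "CN=Jane Admin,OU=Admins,DC=contoso,DC=com"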

Add a permission ACE to AdminSDHolder and it will appear on each protected account within an hour; remove an ACE and it will be gone within the hour as well. So you could, for example, remove the MSOL_ account(s) left over from older ADSync deployments and tidy up your permissions at the same time.
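
For example, a sketch of removing a stale sync account's ACEs from AdminSDHolder with dsacls (the account name is illustrative – use the MSOL_ account you actually found):

dsacls "CN=AdminSDHolder,CN=System,DC=contoso,DC=com" /R "CONTOSO\MSOL_924f68d9ff1f"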

This is what the Advanced permissions for AdminSDHolder look like in my domain:

[Screenshot: Advanced Security Settings for the AdminSDHolder object]

If I add the relevant ACEs for the writeback permissions here, then within the hour – and for syncs that happen after that time – the writeback errors in the sync management console will go away. Note though that AdminSDHolder is per domain, so if you are syncing more than one domain you need to set these permissions in each domain.

To script these permissions, run the following in PowerShell to update the AD permissions for the different hybrid writeback scenarios that you are interested in implementing.

Finding All Your AdminSDHolder Affected Users

The following PowerShell will list all the users in your domain who have adminCount set to 1 (strictly, greater than 0), which means they are affected by the AdminSDHolder restrictions. The changes made below directly on AdminSDHolder will affect these users, as their permissions will be updated to allow writeback from Azure AD.

get-aduser -Filter {admincount -gt 0} -Properties adminCount -ResultSetSize $null | FT DistinguishedName,Enabled,SamAccountName

Password Writeback

The following PowerShell will modify the permissions on the AdminSDHolder object so that protected accounts can use the Self Service Password Reset (SSPR) feature. Note you need to change the DC values in the script for it to work against your domain(s).

To determine the account name that permissions must be granted to, open the Synchronization Service Manager on the sync server, click Connectors and double click the connector to the domain you are updating. Under the Connect to Active Directory Forest item you will see the Forest Name and User Name. The User Name is the name of the account you need in the script. An example is shown below:

[Screenshot: Synchronization Service Manager connector properties showing the Forest Name and User Name]

$accountName = "domain\aad_account" #[this is the account that will be used by Azure AD Connect Sync to manage objects in the directory, this is an account usually in the form of AAD_number or MSOL_number].
$AdminSDHolder = "CN=AdminSDHolder,CN=System,DC=contoso,DC=com"

$cmd = "dsacls.exe '$AdminSDHolder' /G '`"$accountName`":CA;`"Reset Password`"'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls.exe '$AdminSDHolder' /G '`"$accountName`":CA;`"Change Password`"'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls.exe '$AdminSDHolder' /G '`"$accountName`":WP;lockoutTime'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls.exe '$AdminSDHolder' /G '`"$accountName`":WP;pwdLastSet'"
Invoke-Expression $cmd | Out-Null

Exchange Hybrid Mode Writeback

The below script will set the permissions required for the service account that AADSync uses. Note that if Express mode has been used, an account called MSOL_AD_Sync_RichCoexistence will exist that holds these permissions, rather than them being assigned directly to the sync account. You could therefore change the below to grant the permissions to MSOL_AD_Sync_RichCoexistence rather than the AAD_ or MSOL_ account and achieve the same result, with the benefit that a future change of the MSOL_ or AAD_ account is covered because the permissions were granted via a group.

The final permission in the set is for msDS-ExternalDirectoryObjectID, which is part of the Exchange Server 2016 (and later Exchange Server 2013 CU) schema updates. Newer documentation on the attributes AAD Connect synchronizes already lists this attribute, for example in Azure AD Connect sync: Attributes synchronized to Azure Active Directory.

$accountName = "domain\aad_account"
$AdminSDHolder = "CN=AdminSDHolder,CN=System,DC=contoso,DC=com"

$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;proxyAddresses'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msExchUCVoiceMailSettings'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msExchUserHoldPolicies'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msExchArchiveStatus'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msExchSafeSendersHash'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msExchBlockedSendersHash'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msExchSafeRecipientsHash'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$AdminSDHolder' /G '`"$accountName`":WP;msDS-ExternalDirectoryObjectID'"
Invoke-Expression $cmd | Out-Null

Once these two scripts are run against the AdminSDHolder object and you wait an hour, the permissions will be applied to your protected accounts. Then, within 30 minutes (based on the default sync interval), any admin account that was failing to have cloud settings written back to Active Directory due to permission errors will be resolved automatically.
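
If you do not want to wait for the hourly run, you can ask the PDC emulator to run the SDProp process on demand. A sketch, assuming the ActiveDirectory module and Windows Server 2008 or later (where the rootDSE operational attribute is runProtectAdminGroupsTask):

# Trigger SDProp immediately on the PDC emulator rather than waiting up to an hour
$pdc = (Get-ADDomain).PDCEmulator
$rootDSE = [ADSI]"LDAP://$pdc/RootDSE"
$rootDSE.Put("runProtectAdminGroupsTask", "1")
$rootDSE.SetInfo()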

Exchange Edge Server and Common Attachment Blocking In Exchange Online Protection


Both the Exchange Server Edge role and Exchange Online Protection have an attachment filtering policy. The default list in Edge Server is quite long, and the default in EOP is quite short. There are also a few values that are common to both.

So, how do you merge the lists so that your Edge Server attachment filtering policy is copied to Exchange Online in advance of changing your MX record to EOP?

You run

Set-MalwareFilterPolicy Default -FileTypes ade,adp,cpl,app,bas,asx,bat,chm,cmd,com,crt,csh,exe,fxp,hlp,hta,inf,ins,isp,js,jse,ksh,lnk,mda,mdb,mde,mdt,mdw,mdz,msc,msi,msp,mst,ops,pcd,pif,prf,prg,ps1,ps11,ps11xml,ps1xml,ps2,ps2xml,psc1,psc2,reg,scf,scr,sct,shb,shs,url,vb,vbe,vbs,wsc,wsf,wsh,xnk,ace,ani,docm,jar

This takes both the Edge Server default list and the EOP default list, removes the duplicate values and adds the result to EOP. If you have a different, custom list then use the following PowerShell to get your two lists, and then use the above cmdlet (with "Default" being the name of the policy) to update the list in the cloud.

Edge Server: Get-AttachmentFilterEntry

EOP: $variable = Get-MalwareFilterPolicy Default
$variable.FileTypes
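
If your Edge Server list has been customised, a sketch along the following lines will merge and de-duplicate the two lists rather than typing them out by hand. The first part runs on the Edge Server, the rest in Exchange Online PowerShell; the property names come from the cmdlets above, but treat this as a starting point and test it in your own environment:

# On the Edge Server: export the file-name entries, stripping the leading "*."
Get-AttachmentFilterEntry | where {$_.Type -eq "FileName"} | foreach { $_.Name.TrimStart('*','.') } | Set-Content edgefiletypes.txt

# In Exchange Online: merge with the existing EOP list and write the result back
$edgeTypes = Get-Content .\edgefiletypes.txt
$eopTypes = (Get-MalwareFilterPolicy Default).FileTypes
$merged = ($eopTypes + $edgeTypes) | Sort-Object -Unique
Set-MalwareFilterPolicy Default -FileTypes $merged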

Exchange Online Archive–Counting Archives


If you are using Exchange Online Archive and want to get a count of the number of users with an archive, or a list of the users with an archive, then the following PowerShell will give you this info:

List all users with an Exchange Online Archive:

Get-MailUser -ResultSize Unlimited | where {$_.ArchiveName -ilike "In-Place Archive*"}

Count all users with an Exchange Online Archive:

(Get-MailUser -ResultSize Unlimited | where {$_.ArchiveName -ilike "In-Place Archive*"}).Count

Both of these PowerShell cmdlets need to be run in Exchange Online via Remote PowerShell.

Exchange Server and Missing Root Certificates


I came across an issue with a client's Exchange Server deployment today that is not well documented – or rather it is, but you need to know where to look. So I thought I would document the troubleshooting steps and the fix here.

We specifically came across this error when testing Free/Busy for an Office 365 migration, though it could happen for a variety of reasons. Free/Busy and other lookups in a cross-forest Exchange Server deployment require a working organization relationship, and this was failing. Running Test-FederationTrust (a prerequisite of the organization relationship) in verbose mode (add -Verbose to the end) returned the following:

Unable to retrieve federation metadata from the security token
service. Reason: Microsoft.Exchange.Management.FederationProvisioning.FederationMetadataException: Unable to access the
Federation Metadata document from the federation partner. Detailed information: “The underlying connection was closed:
Could not establish trust relationship for the SSL/TLS secure channel.”.

The final result of the test will also show two errors for “Unable to retrieve federation metadata from the security token service.” and “Failed to request delegation token.”

The last part of the verbose error is the clue here. The server in question is unable to make an SSL/TLS connection to the endpoint that the federation trust needs to reach to get the federation trust metadata. That endpoint is listed right at the start of the Verbose output. It reads:

VERBOSE: [16:53:08.306 GMT] Test-FederationTrust : Requesting Federation Metadata from
https://nexus.microsoftonline-p.com/FederationMetadata/2006-12/FederationMetadata.xml.

Now that we have a URL and an error message, check that the URL is reachable from each of your Exchange Servers. At my client today we found one server could not reach this endpoint without an SSL error appearing in the browser. The problem was that the certificate the endpoint is secured with is issued by the Baltimore CyberTrust Root certificate – one that Microsoft uses for lots of services – but that root certificate was not installed on the machine. Lots of root certs were missing from that machine as it had never had a root certificate update applied to it.
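
A quick way to check both things from each Exchange Server is a couple of lines of PowerShell – a sketch, assuming PowerShell 3.0 or later on the server; the certificate subject string is what I would expect for the Baltimore root, so verify it against your own store:

# Can this server reach the federation metadata endpoint over TLS?
Invoke-WebRequest -Uri "https://nexus.microsoftonline-p.com/FederationMetadata/2006-12/FederationMetadata.xml" -UseBasicParsing | Select-Object StatusCode

# Is the Baltimore CyberTrust Root present in the machine's trusted root store?
Get-ChildItem Cert:\LocalMachine\Root | where { $_.Subject -like "*Baltimore CyberTrust Root*" }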

We installed the latest Root Certificate Update and then the federation trust worked, and Free/Busy etc. (MailTips, cross-forest message tracking and so on) all worked fine.

Qualifications in Exchange Signatures


In a recent project I was working with iQ.Suite from GBS, and specifically the component of this software that adds signatures to emails. The client is an international organization with users in different geographies, and we needed to accommodate the users' qualifications in their email signatures.

The problem with this is that in Germany qualifications are written in front of the name, in the USA at the end, and in other countries at the start and the end. We were doing a Notes to Exchange migration, and in Notes the iQ.Suite signature software read data from Notes that was originally pulled from Active Directory, and so the client had placed the qualifications in the DisplayName field in Active Directory.

But when we migrated to Exchange Server, the Global Address List listed the user's DisplayName, and so the German users were all listed together with "Dipl" as the first characters of their name. The name the email came from was also written like this. The signature worked, but these side effects meant we had to find a different way to approach the problem.

So rather than using DisplayName for the user's name and qualifications, we used the personalTitle attribute in Active Directory to store anything needed before the name (Dipl in the German example above; Prof or Dr being English examples) and the generationQualifier attribute to store any string that follows the DisplayName (such as Jr in the USA, or BSc for qualifications etc.).
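
Populating these two attributes is straightforward with the ActiveDirectory module; a sketch with an example account and example values:

# Example only: the prefix goes in personalTitle, the suffix in generationQualifier
Set-ADUser -Identity jsmith -Replace @{personalTitle="Dipl.-Ing."; generationQualifier="BSc"}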

In iQ.Suite we created a signature that looked like the following. This has a conditional [COND] entry for personalTitle, displayName and generationQualifier. That is, if each of these is present, show the displayName with personalTitle before it and generationQualifier after it. If the user does not have values for these fields, they are not shown. The [COND] control is documented in iQ.Suite.

[COND]personalTitle;[VAR]personalTitle[/VAR] [/COND][COND]displayName;[VAR]displayName[/VAR][/COND][COND]generationQualifier; [VAR]generationQualifier[/VAR][/COND]

What was not so well documented, and why I wanted to write this blog entry, is that the personalTitle and generationQualifier attributes are not stored in the Global Catalog, and so were missing from the users' signatures. In the multi-domain deployment we had at the client, iQ.Suite read the personalTitle, displayName and generationQualifier attributes from the Global Catalog, because Exchange was installed in a resource domain and the users were in separate domains; unless an attribute is replicated to the Global Catalog it is not seen by iQ.Suite.

To promote an attribute so that it is visible in the Global Catalog, open the Schema Management MMC snap-in, find the attributes in question and tick the "Replicate this attribute to the Global Catalog" box. This is outlined in https://technet.microsoft.com/en-us/library/cc737521(v=ws.10).aspx.
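
If you prefer PowerShell to the Schema MMC, the same change can be made by setting isMemberOfPartialAttributeSet on the schema attribute objects. A sketch only – the schema object names (Personal-Title and Generation-Qualifier) are what I would expect for these attributes, so confirm them in your schema first, and remember schema changes need Schema Admin rights:

# Sketch: replicate personalTitle and generationQualifier to the Global Catalog
$schemaNC = (Get-ADRootDSE).schemaNamingContext
Set-ADObject "CN=Personal-Title,$schemaNC" -Replace @{isMemberOfPartialAttributeSet=$true}
Set-ADObject "CN=Generation-Qualifier,$schemaNC" -Replace @{isMemberOfPartialAttributeSet=$true}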

Configuring Writeback Permissions in Active Directory for Azure Active Directory Sync


[Update November 2015 – User Writeback has been pulled from preview, so latest versions of AADConnect do not offer this option – it will probably return, so content about it remains in this blog]

[Update September 2016 – added new attributes as AADConnect now syncs more stuff, so updated scripts to match published changes]

[Update March 2017 – added another blog post on using the below to fix permission-issue errors on admin and other protected accounts at http://c7solutions.com/2017/03/administrators-aadconnect-and-adminsdholder-issues]

Azure Active Directory has long been the read-only cousin of Active Directory for those Office 365 and Azure users who sync their directory from Active Directory to Azure Active Directory, apart from eight attributes for Exchange Server hybrid mode. Not any more: Azure Active Directory writeback is now available, and in preview for some of the writeback types at the time of writing. This enables objects to be mastered or changed in Azure Active Directory and written back to on-premises Active Directory.

This writeback includes:

  • Devices that can be enrolled with Office 365 MDM or Intune, which will allow login to AD FS controlled resources based on user and the device they are on
  • “Modern Groups” in Office 365 can be written back to on-premises Exchange Server 2013 CU8 or later hybrid mode and appear as mail enabled distribution lists on premises. Does not require AAD Premium licences
  • Users can change their passwords via the login page or user settings in Office 365 and have that password written back to on-premises Active Directory.
  • Exchange Server hybrid writeback is the classic writeback from Azure AD and, apart from Group Writeback, is the only one of these writebacks that does not require Azure AD Premium licences.
  • User writeback from Azure AD (i.e. users made in Office 365 in the cloud for example) to on-premises Active Directory
  • Windows 10 devices for “Azure AD Domain Join” functionality

All of these features (apart from Exchange Hybrid writeback) require AADSync and not DirSync. Install and run the AADConnect program to migrate from DirSync to AADSync, and then rerun the AADConnect wizard and use the Synchronization Options page to add these writeback functions.

Preparing for Device Writeback

If you do not have a 2012 R2 or later domain controller then you need to update the schema of your forest. Do this by getting a Windows Server 2012 R2 ISO image and mounting it as a drive. Copy the support/adprep folder from this image or DVD to a 64 bit domain member in the same site as the Schema Master. Then run adprep /forestprep from an admin cmd prompt when logged in as a Schema Admin. The domain member needs to be a 64 bit domain joined machine for adprep.exe to run.

Wait for the schema changes to replicate around the network.

Import the cmdlets needed to configure your Active Directory for writeback by running Import-Module 'C:\Program Files\Microsoft Azure Active Directory Connect\AdPrep\AdSyncPrep.psm1' from an administrative PowerShell session. You need Azure AD Global Admin and Enterprise Admin permissions for Azure and the local AD forest respectively. The cmdlets for this are obtained by running the Azure AD Connect tool.


$accountName = "domain\aad_account" #[this is the account that will be used by Azure AD Connect Sync to manage objects in the directory, this is an account in the form of AAD_number].
Initialize-ADSyncDeviceWriteBack -AdConnectorAccount $accountName -DomainName contoso.com #[domain where devices will be created].

This will create the “Device Registration Services” node in the Configuration partition of your forest as shown:

[Screenshot: Device Registration Services node in the Configuration partition]

To see this, open Active Directory Sites and Services and from the View menu select Show Services Node. Also in the domain partition you should now see an OU called RegisteredDevices. The AADSync account now has permissions to write objects to this container as well.

In Azure AD Connect, if you get the error “This feature is disabled because there is no eligible forest with appropriate permissions for device writeback” then you need to complete the steps in this section and click Previous in the AADConnect wizard to go back to the “Connect your directories” page and then you can click Next to return to the “Optional features” page. This time the Device Writeback option will not be greyed out.

Device Writeback needs a 2012 R2 or later AD FS server and WAP to make use of the device info in the Active Directory (for example, conditional access to resources based on the user and the device they are using). Once Device Writeback is prepared for with these cmdlets and the AADConnect Synchronization Options page is enabled for Device Writeback then the following will appear in Active Directory:

[Screenshot: written-back device objects in the RegisteredDevices container in Active Directory Users and Computers]

Not shown in the above, but adding the Display Name column in Active Directory Users and Computers tells you the device name. The registered owner and registered users of the device are available to view, but as they are SID values, they are not really readable.
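
If you want to resolve those SIDs to account names, a couple of lines of PowerShell will do it (the SID below is a made-up example – use a value taken from the device object):

$sidString = "S-1-5-21-1111111111-2222222222-3333333333-1106"   # example SID from the device's registered owner
$sid = New-Object System.Security.Principal.SecurityIdentifier $sidString
$sid.Translate([System.Security.Principal.NTAccount])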

If you have multiple forests then you need to add the SCP record for the tenant in each separate forest. The above will do it for the forest AADConnect is installed in, and the below script can be used to add the SCP to other forests:

$verifiedDomain = "contoso.com"    # Replace this with any of your verified domain names in Azure AD
$tenantID = "72f988bf-86f1-41af-91ab-2d7cd011db47"    # Replace this with your tenant ID
$configNC = "CN=Configuration,DC=corp,DC=contoso,DC=com"    # Replace this with your AD configuration naming context

$de = New-Object System.DirectoryServices.DirectoryEntry
$de.Path = "LDAP://CN=Services," + $configNC

$deDRC = $de.Children.Add("CN=Device Registration Configuration", "container")
$deDRC.CommitChanges()

$deSCP = $deDRC.Children.Add("CN=62a0ff2e-97b9-4513-943f-0d221bd30080", "serviceConnectionPoint")
$deSCP.Properties["keywords"].Add("azureADName:" + $verifiedDomain)
$deSCP.Properties["keywords"].Add("azureADId:" + $tenantID)

$deSCP.CommitChanges()

Preparing for Group Writeback

Writing Office 365 “Modern Groups” back to Active Directory on-premises requires Exchange Server 2013 CU8 or later schema updates and servers installed. To create the OU and permissions required for Group Writeback you need to do the following.

Import the cmdlets needed to configure your Active Directory for writeback by running Import-Module 'C:\Program Files\Microsoft Azure Active Directory Connect\AdPrep\AdSyncPrep.psm1' from an administrative PowerShell session. You need Domain Admin permissions for the domain in the local AD forest that you will write back groups to. The cmdlets for this are obtained by running the Azure AD Connect tool.

$accountName = "domain\aad_account" #[this is the account that will be used by Azure AD Connect Sync to manage objects in the directory, this is an account usually in the form of AAD_number].
$cloudGroupOU = "OU=CloudGroups,DC=contoso,DC=com"
Initialize-ADSyncGroupWriteBack -AdConnectorAccount $accountName -GroupWriteBackContainerDN $cloudGroupOU

Once these cmdlets are run the AADSync account will have permissions to write objects to this OU. You can view the permissions for this OU in Active Directory Users and Computers if you enable Advanced Features on the View menu. There should be a permission entry for this account that is not inherited from the parent OUs.
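
To confirm the grant without clicking through the GUI, you can dump the OU's ACL and filter for the connector account (the OU and account names below are examples):

dsacls "OU=CloudGroups,DC=contoso,DC=com" | Select-String "aad_account"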

At the time of writing, the distribution list that is created on writeback from Azure AD will not appear in the Global Address List in Outlook etc., nor allow on-premises mailboxes to send to these internal-only, cloud-based groups. To fix mail flow you need to create a new subdomain, update public DNS and add send connectors to the hybrid Exchange Server; this is all outlined in https://technet.microsoft.com/en-us/library/mt668829(v=exchg.150).aspx and ensures that on-premises mailboxes can deliver to the groups as internal senders, without needing external senders enabled on the group. To add the groups to the Global Address List you need to run Update-AddressList in Exchange Server. Once group writeback has been prepared for using these cmdlets and enabled on the AADConnect Synchronization Options page, you should see the groups appearing in the selected OU as shown:

[Screenshot: written-back Office 365 Groups appearing in the CloudGroups OU]

And you should find that on-premises users can send email to these groups as well.

Preparing for Password Writeback

The option for users to change their passwords in the cloud and have them written back to on-premises (with multifactor authentication and proof of the right to change the password) is also available in Office 365 / Azure AD with the Azure Active Directory Premium or Enterprise Mobility Suite licence.

To enable password writeback for AADConnect you need to enable the Password Writeback option in AADConnect synchronization settings and then run the following three PowerShell cmdlets on the AADSync server:


Get-ADSyncConnector | fl name,AADPasswordResetConfiguration
Get-ADSyncAADPasswordResetConfiguration -Connector "contoso.onmicrosoft.com - AAD"
Set-ADSyncAADPasswordResetConfiguration -Connector "contoso.onmicrosoft.com - AAD" -Enable $true

The first of these cmdlets lists the ADSync connectors and the name and password reset state of the connector. You need the name of the AAD connector. The middle cmdlet tells you the state of password writeback on that connector and the last cmdlet enables it if needed. The name of the connector is required in these last two cmdlets.

To set the permissions on-premises for the passwords to be written back the following script is needed:

$passwordOU = "DC=contoso,DC=com" #[you can scope this down to a specific OU]
$accountName = "domain\aad_account" #[this is the account that will be used by Azure AD Connect Sync to manage objects in the directory, this is an account usually in the form of AAD_number].

$cmd = "dsacls.exe '$passwordOU' /I:S /G '`"$accountName`":CA;`"Reset Password`";user'"
Invoke-Expression $cmd | Out-Null

$cmd = "dsacls.exe '$passwordOU' /I:S /G '`"$accountName`":CA;`"Change Password`";user'"
Invoke-Expression $cmd | Out-Null

$cmd = "dsacls.exe '$passwordOU' /I:S /G '`"$accountName`":WP;lockoutTime;user'"
Invoke-Expression $cmd | Out-Null

$cmd = "dsacls.exe '$passwordOU' /I:S /G '`"$accountName`":WP;pwdLastSet;user'"
Invoke-Expression $cmd | Out-Null

Finally you need to run the above once per domain.

Preparing for Exchange Server Hybrid Writeback

Hybrid mode in Exchange Server requires the writing back of eight attributes from Azure AD to Active Directory. The list of attributes written back is found here. The following script will set these permissions for you in the OU you select (or, as shown, at the root of the domain). The DirSync tool used to do all this permissioning for you, but the AADSync tool does not, so scripts such as this are required. This script sets lots of permissions on these eight attributes, but for clarity when running the script its output is sent to Null. Remove the "| Out-Null" from the script to see the changes as they occur (the script also takes a lot longer to run that way).

$accountName = "domain\aad_account"
$HybridOU = "DC=contoso,DC=com"

#Object type: user
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;proxyAddresses;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchUCVoiceMailSettings;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchUserHoldPolicies;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchArchiveStatus;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchSafeSendersHash;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchBlockedSendersHash;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchSafeRecipientsHash;user'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msDS-ExternalDirectoryObjectID;user'"
Invoke-Expression $cmd | Out-Null

#Object type: iNetOrgPerson
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;proxyAddresses;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchUCVoiceMailSettings;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchUserHoldPolicies;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchArchiveStatus;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchSafeSendersHash;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchBlockedSendersHash;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msExchSafeRecipientsHash;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;msDS-ExternalDirectoryObjectID;iNetOrgPerson'"
Invoke-Expression $cmd | Out-Null
#Object type: group
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;proxyAddresses;group'"
Invoke-Expression $cmd | Out-Null

#Object type: contact
$cmd = "dsacls '$HybridOU' /I:S /G '`"$accountName`":WP;proxyAddresses;contact'"
Invoke-Expression $cmd | Out-Null

Finally you need to run the above once per domain.

Preparing for User Writeback

Currently in preview at the time of writing, you are able to make users in Azure Active Directory (cloud users, as Office 365 would call them) and write them back to on-premises Active Directory. The user's password is not written back and so needs changing before the user can log in on-premises.

To prepare the on-premises Active Directory to write back user objects you need to run this script. This is contained in AdSyncPrep.psm1, which is installed as part of Azure AD Connect. Azure AD Connect will install Azure AD Sync, which is needed to do the writeback. To load the AdSyncPrep.psm1 module into PowerShell run Import-Module 'C:\Program Files\Microsoft Azure Active Directory Connect\AdPrep\AdSyncPrep.psm1' from an administrative PowerShell session.

$accountName = "domain\aad_account" #[this is the account that will be used by Azure AD Connect Sync to manage objects in the directory, this is an account usually in the form of AAD_number].
$cloudUserOU = "OU=CloudUsers,DC=contoso,DC=com"
Initialize-ADSyncUserWriteBack -AdConnectorAccount $accountName -UserWriteBackContainerDN $cloudUserOU

Once the next AADSync occurs you should see users in the OU used above that match the cloud users in Office 365 as shown:

[Screenshot: written-back cloud users appearing in the CloudUsers OU]

Prepare for Windows 10 Registered Device Writeback Sync

Windows 10 devices that are joined to your domain can be synced to Azure Active Directory as registered devices, so that conditional access rules based on device ownership can be enforced. To do this you need to import the AdSyncPrep.psm1 module. This module supports the following two additional cmdlets to prepare your Active Directory for Windows 10 device sync:

CD "C:\Program Files\Microsoft Azure Active Directory Connect\AdPrep"
Import-Module .\AdSyncPrep.psm1
Initialize-ADSyncDomainJoinedComputerSync
Initialize-ADSyncNGCKeysWriteBack

These cmdlets are run as follows:

$accountName = "domain\aad_account" #[this is the account that will be used by Azure AD Connect Sync to manage objects in the directory, this is an account usually in the form of AAD_number].
$azureAdCreds = Get-Credential #[Azure Active Directory administrator account]

CD "C:\Program Files\Microsoft Azure Active Directory Connect\AdPrep"
Import-Module .\AdSyncPrep.psm1
Initialize-ADSyncDomainJoinedComputerSync -AdConnectorAccount $accountName -AzureADCredentials $azureAdCreds 
Initialize-ADSyncNGCKeysWriteBack -AdConnectorAccount $accountName 

Once complete, open Active Directory Sites and Services and from the View menu select Show Services Node. You should then see the GUID of your domain under the Device Registration Configuration container.

[Screenshot: domain GUID under the Device Registration Configuration container in Active Directory Sites and Services]

Unable To Send Exchange Quota Message


In Exchange 2013 you can sometimes see the following event log error (MSExchange Store Driver Submission, ID 1012):

The store driver failed to submit event <id> mailbox <guid> MDB <database guid> and couldn’t generate an NDR due to exception Microsoft.Exchange.MailboxTransport.StoreDriverCommon.InvalidSenderException
   at Microsoft.Exchange.MailboxTransport.Shared.SubmissionItem.SubmissionItemUtils.CopySenderTo(SubmissionItemBase submissionItem, TransportMailItem message)
   at Microsoft.Exchange.MailboxTransport.Submission.StoreDriverSubmission.MailItemSubmitter.GenerateNdrMailItem()
   at Microsoft.Exchange.MailboxTransport.Submission.StoreDriverSubmission.MailItemSubmitter.<>c__DisplayClass1.<FailedSubmissionNdrWorker>b__0()
   at Microsoft.Exchange.MailboxTransport.StoreDriverCommon.StorageExceptionHandler.RunUnderTableBasedExceptionHandler(IMessageConverter converter, StoreDriverDelegate workerFunction).

And this will be preceded by the following event log warning (MSexchangeIS, ID 1077):

The mailbox <guid> on database <database guid> is approaching its storage limit. A notification has been sent to the user. This warning will not be sent again for at least twenty four hours.

The mailbox GUID in both events is the same, and this occurs for mailboxes that have moved to Exchange Server 2013 from Exchange Server 2010 and are close to their mailbox quota. To fix the issue, move the mailbox to a different database. The easiest way to do this is New-MoveRequest <guid>, where the same GUID is used.

If you have lots of these then this is a little more time consuming, unless you get PowerShell to the rescue.

The following two cmdlets will query the last seven days of the event logs for MSExchangeIS events with ID 1077, get the event log message (which contains the mailbox GUID), manipulate the message string and generate a text file of just the mailbox GUIDs. The second cmdlet will then run a New-MoveRequest for each mailbox listed in the text file.

Get-WinEvent -ComputerName PC1 -ProviderName MSExchangeIS | where {$_.ID -eq 1077 -AND $_.TimeCreated -gt [DateTime]::Now.AddDays(-7).Date} | select @{Name="mailbox";Expression={$_.Message.Substring(12,36)}} | ft -HideTableHeaders -AutoSize | out-file nearquota.txt

and then

Get-Content .\nearquota.txt | foreach {New-MoveRequest -Identity $_}

Make sure though that your Application event log is large enough to store more than seven days of events, and then run these cmdlets per server every seven days until the issue goes away (or, over the course of say a year, move all mailboxes to different databases and that fixes it as well).

Using Office 365 PST Ingestion Service


[Updated 10th Nov 2015 with tips on managing bad items in PST files]
It's been in private preview for a while, and recently entered a free preview that any Office 365 subscriber can try. So I gave it a go and have the following tips and guidance.

Preparing to upload PST files

You can upload PST files in situ from their current location on the network; there is no requirement to first copy them to a new folder for uploading. Doing this requires a few things to be considered, including running the AzCopy process with an account that can access all the content.

AzCopy is the command line tool used to copy your PST files to Azure in advance of importing them into Office 365 mailboxes. You do not need an Azure subscription to do this, and until September 2015 this is a free service. To do this in-situ upload of PST files without first copying them to a local network staging location you should include the /Pattern: property in AzCopy. This is documented in the AzCopy help but not currently in the PST Ingestion help on TechNet (https://technet.microsoft.com/library/ms.o365.cc.IngestionHelp.aspx?v=15.1.166.0&l=1). Using AzCopy without /Pattern will upload everything in the source path. As this is a PST ingestion process, you only want *.pst as the /Pattern. When this ingestion process starts to include uploads for SharePoint, then /Pattern will of course not be as useful a value to include.

In the following example, AzCopy is reading from a folder called “C:\Shares\Users” (/source:) and looking in all subdirectories (/S) and only uploading *.pst files (/Pattern:”*.pst”).


azcopy /source:"C:\Shares\Users" /Dest:https://uniqueurl.blob.core.windows.net/ingestiondata/20150101 /DestKey:uniquekey /Pattern:"*.pst" /S /V:"c:\temp\pstIngestion20150101.log"

The data is uploaded to a folder called ingestiondata/20150101 in your Azure storage blob for the PST Ingestion process (notice there is no space after the URL and before ingestiondata, as shown on TechNet). Each file is uploaded to a subfolder of this folder that matches the folder it is located in at the source. For example, if the following folder structure existed:

[Screenshot: example source folder structure under C:\Shares\Users]

Then in Azure storage the structure would be like the following:

ingestiondata/20150101/Jenny/Outlook Files/2009/jenny2009.pst

ingestiondata/20150101/Paul/archive.pst

ingestiondata/20150101/Simon/PST Files/2009/SimonArchive.pst

ingestiondata/20150101/Simon/Archive2011.pst

Notice that the folder structure underneath the /Source: path is duplicated in Azure and, for a real-world scenario, notice that Simon has two PST files in different folders. The /Pattern property of AzCopy will find both even though they may not be where you expect them to be. The 20150101 value is just a unique value that I have used (it's a date) and that I would change for different uploads, meaning different uploads can never clash with an existing upload. TechNet suggests a name that represents the file share you set as the source value, so that two uploads from two sources cannot overwrite each other. So in my example I might do an upload on a different day and use a different date value, or I could use CUserShares to represent the local upload and FileServerHome to represent \\fileserver\home. If I used FileServer/Home (changing \ for /) then I am creating additional subdirectories in Azure storage, and this needs to be taken into account.

Preparing the PST Mapping File

Once the upload is complete – and note that this is best done overnight as it makes best use of available bandwidth – you have 30 days to import the files from Azure into your mailboxes. To do this you need to create a CSV file like the following:

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,SPFileContainer,SPManifestContainer,SPSiteUrl
Exchange,20150101/Jenny/Outlook Files/2009,jenny2009.pst,jenny@contoso.com,FALSE,Archive_jenny2009,,,
Exchange,20150101/Paul,archive.pst,paul@contoso.com,FALSE,Archive_Archive,,,
Exchange,20150101/Simon/PST Files/2009,SimonArchive.pst,simon@contoso.com,FALSE,Archive_SimonArchive,,,
Exchange,20150101/Simon,Archive2011.pst,simon@contoso.com,FALSE,Archive_Archive2011,,,

In Excel, it would look as follows:

[Screenshot: the same mapping file opened in Excel]

This has a few important elements in it. Mainly, the Name value (the PST filename) is case sensitive, which is not documented on TechNet at this time. I guess the FilePath is as well, but I did not come across that issue as I kept the case the same as the source. The Name matches the PST filename, and the FilePath matches the value after "ingestiondata" in the URL, including the path the file was uploaded from. Therefore, in my example for Jenny above – where the PST file was called "jenny2009.pst", the path on the local file server was "C:\Shares\Users\Jenny\Outlook Files\2009\", the /Source: was "C:\Shares\Users" and 20150101 was the value used in /Dest: after the URL – the FilePath in the CSV becomes "20150101/Jenny/Outlook Files/2009". That is, the CSV needs a FilePath that includes the Dest value (after ingestiondata) plus the local path below the /Source: value, with \ changed to /.

A second example, if I used the following AzCopy cmdline:


azcopy /source:"\\fileserver\home" /Dest:https://uniqueurl.blob.core.windows.net/ingestiondata/FileServer/home /DestKey:uniquekey /Pattern:"*.pst" /S /V:"c:\temp\pstIngestionFileServerHome.log"

Then I would have FilePath values in the CSV that looked like “FileServer/home/Jenny/Outlook Files/2009” (case sensitive).

Once you upload the mapping file, the PST import from Azure to the Exchange mailbox (or archive) starts. If a PST file cannot be found you get an error in the management console quite shortly after starting. The error reads as follows:

Could not find source file {0}. Please correct the FilePath column in the mapping file and create a new job with the updated mapping file

Full file path

fileserver/home/Jenny/outlook files/2009/Jenny2009.pst

In the above error I have purposely set the FilePath and PST file name to the wrong case, as that is the cause of this error (unless you did not upload the PST or the path is completely wrong). The best source for the FilePath value is the AzCopy log file (set with the /V switch). It shows the full path each file was uploaded to, in the correct case – you just need to prefix it with the string value you used after "ingestiondata" in the /Dest: switch.

All the best with removing PSTs from the network! Of course there is more to do than just what is mentioned here – you need to find the PSTs, work out who they belong to and create this mapping file accurately. There are a number of PST ingestion software companies who will do this for you. You also need to ensure that the PSTs do not contain bad items, and to control the import settings for the PST import process.

To ensure there are no bad items in the PST files (or to try to, at least) it is recommended that you scan the PST files with SCANPST.EXE (http://support.microsoft.com/en-us/kb/272227). This tool needs to be run on all PST files that you have located before you upload them, or if bandwidth is not an issue, upload and import them all and then process only those that fail.

Once SCANPST.EXE is complete, upload the fixed PST file and import it again (probably with a new mapping file). Then also tell the PST Ingestion service to continue processing items even if it finds bad items. To do this you need to configure a custom BadItemLimit once the import starts (the current BadItemLimit default is 0, which means fail at the first bad item). You will see "TooManyBadItemsPermanentException" errors in the import log file if you need to do this. To set the BadItemLimit use either of the following:

  1. Connect to Exchange Online via remote PowerShell
  2. Get-MailboxImportRequest | FL name, mailbox, status, whencreated, requestguid
  3. This returns a list of import requests. Look for the most recent and get the requestguid value.
  4. Set-MailboxImportRequest -Identity "request-guid-found-above" -BadItemLimit unlimited -AcceptLargeDataLoss

Or you can just set the BadItemLimit the same for all imports without looking for the latest one

  1. $all_import_requests = Get-MailboxImportRequest
    Foreach ($import_request in $all_import_requests)
    {
    Set-MailboxImportRequest -Identity ($import_request).requestguid -BadItemLimit unlimited -AcceptLargeDataLoss
    }

Managing Office 365 Groups With Remote PowerShell


Announced during Microsoft Ignite 2015, there are now PowerShell administration cmdlets available for the administration of the Groups feature in Office 365.

The cmdlets are all based around "UnifiedGroup", for example Get-UnifiedGroup.

Create a Group

Use New-UnifiedGroup to do this. An example would be New-UnifiedGroup -DisplayName "Sales" -Alias sales -EmailAddress sales@contoso.com

The use of the EmailAddress parameter is useful as it allows you to set a group that is not given an email address based on your default domain, but from one of the other domains in your Office 365 tenant.

Modify a Groups Settings

Use Set-UnifiedGroup to change settings such as the ability to receive emails from outside the tenant (RequireSenderAuthenticationEnabled would be $false), limit email from a whitelist (AcceptMessagesOnlyFromSendersOrMembers) and other Exchange distribution list settings such as hidden from address lists, mail tips and the like. AutoSubscribeNewMembers can be used to tell the group to email all new messages to all new members, PrimarySmtpAddress to change the email address that the group sends from.

Remove a Group

This is the new Remove-UnifiedGroup cmdlet.

Add Members to a Group

This cmdlet is Add-UnifiedGroupLinks. For example, Add-UnifiedGroupLinks sales -LinkType Members -Links brian,nicolas will add the two named members to the group. The LinkType value can be Members as shown, but also Owners to add group administrators, or Subscribers to add those who receive email sent to the group but do not get access to the group's content. To change members to owners you do not need to remove them as members; just run something like Add-UnifiedGroupLinks sales -LinkType Owners -Links brian,nicolas.

You can also pipe in a user list, for example from a CSV file, to populate a group. This would read Add-UnifiedGroupLinks sales -LinkType Members -Links $users, where $users = Get-Content username.csv would be run before it to populate the $users variable. The source of the variable can be anything done in PowerShell, as the sketch below shows.
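
Put together, a minimal sketch of that bulk add, assuming username.csv contains one alias or email address per line:

# One member per line in the source file (example file name)
$users = Get-Content .\username.csv
Add-UnifiedGroupLinks -Identity sales -LinkType Members -Links $users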

Remove Members from a Group

For this use Remove-UnifiedGroupLinks and provide the group name, the LinkType (Members, Owners or Subscribers) and the user or users to remove.

To Disable Group Creation in OWA

Use Set-OWAMailboxPolicy to disable group creation on an OWA mailbox policy (creating the policy first with New-OWAMailboxPolicy if needed) and then apply that policy to the users. For example, Set-OWAMailboxPolicy "Students" -GroupCreationEnabled $false followed by Set-CASMailbox mary -OWAMailboxPolicy Students stops the user "mary" creating groups. After the policy is assigned and propagates around the Office 365 service, the user can join and leave groups, but not create them.
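
As a sketch, the full sequence looks like the following (the policy name and user are examples):

# Create a policy, turn off Groups creation on it, then assign it to a user
New-OwaMailboxPolicy -Name "Students"
Set-OwaMailboxPolicy -Identity "Students" -GroupCreationEnabled $false
Set-CASMailbox -Identity mary -OwaMailboxPolicy "Students"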

Control Group Naming

This feature allows you to control the group name or block words from being used. It is easier to set in the Distribution Groups settings in the Exchange Control Panel than via PowerShell: in EAC go to Recipients > Groups, click the ellipsis icon (…) and select Configure Group Naming Policy. This is the same policy as for distribution groups. You can add static text to the start or end of the name, as well as dynamic text such as region.
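
If you do want to set it from PowerShell, the same policy lives on the organization configuration; a sketch with an example pattern:

# Example pattern: a "DL_" prefix, then the creator's Department attribute, then the chosen group name
Set-OrganizationConfig -DistributionGroupNamingPolicy "DL_<Department>_<GroupName>"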

Admins creating groups are not subject to this policy, and unlike DLs, if they create groups in PowerShell the policy is also not applied, so the -IgnoreNamingPolicy switch is not required.