Using Office 365 PST Ingestion Service


[Updated 10th Nov 2015 with tips on managing bad items in PST files]
It's been in private preview for a while, and recently entered a free preview that any Office 365 subscriber can try. So I gave it a go and have the following tips and guidance.

Preparing to upload PST files

You can upload PST files in situ from their current location on the network; there is no requirement to first copy them to a new folder for uploading. Doing this requires a few things to be considered, not least running the AzCopy process with an account that can access all the content.

AzCopy is the command line tool used to copy your PST files to Azure in advance of importing them into Office 365 mailboxes. You do not need an Azure subscription to do this, and until September 2015 it is a free service. To upload PST files in situ, without first copying them to a local network staging location, you should include the /Pattern: property in AzCopy. This is documented in the AzCopy help but not currently in the PST Ingestion help on TechNet (https://technet.microsoft.com/library/ms.o365.cc.IngestionHelp.aspx?v=15.1.166.0&l=1). Using AzCopy without /Pattern will upload everything in the source path; as this is a PST ingestion process, you only want *.pst as the /Pattern. When the ingestion process starts to include uploads for SharePoint, /Pattern will of course not be as useful a value to include.

In the following example, AzCopy is reading from a folder called “C:\Shares\Users” (/source:) and looking in all subdirectories (/S) and only uploading *.pst files (/Pattern:”*.pst”).


azcopy /source:"C:\Shares\Users" /Dest:https://uniqueurl.blob.core.windows.net/ingestiondata/20150101 /DestKey:uniquekey /Pattern:"*.pst" /S /V:"c:\temp\pstIngestion20150101.log"

The data is uploaded to a folder called ingestiondata/20150101 in your Azure storage blob for the PST Ingestion process (notice that, unlike the example shown on TechNet, there is no space after the URL and before ingestiondata). Each file is uploaded to a subfolder of this folder that matches the folder it is located in at the source. For example, if the following folder structure existed:

C:\Shares\Users\Jenny\Outlook Files\2009\jenny2009.pst

C:\Shares\Users\Paul\archive.pst

C:\Shares\Users\Simon\PST Files\2009\SimonArchive.pst

C:\Shares\Users\Simon\Archive2011.pst

Then in Azure storage the structure would be like the following:

ingestiondata/20150101/Jenny/Outlook Files/2009/jenny2009.pst

ingestiondata/20150101/Paul/archive.pst

ingestiondata/20150101/Simon/PST Files/2009/SimonArchive.pst

ingestiondata/20150101/Simon/Archive2011.pst

Notice that the folder structure underneath the /Source: path is duplicated to Azure and, as happens in the real world, Simon has two PST files in different folders; the /Pattern property of AzCopy will find both even though they may not be where you expect them to be. The 20150101 value is just a unique value that I have used (it's a date) and that I would change for different uploads, so that different uploads never clash with an existing upload. TechNet suggests a name that represents the file share you set as the source value, so that two uploads from two sources cannot overwrite each other. So in my example I might do an upload on a different day and use a different date value, or I could use CUserShares to represent the local upload and FileServerHome to represent \\fileserver\home. If I used FileServer/Home (changing \ for /) then I would be creating additional subdirectories in Azure storage, and this needs to be taken into account.

Preparing the PST Mapping File

Once the upload is complete – and note that this is best done overnight as it makes best use of available bandwidth – you have 30 days to import the files from Azure into your mailboxes. To do this you need to create a CSV file like the following:

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder,SPFileContainer,SPManifestContainer,SPSiteUrl
Exchange,20150101/Jenny/Outlook Files/2009,jenny2009.pst,jenny@contoso.com,FALSE,Archive_jenny2009,,,
Exchange,20150101/Paul,archive.pst,paul@contoso.com,FALSE,Archive_Archive,,,
Exchange,20150101/Simon/PST Files/2009,SimonArchive.pst,simon@contoso.com,FALSE,Archive_SimonArchive,,,
Exchange,20150101/Simon,Archive2011.pst,simon@contoso.com,FALSE,Archive_Archive2011,,,


This has a few important elements in it. Mainly, the Name value (the PST filename) is case sensitive, which is not documented in TechNet at this time. I suspect the FilePath is as well, but I did not come across that issue as I kept the case the same as the source. The Name matches the PST filename, and the FilePath matches the value after “ingestiondata” in the URL plus the path the file was uploaded from. Therefore in my example for Jenny above, where the PST file was called “jenny2009.pst”, the path on the local file server was “C:\Shares\Users\Jenny\Outlook Files\2009\”, the /Source: was “C:\Shares\Users” and 20150101 was the value in the /Dest: following the URL, the FilePath in the CSV becomes “20150101/Jenny/Outlook Files/2009”. That is, the CSV needs a FilePath that starts with the Dest value (after ingestiondata), followed by the local path below the /Source: value with \ changed to /.
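If you have a lot of PST files, building the mapping file by hand is error prone, so scripting it is worthwhile. The following is a minimal sketch of generating the rows with PowerShell; the contoso.com addresses and the owner-from-top-level-folder lookup are assumptions for illustration only – in practice you must resolve PST ownership yourself and review the output before uploading it.

# Build PST import mapping rows from the same source path used for AzCopy
$source  = "C:\Shares\Users"   # matches the AzCopy /source: value
$destTag = "20150101"          # matches the value after ingestiondata/ in /Dest:
Get-ChildItem -Path $source -Filter *.pst -Recurse | ForEach-Object {
    # Relative folder below /source:, with \ changed to /, prefixed with the Dest value
    $relative = $_.DirectoryName.Substring($source.Length).TrimStart('\') -replace '\\', '/'
    $filePath = if ($relative) { "$destTag/$relative" } else { $destTag }
    # Hypothetical owner lookup: assumes the top-level folder name is the username
    $owner = ($relative -split '/')[0].ToLower()
    [PSCustomObject]@{
        Workload            = 'Exchange'
        FilePath            = $filePath
        Name                = $_.Name   # case must match the uploaded file exactly
        Mailbox             = "$owner@contoso.com"
        IsArchive           = 'FALSE'
        TargetRootFolder    = "Archive_$($_.BaseName)"
        SPFileContainer     = ''
        SPManifestContainer = ''
        SPSiteUrl           = ''
    }
} | Export-Csv -Path "C:\temp\pstMapping.csv" -NoTypeInformation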

As a second example, if I used the following AzCopy command line:


azcopy /source:"\\fileserver\home" /Dest:https://uniqueurl.blob.core.windows.net/ingestiondata/FileServer/home /DestKey:uniquekey /Pattern:"*.pst" /S /V:"c:\temp\pstIngestionFileServerHome.log"

Then I would have FilePath values in the CSV that looked like “FileServer/home/Jenny/Outlook Files/2009” (case sensitive).

Once you upload the mapping file, the PST import from Azure to the Exchange mailbox (or archive) starts. If a PST file cannot be found then you get an error in the management console quite shortly after starting. The error reads as follows:

Could not find source file {0}. Please correct the FilePath column in the mapping file and create a new job with the updated mapping file

Full file path

fileserver/home/Jenny/outlook files/2009/Jenny2009.pst

In the above error I have purposely set the FilePath and PST file name to the wrong case, as that is the cause of this error (unless you did not upload the PST or the path is completely wrong). The best source of the correct FilePath comes from the AzCopy log file (set with the /V switch). The log does not include the string value you used after “ingestiondata” in the /Dest: switch (you need to add that yourself), but it does show the full path each file was uploaded to, with the correct case for both path and file.
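If you would rather script that check than eyeball the log, something like the following pulls the lines mentioning PST files out of the verbose log; the log format varies between AzCopy versions, so treat this as a starting point and verify the matches by eye.

# List log lines mentioning PST files, to recover the exact (case-correct) uploaded paths
Select-String -Path "c:\temp\pstIngestion20150101.log" -Pattern '\.pst' |
    ForEach-Object { $_.Line } |
    Sort-Object -Unique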

All the best with removing PSTs from the network! Of course there is more to do than just what is mentioned here – you need to find the PSTs, work out who they belong to and create the mapping file accurately. There are a number of PST ingestion software companies who will do this for you. You also need to ensure that the PSTs do not contain bad items, and to control the import settings for the PST import process.
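As a starting point for the finding and ownership problem, a simple inventory like the sketch below lists each PST on a share with its size and NTFS owner (using the \\fileserver\home share mentioned above); the NTFS owner is only a hint, not proof of who the PST belongs to.

# Inventory PST files on a share: path, size and NTFS owner
Get-ChildItem -Path "\\fileserver\home" -Filter *.pst -Recurse -ErrorAction SilentlyContinue |
    Select-Object FullName,
        @{n='SizeMB'; e={[math]::Round($_.Length / 1MB, 1)}},
        @{n='Owner';  e={(Get-Acl -Path $_.FullName).Owner}} |
    Export-Csv -Path "C:\temp\pstInventory.csv" -NoTypeInformation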

To ensure there are no bad items in the PST files (or to try to at least), it is recommended that you scan the PST files with SCANPST.EXE (http://support.microsoft.com/en-us/kb/272227). Run this tool on all the PST files you have located before you upload them or, if bandwidth is not an issue, upload and import them all first and then run it only against those that fail.

Once SCANPST.EXE is complete, upload the new PST file and import it again (a new mapping file will probably be needed). Then also tell the PST Ingestion service to continue processing items even if it finds bad items. To do this you need to configure a custom BadItemLimit once the import starts (the current BadItemLimit default is 0, which means fail at the first bad item). You will see “TooManyBadItemsPermanentException” errors in the import log file if you need to do this. To set the BadItemLimit use either of the following:

  1. Connect to Exchange Online via remote PowerShell (a sketch of the connection is shown after this list)
  2. Get-MailboxImportRequest | FL name, mailbox, status, whencreated, requestguid
  3. This returns a list of import requests. Look for the most recent and get the requestguid value.
  4. Set-MailboxImportRequest -Identity "request-guid-found-above" -BadItemLimit unlimited -AcceptLargeDataLoss
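Step 1 above glosses over the connection itself, so for reference here is a minimal sketch of an Exchange Online remote PowerShell session as it worked at the time of writing:

# Classic remote PowerShell connection to Exchange Online
$cred = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session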

Or you can set the same BadItemLimit for all imports without looking for the latest one:

$all_import_requests = Get-MailboxImportRequest
Foreach ($import_request in $all_import_requests)
{
    Set-MailboxImportRequest -Identity ($import_request).RequestGuid -BadItemLimit unlimited -AcceptLargeDataLoss
}

Enabling Microsoft Rights Management in SharePoint Online


This article is the fifth in a series of posts looking at Microsoft’s new Rights Management product set. In an earlier post we looked at turning on the feature in Office 365, and in this post we will look at protecting documents in SharePoint. This means your cloud users will have their data protected just by saving it to a document library.

The articles in this series will get lit up as they are released – so check back, or leave a comment on the first post in the series and I will let you know when new content is added.

To enable SharePoint Online to integrate with Microsoft Rights Management you need to turn on RMS in SharePoint. You do this with the following steps:

  1. Go to service settings, click sites, and then click View site collections and manage additional settings in the SharePoint admin center.
  2. Click settings and find Information Rights Management (IRM) in the list.
  3. Select Use the IRM service specified in your configuration and click Refresh IRM Settings.
  4. Click OK.

Once this is done, you can now enable selected document libraries for RMS protection.

  1. Find the document library that you want to enforce RMS protection upon and click the PAGE tab at the top left of the SharePoint site (under the Office 365 logo).
  2. Then click Library Settings.
  3. If the site is not a document library (for example, a “document center” site) you will not see the Library Settings option. For these sites, navigate to the specific document library, click the LIBRARY tab and then choose Library Settings.
  4. Click Information Rights Management.
  5. Select Restrict permissions on this library on download and add your policy title and policy description. Click SHOW OPTIONS to configure additional RMS settings on the library, and then click OK.
  6. The additional options allow you to enforce restrictions on the document library, such as RMS key caching (for offline use), and to allow the document to be shared with a group of users. This group must be mail enabled (or at least have an email address in its email address attribute) and be synced to the cloud – a quick check for this is sketched below.
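To check quickly that the group you plan to use is mail enabled and synced, Exchange Online PowerShell does the job; the group name here is hypothetical.

# Confirm the group has an SMTP address and is synced from on-premises AD
Get-DistributionGroup -Identity "RMS Library Users" | Format-List Name, PrimarySmtpAddress, IsDirSynced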

To start using the RMS functionality in SharePoint, upload a document to this library or create a new document in the library. Then download the document again – it will now be RMS protected.

Configuring Exchange On-Premises to Use Azure Rights Management


This article is the fifth in a series of posts looking at Microsoft’s new Rights Management product set. In an earlier post we looked at turning on the feature in Office 365, and in this post we will look at enabling on-premises Exchange Servers to use this cloud based RMS service. This means your cloud users and your on-premises users can share encrypted content and, as it is cloud based, you can send encrypted content to anyone even if they are not using an Office 365 mailbox.

The articles in this series will get lit up as they are released – so check back, or leave a comment on the first post in the series and I will let you know when new content is added.

Exchange Server integrates very nicely with on-premises RMS servers. To integrate Exchange on-premises with Windows Azure Rights Management you need to install a small service on-premises that connects Exchange to the cloud RMS service. On-premises file servers (for file classification) and SharePoint can also use this service to integrate themselves with cloud RMS.

You install this small service on-premises on servers that run Windows Server 2012 R2, Windows Server 2012, or Windows Server 2008 R2. After you install and configure the connector, it acts as a communications interface between the on-premises IRM-enabled servers and the cloud service. The service can be downloaded from http://www.microsoft.com/en-us/download/details.aspx?id=40839

From this download link there are three files to get onto the server you are going to use for the connector.

  • RMSConnectorSetup.exe (the connector server software)
  • GenConnectorConfig.ps1 (this automates the configuration of registry settings on your Exchange and SharePoint servers)
  • RMSConnectorAdminToolSetup_x86.exe (needed if you want to configure the connector from a 32-bit client)

Once you have installed the software you need from this set, IT and users can easily protect documents and pictures both inside your organization and outside it, without having to install additional infrastructure or establish trust relationships with other organizations.

The overview of the structure of the link between on-premises and Windows Azure Rights Management is as follows:

[diagram: on-premises IRM-enabled servers communicating with Windows Azure Rights Management via the RMS connector]

Notice therefore that there are some prerequisites. You need to have an Office 365 tenant and to turn on Windows Azure Rights Management. Once you have done this you need the following:

  • Get your Office 365 tenant up and running
  • Configure Directory Synchronization between on-premises Active Directory and Windows Azure Active Directory (the Office 365 DirSync tool)
  • Optionally (recommended but not required), enable ADFS for Office 365 to avoid having to log in to Windows Azure Rights Management when creating or opening protected content
  • Install the connector
  • Prepare credentials for configuring the software
  • Authorise the servers that will connect to the service
  • Configure load balancing to make this a highly available service
  • Configure Exchange Server on-premises to use the connector

Installing the Connector Service

  1. You need to set up an RMS administrator. This administrator is either a specific user object in Office 365 or all the members of a security group in Office 365.
    1. To do this start PowerShell and connect to the cloud RMS service by typing Import-Module aadrm and then Connect-AadrmService.
    2. Enter your Office 365 global administrator username and password.
    3. Run Add-AadrmRoleBasedAdministrator -EmailAddress <email address> -Role "GlobalAdministrator" or Add-AadrmRoleBasedAdministrator -SecurityGroupDisplayName <group name> -Role "ConnectorAdministrator". If the administrator object does not have an email address then you can look up the ObjectID with Get-MsolUser and use that instead of the email address.
  2. Create a namespace for the connector on any DNS namespace that you own. This namespace needs to be reachable from your on-premises servers, so it could be your .local (etc.) AD domain namespace – for example, rmsconnector.contoso.local, pointing at the IP address of the connector server or the load balancer VIP that you will use for the connector.
  3. Run RMSConnectorSetup.exe on the server you wish to have as the service endpoint on premises. If you are going to build a highly available solution, this software needs installing on multiple machines and can be installed in parallel. Install a single RMS connector (potentially consisting of multiple servers for high availability) per Windows Azure RMS tenant. Unlike Active Directory RMS, you do not have to install an RMS connector in each forest. Select to install the software on this computer.
  4. Read and accept the licence agreement!
  5. Enter your RMS administrator credentials as configured in the first step.
  6. Click Next to prepare the cloud for the installation of the connector.
  7. Once the cloud is ready, click Install. During the RMS installation process, all prerequisite software is validated and installed, Internet Information Services (IIS) is installed if not already present, and the connector software is installed and configured.
  8. If this is the last server that you are installing the connector service on (or the first if you are not building a highly available solution) then select Launch connector administrator console to authorize servers. If you are planning on installing more servers, do them now rather than authorising servers.
  9. To validate the connector quickly, connect to http://<connectoraddress>/_wmcs/certification/servercertification.asmx, replacing <connectoraddress> with the server address or name that has the RMS connector installed. A successful connection displays a ServerCertificationWebService page (a scripted version of this check appears after this list).
  10. For an Exchange Server organization or SharePoint farm it is recommended to create a security group (one for each) that contains the accounts that Exchange or SharePoint runs as. This way the servers all get the rights needed for RMS with minimal administrative interaction. Adding servers individually rather than to the group results in the same outcome, it just requires you to do more work. It is important that you authorize the correct object. For a server to use the connector, the account that runs the on-premises service (for example, Exchange or SharePoint) must be selected for authorization. For example, if the service is running as a configured service account, add the name of that service account to the list. If the service is running as Local System, add the name of the computer object (for example, SERVERNAME$).
    1. For servers that run Exchange: You must specify a security group and you can use the default group (DOMAIN\Exchange Servers) that Exchange automatically creates and maintains of all Exchange servers in the forest.
    2. For SharePoint you can use the SERVERNAME$ object, but the recommended configuration is to run SharePoint using a manually configured service account. For the steps for this see http://technet.microsoft.com/en-us/library/dn375964.aspx.
    3. For file servers that use File Classification Infrastructure, the associated services run as the Local System account, so you must authorize the computer account for the file servers (for example, SERVERNAME$) or a group that contains those computer accounts.
  11. Add all the required groups (or servers) to the authorization dialog and then click close. For Exchange Servers, they will get SuperUser rights to RMS (to decrypt content).
  12. If you are using a load balancer, then add all the IP addresses of the connector servers to the load balancer under a new virtual IP and publish it for TCP port 80 (and 443 if you want to configure it to use certificates) and equally distribute the data across all the servers. No affinity is required. Add a health check for the success of a HTTP or HTTPS connection to http://<connectoraddress>/_wmcs/certification/servercertification.asmx so that the load balancer fails over correctly in the event of connector server failure.
  13. To use SSL (HTTPS) to connect to the connector server, on each server that runs the RMS connector, install a server authentication certificate that contains the name that you will use for the connector. For example, if your RMS connector name that you defined in DNS is rmsconnector.contoso.com, deploy a server authentication certificate that contains rmsconnector.contoso.com in the certificate subject as the common name. Or, specify rmsconnector.contoso.com in the certificate alternative name as the DNS value. The certificate does not have to include the name of the server. Then in IIS, bind this certificate to the Default Web Site.
  14. Note that any certificate chains or CRLs for the certificates in use must be reachable.
  15. If you use proxy servers to reach the internet then see http://technet.microsoft.com/en-us/library/dn375964.aspx for steps on configuring the connector servers to reach the Windows Azure Rights Management cloud via a proxy server.
  16. Finally you need to configure the Exchange or SharePoint servers on premises to use Windows Azure Rights Management via the newly installed connector.
    • To do this you can either download and run GenConnectorConfig.ps1 on the server you want to configure, or use the same tool to generate a Group Policy script or a registry key script that can be used to deploy across multiple servers.
    • Just run the tool and at the prompt enter the URL that you have configured in DNS for the connector, followed by the parameter to make the local registry settings, the registry files or the GPO import file. Enter either http:// or https:// in front of the URL depending upon whether or not SSL is in use on the connector's IIS website.
    • For example, .\GenConnectorConfig.ps1 -ConnectorUri http://rmsconnector.contoso.com -SetExchange2013 will configure a local Exchange 2013 server
  17. If you have lots of servers to configure then run the script with -CreateRegEditFiles or -CreateGPOScript along with -ConnectorUri. This will make five reg files (for Exchange 2010 or 2013, SharePoint 2010 or 2013 and the File Classification service). For the GPO option it will make one GPO import script.
  18. Note that the connector can only be used by Exchange Server 2010 SP3 RU2 or later or Exchange 2013 CU3 or later. The OS on the server also needs to include a version of the RMS client that supports RMS Cryptographic Mode 2. This is Windows Server 2008 + KB2627272, Windows Server 2008 R2 + KB2627273, Windows Server 2012 or Windows Server 2012 R2.
  19. For Exchange Server you need to manually enable IRM as you would do if you had an on-premises RMS server. This is covered in http://technet.microsoft.com/en-us/library/dd351212.aspx, but in brief you run Set-IRMConfiguration -InternalLicensingEnabled $true. The rest, such as transport rules, OWA and search configuration, is covered in the mentioned TechNet article.
  20. Finally you can test whether RMS is working with Test-IRMConfiguration -Sender billy@contoso.com. You should get a message at the end of the test saying Pass.
  21. If you downloaded GenConnectorConfig.ps1 before May 1st 2014 then download it again, as the version before this date writes the registry keys incorrectly and you get errors such as “FAIL: Failed to verify RMS version. IRM features require AD RMS on Windows Server 2008 SP2 with the hotfixes specified in Knowledge Base article 973247” and “Microsoft.Exchange.Security.RightsManagement.RightsManagementException: Failed to get Server Info from http://rmsconnector.contoso.com/_wmcs/certification/server.asmx. —> System.Net.WebException: The request failed with HTTP status 401: Unauthorized.”. If you get these errors then turn off IRM, delete the “C:\ProgramData\Microsoft\DRM\Server” folder to remove old licences, delete the registry keys, run the latest version of GenConnectorConfig.ps1, refresh the RMS keys with Set-IRMConfiguration -RefreshServerCertificates, and reset IIS with IISRESET.
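Putting the verification pieces from steps 9, 19 and 20 together, a quick set of checks from an on-premises Exchange server might look like the following sketch. Invoke-WebRequest needs PowerShell 3.0 or later, and rmsconnector.contoso.com is the example connector name used above.

# Step 9: a successful call returns the ServerCertificationWebService page
Invoke-WebRequest -Uri "http://rmsconnector.contoso.com/_wmcs/certification/servercertification.asmx" -UseBasicParsing | Select-Object StatusCode

# Steps 19 and 20, run from the Exchange Management Shell
Set-IRMConfiguration -InternalLicensingEnabled $true
Test-IRMConfiguration -Sender billy@contoso.com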

Now you can encrypt messages on-premises using your AADRM licence, without needing an RMS server deployed locally.

Configuring Citrix Netscaler for SharePoint SSL Offloading


I came across an interesting issue today and found there was not a lot of info on the web about it, so as with lots of things on this blog, I thought I would document it here.

The scenario was that SSL (HTTPS) connections from outside the company to their SharePoint site are required – so no HTTP connections. But inside the company it is all HTTP connections to different SharePoint sites! Therefore SharePoint had been set up such that the Citrix Netscaler does the SSL offloading and presents an HTTP connection to SharePoint, but SharePoint knows to return HTTPS in all the URLs so that connections from outside keep working.

The problem here was that users on the outside had been typing in the host name for the site without the https:// prefix, and as the Netscaler was not listening on port 80 the connection was timing out. The client wanted the HTTP connection to redirect to the HTTPS version of the site.

So first the redirection. This is quite easy – you create a new virtual server for HTTP (TCP port 80) that is not bound to any service but is listening on the same IP address that the HTTPS virtual server is bound to. In the advanced tab of the HTTP virtual server you enter the value to redirect to if the service is unavailable. This URL should be the https:// URL for the site.

Once the HTTP virtual server is configured it will appear as down as there are no services bound to it – this is fine. As the virtual server is down it will redirect to the specified URL. Note that rewriting rules can also be used to achieve the same end.
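For those who prefer the NetScaler CLI to the GUI, the redirect virtual server looks roughly like this. The vserver name and VIP are examples, and the syntax is from memory of the 10.x firmware, so verify it against your version.

add lb vserver vs-sharepoint-http-redirect HTTP 203.0.113.10 80
set lb vserver vs-sharepoint-http-redirect -redirectURL "https://sharepoint.contoso.com/"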

The following picture shows the HTTP and HTTPS service. The HTTPS service is listening on 443 but going to a service on port 80 – in other words SSL offloading.

[screenshot: the HTTP and HTTPS virtual servers]

And this one shows the 443 listener connecting to the HTTP service

[screenshot: the HTTPS virtual server configuration]

The problem with this is that the user connects on port 80 and is redirected to port 443, which is offloaded to port 80 on SharePoint; as SharePoint does not know about the SSL offload in place, it returns an HTTP link to the home page, so the user is redirected to the home page on port 80, which redirects to port 443, which goes to port 80 on SharePoint, which returns another HTTP link… and so on, forever. Internet Explorer and Chrome just keep going around and around and never get anywhere. If you turn on the developer tools in both browsers, IE will show you each redirect on the network tab, while Chrome will just show it failing to connect. It was from IE that we saw the issue.

We need to tell the HTTPS virtual server to add a header to the session telling SharePoint that SSL offloading is in place. This header is “front-end-https” and the value is “on”. Neither the header nor the value is case sensitive.

To add this header to the HTTP request (that is, the connection from the user to SharePoint), go to the Policies tab of the HTTPS virtual server and add a new Rewrite Request policy.

The following picture shows this in place; what you need to do is add a rewrite action to this rewrite policy and then check that the action is working.

[screenshot: the rewrite policy bound to the HTTPS virtual server]

The rewrite policy needs a name and a new action. The action needs a name and a type. The type is INSERT_HTTP_HEADER and the header name is “front-end-https”. Note this is not case sensitive, but also note it is not Front_End_Https, which has been used with Exchange Server in the past. The value for the header goes in quotes and is “on”.

If you click Evaluate to check the action you will need to enter a test expression and an HTTP sequence of data. For testing purposes I use !HTTP.REQ.HEADER(“front-end-https”).EXISTS, which evaluates to true if the header does not exist. To test the rule enter HEAD fqdn followed by a line feed – the blank line is important or you get protocol errors. If the HTTP protocol text does not contain front-end-https: on then the rule will evaluate to true and the header will be added.
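Again for CLI fans, the equivalent rewrite action, policy and binding look roughly like the following; the names are examples and the HTTPS virtual server is assumed to be called vs-sharepoint-https.

add rewrite action act-insert-front-end-https insert_http_header front-end-https "\"on\""
add rewrite policy pol-insert-front-end-https "!HTTP.REQ.HEADER(\"front-end-https\").EXISTS" act-insert-front-end-https
bind lb vserver vs-sharepoint-https -policyName pol-insert-front-end-https -priority 100 -type REQUEST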

The following picture shows the action settings:

[screenshot: the rewrite action settings]

And the following shows the rule settings:

[screenshot: the rewrite policy rule settings]

Once your policy has been created you should be able to browse to the HTTP address for SharePoint, be redirected to HTTPS/SSL, and have SharePoint know that the SSL is offloaded to the load balancer – and so respond with HTTPS links even though its connection was over port 80.