SharePoint 2013/SSRS 2014 Trace Logs (and SSRS 2012 too)

ReportServer trace logs (described here: https://technet.microsoft.com/en-US/library/ms156500(v=sql.120).aspx) can grow quite large during an upgrade. They are a good candidate to move to a secondary drive along with IIS logs, ULS logs, etc.

These logs are stored in the following location (and, according to a friend at Microsoft, this is the only place you can find the build version of SSRS your SharePoint environment is running): C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebServices\LogFiles

Note: They are in this location for SharePoint 2010 (in the 14 hive) and SharePoint 2013 with either SSRS 2012 or 2014 running in SharePoint Integrated mode. If this is a native-mode SSRS instance and you somehow found my blog, everything still applies. Just look here for the trace logs instead:

C:\Program Files\Microsoft SQL Server\MSRS12.MSSQLSERVER\Reporting Services\LogFiles

The default max log file size is 32 MB and the default retention is 14 days. These can be tweaked at the following location:

C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebServices\Reporting\rsreportserver.config

You could potentially add a custom property called Directory and set it to your secondary drive. Here are those default values I talked about:
<RStrace>
<add name="FileName" value="ReportServerService" />
<add name="FileSizeLimitMb" value="32" />
<add name="KeepFilesForDays" value="14" />
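
If you do want these on a secondary drive, here's a minimal PowerShell sketch of that Directory idea. I'm assuming the RStrace section accepts a Directory value as described above, and E:\Logs\SSRS is just an example path..back the file up first:

# Add a Directory value under RStrace to push SSRS trace logs to a secondary drive
$config = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebServices\Reporting\rsreportserver.config"
Copy-Item $config "$config.bak"   # safety net before editing

[xml]$xml = Get-Content $config
$add = $xml.CreateElement("add")
$add.SetAttribute("name", "Directory")
$add.SetAttribute("value", "E:\Logs\SSRS")   # your secondary drive here
$xml.SelectSingleNode("//RStrace").AppendChild($add) | Out-Null
$xml.Save($config)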

SharePoint 2013/SSRS 2014 – Error Activating Reporting Services Integration Feature

I was contacted about a BI site without SSRS content types the other day. I sent them this document and we went through everything on it: https://msdn.microsoft.com/en-us/library/bb326289(v=sql.120).aspx

When trying to deactivate and reactivate the Reporting Services Integration feature, we got the following error..Bwomp:

The content type with Id 0x010100C3676CDFA2F24E1D949A8BF2B06F6B8B defined in feature {e8389ec7-70fd-4179-a1c4-6fcb4342d7a0} was found in the current site collection or in a subsite.

First, we tried using brute force with some PowerShell magic:
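
Something along these lines (the site URL is a placeholder; the GUID comes straight from the error above):

# Force-activate the Reporting Services Integration feature on the site collection
Enable-SPFeature -Identity "e8389ec7-70fd-4179-a1c4-6fcb4342d7a0" -Url "http://sharepoint/sites/bi" -Force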

This successfully activated the feature, but still no content types.

I then went on a PowerShell-ing magical journey to see if I could find if SharePoint was lying to me. It was…
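
Here's a reconstructed sketch of that script (the web application URL is a placeholder; the feature GUID and content type ID come from the error above):

# Line 1: is the Reporting Services Integration feature enabled on any other site collection?
Get-SPSite -WebApplication "http://sharepoint" -Limit All |
    Where-Object { Get-SPFeature -Identity "e8389ec7-70fd-4179-a1c4-6fcb4342d7a0" -Site $_ -ErrorAction SilentlyContinue }

# Then hunt the entire web app for the content type ID from the error message
$ctId = "0x010100C3676CDFA2F24E1D949A8BF2B06F6B8B"
Get-SPSite -WebApplication "http://sharepoint" -Limit All | ForEach-Object {
    foreach ($web in $_.AllWebs) {
        $web.ContentTypes | Where-Object { $_.Id.ToString() -eq $ctId } |
            ForEach-Object { "$($web.Url) : $($_.Name)" }
        $web.Dispose()
    }
    $_.Dispose()
}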

*This script searches the entire web app to see if it can find a content type with ID 0x010100C3676CDFA2F24E1D949A8BF2B06F6B8B. The first line looks to see if the Reporting Services Integration feature is enabled anywhere else.

Since this returned nothing (and I did a few manual checks to make sure PowerShell wasn't lying to me too. Trust issues..I know), I did some searching online and found some recommended fixes:

  • Most blogs state to use the -Force switch like I did above. Even though it successfully activates the feature..still no content types
  • Tried clearing the SharePoint config cache (the standard routine, sketched below)
  • Tried repairing the Reporting Services add-in on ALL servers
  • Did a SharePoint rain dance..just kidding..maybe
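
For reference, the config cache clear was the standard routine..a sketch, using the default paths (run on each server in the farm, where there is normally a single GUID-named folder under Config):

# Stop the timer service, delete the cached XML files, reset cache.ini to 1, restart
Stop-Service SPTimerV4

$cacheDir = Get-ChildItem "C:\ProgramData\Microsoft\SharePoint\Config" -Directory |
    Where-Object { Test-Path (Join-Path $_.FullName "cache.ini") }
Get-ChildItem $cacheDir.FullName -Filter "*.xml" | Remove-Item
Set-Content -Path (Join-Path $cacheDir.FullName "cache.ini") -Value "1"

Start-Service SPTimerV4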

Then I found this awesome official Microsoft article on the Reporting Services Add-In Installation: https://msdn.microsoft.com/en-us/library/aa905871.aspx

There is an area of this article that talks about using a two-step install to troubleshoot issues. This was the Golden Ticket…A normal repair didn't work, but this two-step install/repair did the trick. I only needed to do the following steps on the SharePoint server running SSRS (this specific farm had two servers, and these commands did not need to be run on the other one).

I fired up the command prompt (as admin) and changed directories to the location of the rsSharePoint.msi file (the SSRS add-in install file..you can get this right out of the SQL installation media or go here and grab the appropriate one: https://msdn.microsoft.com/en-us/library/gg426282.aspx#bkmk_sql11sp1)

Msiexec.exe /i rsSharePoint.msi SKIPCA=1

This popped up the Reporting Services add-in installation wizard. I clicked Repair as I did before and it completed successfully. The SKIPCA=1 parameter skips installing the Reporting Services custom actions and puts another install file in the %TEMP% location, i.e. C:\Users\<your name>\AppData\Local\Temp

With the same command prompt window open, I changed directories to this location and ran the following command:

.\rsCustomAction.exe /i

This is what it should look like on your end..

[Screenshot: running rsCustomAction.exe /i from the command prompt]

After that I checked out the BI site’s site content types and look at those sexy beasts:

[Screenshot: the SSRS content types showing up in Site Content Types]

SharePoint/Azure ACS Token Signing Certificate. Will you please just sign my tokens?!

Setting up Azure ACS was fun. It's so easy to get it up/running/connected to SharePoint, and you have the instant satisfaction of using Microsoft/Google/Facebook accounts to log in to SharePoint. Great success! Note: Microsoft only gives you the UPN claim..which is a unique ID, so when users log in it looks gross. Google and Facebook are able to pull in a lot more claims..but Microsoft is more secure in that fashion, I suppose.

Anyways, there is great documentation out there already on how to get rocking and rolling with the initial setup. What there isn't much documentation on is the Token Signing Certificate. Most of what is out there states to use a self-signed certificate for DEV and get a certificate from a commercial Certificate Authority for PROD. Alrighty then. Here's the screen in Azure:

[Screenshot: the ACS Token Signing Certificate page in Azure]

Not knowing too much about the ADFS token signing cert space (in the past, most environments I have worked with used AD CS/PKI to generate these), I took to the interwebs.

The reason I was researching is that if I were to put in a CSR for acstenant.accesscontrol.windows.net, I wouldn't get it or it would get revoked…I don't own windows.net. Companies like Comodo have a DCV (Domain Control Validation) questionnaire built right into the certificate purchasing process. For the self-signed cert you can use whatever you want.

I researched to see if Azure ACS could have a friendly name or DNS CNAME that we could pull the cert for. NOOPE!

http://stackoverflow.com/questions/16589648/can-i-have-a-friendly-name-in-an-acs-service-namespace

I found a great tool by Steve Peschka that allows you to export the token signing certificate right out of ACS. The ACS tenant is already an HTTPS site, so there is a preexisting cert. SWEEET! It works like a charm too..

https://samlman.wordpress.com/2015/03/02/tool-to-get-token-signing-certificate-out-of-acs/
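
If you just want to grab the cert the tenant presents on its HTTPS endpoint yourself, here's a rough PowerShell sketch of the same idea..the tenant name and output path are placeholders, and Steve's tool above is the polished route:

# Pull the certificate presented by the ACS tenant's HTTPS endpoint and save it as a .cer
$tenant = "acstenant.accesscontrol.windows.net"   # your namespace here
$client = New-Object System.Net.Sockets.TcpClient($tenant, 443)
$ssl = New-Object System.Net.Security.SslStream($client.GetStream())
$ssl.AuthenticateAsClient($tenant)
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($ssl.RemoteCertificate)
[System.IO.File]::WriteAllBytes("C:\temp\acstenant.cer",
    $cert.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert))
$client.Close()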

This specific client had their heart set on using the commercial certificate authority so I kept trucking.

The certificate for ACS is described in detail here: https://msdn.microsoft.com/en-us/library/gg185932.aspx

Alright, I was still not sure what subject name to use..until I found this forum post: https://social.msdn.microsoft.com/Forums/vstudio/en-US/0dc942cd-ced1-4d09-9f10-73e325c241a9/adfs-installation-and-token-signning-certficate?forum=Geneva

Frank Lesniak had the answer I was looking for (This was for ADFS, but still applied to ACS):

**I’m just copying his answer in here in case the forum post ever gets deleted

  1. The certificate’s key length should be at least 2048 bits.
  2. Validity period should be as long as possible (given cost), up to 5 years
  3. The signing algorithm should be either SHA-1 or SHA-256. If you need to support ADFS 1.x legacy federation, Windows 2000, Windows XP SP2, or Windows Server 2003, use SHA-1. Otherwise, for best security, use SHA-256. You may need to call your publically-trusted certificate issuer to validate the signing algorithm.
  4. Ensure that the private key is exportable
  5. Subject name does not matter… but something like adfstoken.yourdomainname.com would be a common implementation.
  6. Key usage does not matter.

The key points being #5 and #6 – ADFS does not care what you name the certificate or what kind of certificate is being used (i.e. code signing, server authentication, client authentication, etc.). My advice would be to generate a certificate however you’d normally feel comfortable doing so. For example, many of my clients use IIS to generate the certificate signing request (CSR), then submit the CSR to the commercial CA. Once you’ve loaded the certificate into the computer store, it should be available for AD FS to use.
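
For the DEV/self-signed side, here's a minimal sketch matching those specs..this needs the newer PKI module (Windows 10/Server 2016+), and the subject name is just the convention from #5:

# Self-signed token signing cert for DEV: 2048-bit, exportable, ~5 year validity
New-SelfSignedCertificate -DnsName "acstoken.yourdomain.com" `
    -CertStoreLocation "Cert:\LocalMachine\My" `
    -KeyLength 2048 `
    -KeyExportPolicy Exportable `
    -NotAfter (Get-Date).AddYears(5)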


In summary – it doesn't matter! Use acstoken.yourdomain.com, or if you're already rocking a wildcard cert for everything, use that.

SharePoint 2013/SSRS 2014 – HTTPException Request Timed Out

Here's the scenario – SharePoint 2013 with SSRS 2014 on a small three-tier farm: 1 app server (running SSRS), 1 web server, and 1 SQL server. This farm had been running smoothly for quite some time and then started sporadically receiving HTTPException Request Timed Out errors. It seemed to only affect 1 specific report (the largest/most used report in the farm), as I was able to run other reports while that one was acting up.

The end users would just see the typical SSRS loading screen until the 110-second timeout kicked in, and then the user was presented with a "Request Timed Out" error and a correlation ID. In the eventvwr application log I could see this:

Process information:
   Process ID: XXX
   Process name: w3wp.exe
   Account name: Domain\App Pool Account

Exception information:
   Exception type: HttpException
   Exception message: Request timed out.

After some digging I noticed that the page file on the system had been modified to a static size of 4 GB. After changing this to system managed, everything started working perfectly (Note: You could also use the Microsoft recommendation of 150% of system RAM – http://blogs.msdn.com/b/chaun/archive/2014/07/09/recommendations-for-page-files-on-sharepoint-servers.aspx). Recently, I have also seen search crawls stop working (running continuously for 10+ days) because the page file had been set to a very low value. Moral of the story – make sure your page file is large enough!
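
If you'd rather script the fix, here's a small sketch that flips the setting back to system managed (takes effect after a reboot):

# Set the page file back to "Automatically manage paging file size for all drives"
$cs = Get-WmiObject Win32_ComputerSystem -EnableAllPrivileges
$cs.AutomaticManagedPagefile = $true
$cs.Put() | Out-Null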

-AJB

SharePoint Upgrade – Incoming E-mail Issues

Here's another fun scenario: a SharePoint 2007 to 2010 upgrade that heavily relies on incoming e-mail. When migrating/upgrading the content database, the incoming e-mail settings are retained, and you can see them by browsing to your favorite list or library of choice. Yay!..well, kinda. It doesn't work..I felt like Clark Griswold trying to light up his house in Christmas Vacation. The incoming e-mail alias is ALSO kept in the SharePoint configuration database, which means the content database has everything you need but the config database is out of sync. You can fix this manually by turning the ability to receive e-mail off and back on for each list/library…NO THANK YOU. As Russ declined to check each bulb individually..I respectfully declined that offer here as well.

PowerShell to the rescue! There is a RefreshEmailEnabledObjects() method you can call on an SPSite object to bring your SharePoint farm back into perfect harmony..just like the old Coca-Cola commercial used to say (just a pop culture drop day today).
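
The loop itself is tiny..a sketch, run from the SharePoint Management Shell:

# Re-sync incoming e-mail settings from the content databases into the config database
Get-SPSite -Limit All | ForEach-Object {
    $_.RefreshEmailEnabledObjects()
    $_.Dispose()
}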

If you'd rather not write your own script to loop through all SharePoint site collections, you don't have to reinvent the wheel: Salaudeen Rajack at www.sharepointdiary.com has already done this for you: http://www.sharepointdiary.com/2013/08/fix-incoming-emails-not-working-issue-in-migration.html

SharePoint Foundation 2013 SP1 Bits..Diagnostic Data Provider Timer Jobs Enabled

I don’t know if this was a one-off thing, but I figured I’d share just in case. It’s even possible someone turned these jobs on without notifying anyone..though nobody has fessed up yet! If I run through another SPF13 install soon I’ll be sure to update the post.

I have confirmed that SharePoint Server 2013 does NOT enable the Diagnostic Data Provider timer jobs by default, and an RTM SharePoint Foundation 2013 install has the same behavior. Recently I ran through a SharePoint Foundation 2013 SP1 install (ISO pulled from VLSC)..and after a few weeks noticed the Usage Logging database was growing out of control! Looking over the timer jobs I saw that all Diagnostic Data Provider timer jobs were turned on:

[Screenshot: the Diagnostic Data Provider timer jobs, all enabled]

That explains it..These jobs are normally disabled, as they aggregate a lot of different information/logs from SharePoint and put it all into one central location/database. We usually only turn them on for "health checks" or when troubleshooting an issue and wanting a complete snapshot of the farm. I turned them off and trimmed up the usage data using this method: http://blogs.msdn.com/b/manhar/archive/2012/04/17/how-to-reduce-the-size-of-logging-database-or-how-to-purge-the-old-data-from-logging-database.aspx
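
Turning them all off is a one-liner..a sketch, assuming the jobs carry the usual "Diagnostic Data Provider" display names:

# Disable every Diagnostic Data Provider timer job in the farm
Get-SPTimerJob | Where-Object { $_.DisplayName -like "Diagnostic Data Provider*" } |
    ForEach-Object { Disable-SPTimerJob -Identity $_ }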

Note: The link above cleared up only about 10 GB of data..leaving me with a still-gigantic Usage Logging database. It did trim some items – page requests, feature usage, etc. – but apparently there isn't any way I could find (without SQL queries) to clear the diagnostic data out of the DB. You can either wait for the retention period to kick in..or, if the data isn't important, create a new usage database and delete the old one using the Set-SPUsageApplication PowerShell cmdlet explained here: https://technet.microsoft.com/en-us/library/Ff607641.aspx
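
That last option is a one-liner too..the server and database names below are placeholders, and the old database sticks around in SQL until you drop it:

# Point the usage and health service application at a brand-new database
Set-SPUsageApplication -DatabaseServer "SQLSERVER" -DatabaseName "SharePoint_UsageAndHealth_New"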

SharePoint 2013 – “2010 Mode” Site Collection Search Scopes

One migration tidbit to note when going from 2010 to 2013: search scopes are contained in the Search service database..NOT the content database. This means that if a site heavily relies on search scopes..and you are choosing to keep that site in "2010 mode" (not generally recommended, but it sometimes makes sense), then you will need to upgrade the Search database as well. Sites running in "2010 mode" can use existing scopes, but you cannot create new search scopes after the content database is upgraded to SharePoint 2013. Side note – if the site collection is upgraded to SharePoint 2013, you can use the fancy shmancy new result sources instead.

The search database can be upgraded using the following PowerShell cmdlet:

Restore-SPEnterpriseSearchServiceApplication

More about this cmdlet here: https://technet.microsoft.com/en-us/library/ff608131.aspx
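
The TechNet article has the full walkthrough, but the shape of the call is roughly this..a sketch where the names are placeholders and DatabaseName points at the restored 2010 search admin DB:

# Create a new Search service application on top of the restored search admin database
$appPool = Get-SPServiceApplicationPool "SharePoint Web Services Default"
$searchInst = Get-SPEnterpriseSearchServiceInstance -Local
Restore-SPEnterpriseSearchServiceApplication -Name "Search Service Application" `
    -ApplicationPool $appPool `
    -AdminSearchServiceInstance $searchInst `
    -DatabaseName "Search_Service_Application_DB" `
    -DatabaseServer "SQLSERVER"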

This process is rock solid…kind of. It doesn't give you GUIDs, but the restored search database names come out in the following format:

  • <Search Service Application Name>_AnalyticsReportingDB
  • <Search Service Application Name>_CrawlDB

I had a DB naming convention in place, and this format did NOT match it. The search admin DB (the one I restored) was renamed as I went through the SQL backup/restore process, so it already had the naming down. For the rest, I used the process described here to get everything nice and clean: http://www.andrewjbillings.com/sharepoint-2013-foundation-creating-the-search-service-with-powershell-and-removing-those-pesky-guids/

Search database names were clean..search scopes were showing up. Life is good.

SharePoint 2013 – Another FIPS 140-2 Adventure “The encryption type requested is not supported by the KDC”

Oh, Federal Information Processing Standard 140-2 AKA FIPS 140-2…You got me again! See the original post here:

http://www.andrewjbillings.com/fips-compliance-keep-away-from-sharepoint/

As stated in several official Microsoft documents, SharePoint uses several Windows encryption algorithms for computing hash values that do not comply with FIPS 140-2..therefore you CANNOT enable the FipsAlgorithmPolicy registry key.
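
A quick way to check whether the policy is actually enforced on a given box:

# 0 (or missing) = FIPS mode off; 1 = enforced, which SharePoint cannot live with
Get-ItemProperty "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\FipsAlgorithmPolicy" -Name Enabled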

In this situation we found that the FIPS group policy had been applied, so we retracted it and SharePoint was "back up." Sites were loading, but I was seeing a bunch of errors related to search and Distributed Cache in the eventvwr/ULS logs:

System.ServiceModel.Security.SecurityNegotiationException: A call to SSPI failed, see inner exception. ---> System.Security.Authentication.AuthenticationException: A call to SSPI failed, see inner exception. ---> System.ComponentModel.Win32Exception: The encryption type requested is not supported by the KDC

This led me to the following blog post: http://blogs.msdn.com/b/openspecification/archive/2011/05/31/windows-configurations-for-kerberos-supported-encryption-type.aspx

In this scenario 3 things still needed to happen:

First, IIS Crypto had been run and set to FIPS 140-2 mode. This needed to be reverted, as it blocks MD5 hashes.

Then, the AD attribute msDS-SupportedEncryptionTypes needed to be changed from 24 to 28 on all SharePoint 2013 server computer accounts. The value of 24 does NOT include RC4-HMAC, the MD5-based encryption type..which SharePoint desperately needs.
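
Here's a sketch of that attribute change using the AD module..the server name is a placeholder:

Import-Module ActiveDirectory

# 24 = AES128 + AES256 only; 28 adds RC4-HMAC (the MD5-based encryption type)
Set-ADComputer -Identity "SPSERVER01" -Replace @{"msDS-SupportedEncryptionTypes" = 28}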

[Screenshot: the msDS-SupportedEncryptionTypes attribute in AD]

After this was set, the Local Security Policy needed some tweaking ("Network security: Configure encryption types allowed for Kerberos").

It had been set to the following (blocking MD5):

[Screenshot: the Kerberos encryption types policy with the MD5 types unchecked]

Once we checked ALL the boxes containing MD5, we were back up and running..search was working, Distributed Cache was happy..SharePoint was happy..we were all pretty happy, in fact :)

-AJB


SQL GDR Update Breaks SharePoint 2013/SQL 2014 SharePoint-Integrated SSRS

The other day the following patch was applied to a SharePoint server running SQL Server Reporting Services 2014:

[Screenshot: the installed GDR update, KB3045324]

Information about this GDR: https://support.microsoft.com/en-us/kb/3045324

This was all fine and dandy until we tried to run a report and got the following error:

  • An unexpected error occurred in Report Processing. (rsInternalError)
  • Could not load file or assembly 'Microsoft.ReportingServices.ProcessingObjectModel, Version=12.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. Access is denied.

After seeing an access denied error message, my gut reaction was to run the PowerShell command to re-secure resources: Initialize-SPResourceSecurity

This didn’t fix the issue..I ended up coming across the following forum post..Apparently this issue also happened in SQL 2012:  https://social.msdn.microsoft.com/Forums/sharepoint/en-US/5a34109a-4792-4983-9242-8573575bb727/sql-server-reporting-services-2012-sharepoint-integrated-mode-error?forum=sqlreportingservices

The fix was the following:

  1. Backup the encryption key for the SSRS Service Application (see the sketch after this list)
  2. Note any other customizations (SMTP server, execution account, administrators, etc.) and WRITE THESE DOWN..or take screenshots. Screenshots are good
  3. Delete the SSRS Service Application (uncheck the box to delete the associated data..)
  4. Create a new SSRS Service Application. I used the same name, same Report Server database, same application pool, etc.
  5. Restore the encryption key
  6. Make any changes noted in step 2
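
For steps 1 and 5, the SSRS add-in ships PowerShell cmdlets for the encryption key..a hedged sketch, assuming the documented SPRS cmdlets are on the box; the service application name, password, and file path are placeholders:

# Step 1: back up the SSRS encryption key before deleting the service application
$app = Get-SPRSServiceApplication | Where-Object { $_.Name -eq "SSRS Service Application" }
$keyPwd = ConvertTo-SecureString "P@ssw0rd!" -AsPlainText -Force
Backup-SPRSEncryptionKey -ReportingServiceApplication $app -Password $keyPwd -KeyPath "C:\keys\ssrs-key.bak"

# Step 5: after recreating the service application, restore the key onto it
$newApp = Get-SPRSServiceApplication | Where-Object { $_.Name -eq "SSRS Service Application" }
Restore-SPRSEncryptionKey -ReportingServiceApplication $newApp -Password $keyPwd -KeyPath "C:\keys\ssrs-key.bak"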


Everything should be back up and running.