Azure Storage test drive

For the last year and a half I’ve been taking one class a semester at MATC in Madison. Having been trained as a Technical Writer, I’ve basically learned all this sysadmin stuff “on the job,” so I figured it would be a good idea to fill in the blanks for the stuff I haven’t learned yet. The classes require an external hard drive to house and manage the VMs you use during labs and tests. Being 30 and having a full-time job allows me to buy really cool, really fast hardware to satisfy this requirement. I opted for a 128GB Vertex 4. This thing SCREAMS. I get labs done in record time.

So how am I supposed to get my homework done if a spaghetti and meatball tornado comes through and wipes out the lower half of Wisconsin, taking my external hard drive with it?

TO THE CLOUD!

I’ve been using Azure at work for a variety of things, so I figured I’d give this a try. I have 3 VMs, and with each one zipped up individually, about 16GB total to upload to Azure.

There are 3 Azure storage basics you need to know about: storage accounts, containers, and blobs. A storage account is the first thing you need in order to get started.

The storage account sets up the subdomain you’ll use to be able to communicate with your storage objects: yourstorage.*.core.windows.net. You also set the affinity group (location) where your content will be stored.

Once your storage account is up, you’ll need a container. Think of a container as a folder – only it’s not a folder – it’s a container. It holds your blobs – binary large objects (i.e. your files). More on that in a bit.

  1. Click Storage in the left-hand navigation
  2. Click on the storage account name (my account is called “inhifistereo,” you can call yours whatever you like)
  3. Click Containers at the top of the page, then click New at the bottom of the page
  4. Give the container a name and choose Public or Private

Private is just that: private. Meaning you have to be logged on (or have a Shared access key, but that’s fodder for another blog post) to access your stuff. A public container is cool because you can access it from anywhere as long as you have the URL. Click the checkmark and we’re good to go.

So a container is a container – like a folder, only it’s a container. And a blob is a file. The part that took me a second to understand is this storage isn’t like a fileshare up in the cloud. It’s the basic building blocks of storage in the cloud. A container dictates the access method, and a blob is the big ‘ole file that sits within the container.

Now to get content up to Azure. You could write a console app, use PowerShell, or use a third-party tool. For this exercise, I opted for a third-party tool, though there are plenty of others out there.
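
If you’d rather skip the tools, PowerShell can do it in a few lines. Here’s a rough sketch using the Azure storage cmdlets – the account key, container name, and file path below are placeholders, so adjust to taste:

#Build a storage context from your account name and key (the key is in the portal)
$ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey "<your storage key>"

#Create a private container to hold the backups
New-AzureStorageContainer -Name "vmbackups" -Permission Off -Context $ctx

#Push a zipped VM up as a blob
Set-AzureStorageBlobContent -File "E:\VMs\Lab-DC1.zip" -Container "vmbackups" -Context $ctx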

The files took me basically all day to upload. There were several reasons for this. For one, I have the most basic broadband package Charter offers. But since I’m not doing this for a living or every day, the time is no big deal. I’m charged by the GB, not the minute, so if it took several days, no biggie. But I’m not getting any younger…

Following this blog post, I learned that the tools above do not upload in parallel, which is why it took so long.
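
That said, nothing stops you from rolling your own parallelism. Here’s a quick sketch building on the cmdlets above (same placeholder account and paths), with one background job per archive:

#Kick off one background job per zipped VM
$files = Get-ChildItem "E:\VMs\*.zip"
$jobs = foreach ($file in $files) {
    Start-Job -ArgumentList $file.FullName -ScriptBlock {
        param($path)
        #Storage contexts don't travel across job boundaries, so each job builds its own
        $ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey "<your storage key>"
        Set-AzureStorageBlobContent -File $path -Container "vmbackups" -Context $ctx
    }
}

#Wait for all the uploads to finish and show the results
$jobs | Wait-Job | Receive-Job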

“But David, you have a SkyDrive and Dropbox account along with a hosting account. Why use Azure Storage?” Why not!? The real beauty of Azure storage is that I only pay for what I use, and I pay pennies at that. SkyDrive and Dropbox require a yearly commitment, and college classes only last 18 weeks. So when the class is over I can blow the container away and I don’t get charged anymore. I don’t plan on ever needing these backups, so they’re cheap insurance. Now having said that, Azure storage (and Amazon, and Google, etc.) isn’t really set up for consumer usage. But I’m not your typical consumer.

I’ll give PowerShell a shot next time and probably try Amazon as well to see if there are any performance differences. If I’m feeling really ambitious I may try doing a console app.

Price-wise, I’ve been charged a total of 15 US cents so far. I may have to go raid the couch cushions…

CRM + SharePoint * Excel Services = Epic Awesomeness

What’s the best way to get someone to eat their vegetables? Force feed them 😉

All kidding aside, CRM can be a pretty powerful tool, and when people don’t want to use it we have to find creative ways to get them on board. On top of that, we have people who are absolutely married to Excel. And these aren’t your typical Excel files. These are files that legends are made of. Crazy formulas, VLOOKUPs galore; you name it, we use it. To make matters worse, they e-mail these Excel reports around all day, every day as attachments. So let’s kill two birds with one stone: we stop a big group of people from e-mailing Excel attachments, and we get them to use CRM. Win-win for everyone (or at least that’s the hope).

  1. Save the Excel file in an easy-to-find, easy-to-access place in SharePoint – doing this in SharePoint gives us all the doc mgmt benefits that we’ve come to know and love
  2. Configure your Excel REST API URL – I’ve made it pretty clear in the past that I love SharePoint’s REST APIs, and the Excel API is no exception (I’ll break down the URL anatomy after this list). You can read more about it here: http://msdn.microsoft.com/en-us/library/ee556413(v=office.14).aspx
  3. Create a new Web Resource in CRM – we’re going to iframe our Excel REST API URL call
  4. Choose Web Page (HTML) as the Type and then click “Text Editor”
  5. Click the Source tab
  6. Paste in your iframe code between the <body> tags. Your code should look something like this:
    <iframe src="https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html" frameborder="0" width="100%" height="600"></iframe>
    • Note the frameborder, height, and width attributes. These are needed to eliminate the nasty border and to make scrolling work correctly. iframes aren’t perfect and getting them to work feels “hacky,” but the user won’t know the difference and it should perform relatively seamlessly in all browsers.
  7. Click Publish
  8. Now, navigate to the desired dashboard and add your new Web Resource, click Save, and Publish.
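
For reference, here’s the general anatomy of that Excel REST URL – everything in angle brackets is a placeholder:

https://<server>/<site>/_vti_bin/ExcelRest.aspx/<document library>/<workbook>.xlsx/Model/Ranges('<Named Range>')?$format=html

Swap Ranges('…') for Charts('…') or Tables('…') to pull a chart or a table out of the workbook instead.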

Users should now see the Excel spreadsheet in their dashboard.

If users do not have access to the spreadsheet, they should encounter an “Error: Access Denied” prompt or a blank screen, depending on the browser they use.

Extra Credit

In our case, the Excel spreadsheet scrolled FOREVER. I wanted to give users a pleasurable experience but I also didn’t necessarily want them resorting to Excel on the client right away. I added a “Click to View in a separate Window” link in the iframe Web resource. Here’s what my code looked like:

<p><a href="https://dude.crm.dynamics.com/WebResources/new_iframe2">Click to View in separate Window</a></p>
<iframe src="https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html" frameborder="0" width="100%" height="600"></iframe>

All HTML Web Resources are web pages, so I linked directly to the web page. But notice I linked to new_iframe2? I didn’t want users seeing “Click to View” on every page, so I made an identical web resource with the hyperlink removed from the top, making for a seamless experience for the user. There are all sorts of other things I could have done on the new_iframe2 page. I could have linked to Excel Web Access or even directly to Excel itself, but we’ll leave it like that for now.

Ultimately, I’ve gotten the report builders to stop e-mailing this specific report as an attachment, and now the audience of the spreadsheet has to go to CRM to view it rather than getting it e-mailed to them. Awesome.

Obligatory SharePoint 2013 Search PowerShell post

Maybe you’re still kicking the tires on SharePoint 2013, installing it for the first time. Maybe you’re in the throes of planning your migration. Maybe you’re a consultant who’s been stuck on a SharePoint 2010 project for the last 18 months. Either way, we all have to face the music sooner or later and upgrade to SharePoint 2013. When you do, you’ll have to set up a Search service. It’s not as bad as you’d think. And if you have at least 3 servers in your farm (1 app and 2 WFEs), then this script will work for you. Without further delay:

#Config Section
$APP1 = "App1"
$WFE1 = "WFE1"
$WFE2 = "WFE2"
$SearchAppPoolName = "SearchServiceAppPool"
$SearchAppPoolAccountName = "domain\SearchSvc"
$SearchServiceName = "SharePoint Search Service"
$SearchServiceProxyName = "SharePoint Search Service Proxy"
$DatabaseServer = "DBserver"
$DatabaseName = "SP_Search_AdminDB" 

#Create a Search Service Application Pool
$spAppPool = New-SPServiceApplicationPool -Name $SearchAppPoolName -Account $SearchAppPoolAccountName -Verbose 

#Start Search Service Instances on the Application Server
Start-SPEnterpriseSearchServiceInstance $App1 -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $App1 -ErrorAction SilentlyContinue

#Start Search Service Instance on WFEs
Start-SPEnterpriseSearchServiceInstance $WFE1 -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchServiceInstance $WFE2 -ErrorAction SilentlyContinue

#Create Search Service Application
$ServiceApplication = New-SPEnterpriseSearchServiceApplication -Partitioned -Name $SearchServiceName -ApplicationPool $spAppPool.Name -DatabaseServer $DatabaseServer -DatabaseName $DatabaseName 

#Create Search Service Proxy
New-SPEnterpriseSearchServiceApplicationProxy -Partitioned -Name $SearchServiceProxyName -SearchApplication $ServiceApplication
$clone = $ServiceApplication.ActiveTopology.Clone()

#Set variables for component creation
$App1SSI = Get-SPEnterpriseSearchServiceInstance -Identity $App1
$WFE1SSI = Get-SPEnterpriseSearchServiceInstance -Identity $WFE1
$WFE2SSI = Get-SPEnterpriseSearchServiceInstance -Identity $WFE2

#Create Admin component
New-SPEnterpriseSearchAdminComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Content Processing component
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Analytics Processing component
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Crawl component
New-SPEnterpriseSearchCrawlComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Query Processing component
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Set the primary and replica index locations; ensure these drives and folders exist on the WFEs
$PrimaryIndexLocation = "C:\SPSearch"
$ReplicaIndexLocation = "C:\SPSearchReplica"

#We need two index partitions and a replica for each partition. Follow the sequence.
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE1SSI -RootDirectory $PrimaryIndexLocation -IndexPartition 0
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE2SSI -RootDirectory $ReplicaIndexLocation -IndexPartition 0
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE1SSI -RootDirectory $PrimaryIndexLocation -IndexPartition 1
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE2SSI -RootDirectory $ReplicaIndexLocation -IndexPartition 1

$clone.Activate()

#Verify Search Topology
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchTopology -Active -SearchApplication $ssa

This script is actually pretty basic. 90% of the components end up on your app box, while the index partitions live on your WFEs. You could provision the service on one box or any combination you see fit. That’s the beauty of this model. For me, I like having the index partitions closest to where people will be searching (i.e. the WFEs).

The script should take anywhere from 10-30 minutes to run, maybe longer depending on your hardware. Once it’s done, navigate to your Search service in Central Admin and check the Search Topology section to confirm everything landed where it should.

The most important thing to remember when using this script is to create the C:\SPSearch and C:\SPSearchReplica directories on your WFEs PRIOR to running it. The script will fail if you don’t do this, and it’s a pain to clean up after, so create the directories first. Next time I set up an environment I’ll probably write in a check to see if the directories exist – and if they don’t, go ahead and create them.
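
Something like this would do it, assuming PowerShell remoting is enabled on the WFEs (the server names come from the config section above):

#Create the index directories on the WFEs if they aren't already there
foreach ($server in @($WFE1, $WFE2)) {
    Invoke-Command -ComputerName $server -ScriptBlock {
        foreach ($path in @("C:\SPSearch", "C:\SPSearchReplica")) {
            if (-not (Test-Path $path)) {
                New-Item -Path $path -ItemType Directory | Out-Null
            }
        }
    }
}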

Delete a user in the ASP.NET membership provider

I have a SharePoint farm where I use the ASP.NET membership provider. Now and again I need to remove users due to separations, job changes, etc. so they can’t access SharePoint.

Like all of you I have to research this thing anew every time I have to do this. But no more! My future self will thank me.

Using the aspnet_Users_DeleteUser stored proc, we can remove users via SSMS.

USE [database name]

EXEC [dbo].[aspnet_Users_DeleteUser]
@ApplicationName = '[Application Name]',
@UserName = '[Username]',
@TablesToDeleteFrom = 15,
@NumTablesDeletedFrom = 0

GO

@TablesToDeleteFrom can be a bit confusing when you open the stored proc up. It’s actually a bit mask. I typically have to remove users entirely, so I use 15 (all four tables), but this blog post details some additional options for that parameter: http://vsproblemssolved.blogspot.com/2007/01/using-sqlmembershipprovider.html

@NumTablesDeletedFrom is an output parameter: the stored proc sets it to the number of tables the user was actually deleted from. You can declare a variable and pass it with the OUTPUT keyword to inspect the result, but I’m not that ambitious.
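
If you are feeling that ambitious, the call would look something like this (same placeholders as above):

DECLARE @NumDeleted int

EXEC [dbo].[aspnet_Users_DeleteUser]
@ApplicationName = '[Application Name]',
@UserName = '[Username]',
@TablesToDeleteFrom = 15,
@NumTablesDeletedFrom = @NumDeleted OUTPUT

SELECT @NumDeleted AS TablesDeletedFrom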

PerformancePoint 2013 and Tabular data sources #fail

I spin up my spiffy new SharePoint 2013 environment, migrate my PerformancePoint databases, and then try to hit the dashboards. I’m greeted by all kinds of errors. Take note: we’re using more and more tabular SSAS data sources at Trek.

Fast forward a few days. I open a ticket with Microsoft and begin troubleshooting. The engineer was a helpful chap. He had an idea from the get-go what the problem was, but wanted to check a few things in the environment before we went and started installing stuff.

Long story short: you need to install ADOMD.NET from the SQL Server 2008 R2 Feature Pack if you want to hit tabular data sources (LINK – you’ll find the correct package towards the bottom of the Install Instructions section). #lamesville #sql2012hasbeenoutforalmostayear

I asked the engineer to send me the TechNet article stating this, to which he replied, “Wish I could.” Nowhere in any of Microsoft’s documentation does it state that you need to install this feature pack in order to use tabular data sources in SP 2013. He did, however, send me this blog post, so kudos to him for that: http://blogs.technet.com/b/microsoft_in_education/archive/2013/04/29/configuring-performancepoint-in-sharepoint-2013.aspx

Pre-req installer may not progress past IIS configuration

I’m pissed. Like, in disbelief pissed. Kinda like my buddy Tracy.

Anyone seen this KB? The Products Preparation Tool in SharePoint Server 2013 may not progress past “Configuring Application Server Role, Web Server (IIS) Role”

The KB lists 2 possible workarounds: 1) install a hotfix, or 2) run a whole bunch of PowerShell that requires the OS .iso to be readily available. I’ve tried both with little to no success. I especially see issues when it comes to AppFabric and the Distributed Cache.

So you can imagine my disbelief when the true workaround is to install the IIS role first, before running the pre-req installer. Do that and the pre-req installer runs just fine (at least on Server 2012). Son of a…
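
On Server 2012, getting IIS in place first is one line of PowerShell from an elevated prompt (a minimal version – add any extra role services your farm needs):

#Install the Web Server (IIS) role, then run prerequisiteinstaller.exe
Install-WindowsFeature -Name Web-Server -IncludeManagementTools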

SharePoint on Azure: several lessons learned

We got back from Convergence on Friday. I had a good time overall; good food, good times, crazy things to see. Never did make it to Acme Oyster House (sorry dad!).

Meanwhile, we’ve migrated a public SharePoint site to Azure. I should say we’ve migrated a SharePoint 2010 internet site to SharePoint 2013 running in the 14-hive. Many of the colleagues I’ve spoken to throughout the SharePoint community have had plenty to say about that.

In the last few weeks we’ve learned some valuable lessons. Here they are (in no particular order):

  • Search scopes – I chose not to migrate the Search databases because migrating the content DB was hard enough. Plus, when we complete the upgrade to 2013 I’ll only have 1 DB to focus on. In doing so, I don’t have scopes anymore, since they’ve been deprecated in SharePoint 2013; you can’t even use PowerShell to add them. The fix was to revert back to the All Sites scope. It isn’t the end of the world, though, because the Search Service App is smart enough to see the variation you’re searching from and serve up that site’s content. For example, if you search for bikes on the German site, you’ll get German site content back, rather than UK or US content.
  • SharePoint Designer – Someone wanted a quick change to a page layout. The good news here is that only 1 page in the overall site uses that page layout. SharePoint Designer 2010 works, but I was unable to add another web part zone to the page layout. As a workaround I added the HTML directly to the page layout. Again, not the end of the world, but it definitely isn’t what I would want to do. Adding a web part zone would have allowed me to drop in additional content in the future, or replace it altogether from the Edit Page rather than editing the page layout.
  • compat.browser config – We never bothered with mobile sites in SharePoint 2010 on this site. All we did was turn off the mobile browsers in the compat.browser config. I did the same for SharePoint 2013 (set all isMobileDevice attributes to false), then reset IIS. However, this did not result in success: I got hit with a vague SharePoint error. I started looking on the interwebs for help and ran across this: LINK. Followed Option 2 and hit pay dirt. It does seem out there to have to drop a statement into the overall web.config, but I had to get things working. Once we migrate to SharePoint 2013 we’ll remove the statement and make use of device channels, but since we’re still running in the 14-hive I don’t get that functionality quite yet.
  • Azure IaaS growing pains – IaaS is still in preview, and thus you’re subject to wonkiness and issues beyond your control. A few days ago we experienced an outage on the site. SharePoint couldn’t talk to SQL for some reason. I logged on to the SQL box and couldn’t even connect to the SQL instance. The service was running, but still no dice. Well, time for a restart and – Yahtzee! – everything was better. I went through the logs as best I could but had no idea what I was looking for. Come to find out, Azure had pushed down restarts. In doing so, SQL came back before the AD instance did, so nothing was authenticating properly. HUGE lesson learned there. The best way to overcome this is to start using Availability Sets (link to documentation); there’s a sketch of how that looks after this list.
  • Calculated column issues – A user came to me with this one. They noticed that a calculated column was throwing a string of garbage characters into the column. I went and checked the column settings, and miraculously the issue was gone. Then I edited another item and got the exact same string again. So it would appear it wasn’t the formula; rather, something was going on in the DB. Luckily I was at Convergence and there were a handful of SharePoint support folks in the Expo hall. I ran by their booth and showed them the issue. Thankfully this is a known issue, and installing the March PU will fix everything: link to PU. One other note: according to the KB, you MUST install this PU if you ever hope to install a future CU.
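
For what it’s worth, here’s roughly what putting a pair of VMs into an availability set looks like with the Azure cmdlets. The cloud service, VM, and set names here are made up – the point is that redundant VMs (say, two DCs) in the same set won’t be rebooted at the same time during planned maintenance:

#Add each redundant VM to the same availability set
Get-AzureVM -ServiceName "mycloudsvc" -Name "DC01" | Set-AzureAvailabilitySet -AvailabilitySetName "AD-AvailSet" | Update-AzureVM
Get-AzureVM -ServiceName "mycloudsvc" -Name "DC02" | Set-AzureAvailabilitySet -AvailabilitySetName "AD-AvailSet" | Update-AzureVM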

That’s about it for now.

#CONV13

The blog has been quiet for a little too long. So much for that blog challenge (point 3 in this blog post)…

This week I’ll be in New Orleans for #CONV13 (aka Convergence 2013), the conference for the Dynamics side of Microsoft. I’m here specifically trolling for CRM content and learning all I can so I can take it back to Trek.

Today’s events so far: Steve and I left MKE at around 1:30 PM and arrived in MSY about 2 hours and change later. Baggage claim took a little while longer than expected. Hopped in a cab and 25 minutes later we’re in our hotel. Figured I’d throw the laptop up and put a blog post together before we head out for dinner.

I’m thinking Acme Oyster House tonight.

Adam Richman from Man vs. Food put down 15 dozen oysters there. I don’t think I’ll be taking on that challenge tonight. What I do know is that when Steve and I come to town, you’re usually guaranteed a good time.

Overall, I’m pretty stoked for this conference. Should give me a good chance to connect with folks in this space and hopefully learn a thing or two. Might have a libation or two as well 😉

If you’re in town give me a shout out on Twitter @spwookiee or shoot me an e-mail at david_peterson@trekbikes.com. I’m always game for a bite and/or a beer.