sharepointwookiee.com is now inhifistereo.com

Living in Wisconsin, we get snowed in from time to time. It’s nowhere near as bad as this Russian town (Real Russian Winter), but this weekend we got about 6 inches. With ample time on my hands, I figured it was time to do some internet housecleaning.

Consolidated Twitter handles

Managing @spwookiee and @inhifistereo was a lot. I have used inhifistereo as a handle since the days of AOL, so I feel a pretty tight attachment to it. But @spwookiee had the larger following. I did a swap similar to THIS and was off and running.

Renamed Blog

Along the same lines as the Twitter handle swap, I just wasn’t feeling sharepointwookiee.com. I had always planned to do something Star Wars-esque with the site but never got around to it. Additionally, I’m doing more BI and CRM work these days on top of my SharePoint work, and it felt kind of weird posting CRM stuff on a blog with SharePoint in the title. Plus, choosing another domain made the next change a little easier.

Goodbye GoDaddy, Hello WordPress on Azure

I use Windows Azure a lot at work. I fall in love a little more every time I log in to the portal. So why not use WordPress on Azure for my blog hosting too? I get way more control, and it’s a little cheaper.

I wanted completely out of GoDaddy’s grip, so I transferred my domains to gandi.net. I’m way happier because everything GoDaddy charges a la carte for is included in gandi.net’s purchase price. So take that, GoDaddy.

Pros and Cons

As with any decision, there are pros and cons. I listed most of the pros above. I can think of two cons. 1) A dip in viewership/following. The new URL means folks will have to update their RSS readers; hopefully I can find a way to redirect. 2) New tech. I use Azure fairly often, but I don’t know everything about it, so I’ll assume there will be a learning curve of some sort.

The whole switch took me about 8 hours over the course of 2 weeks. I did a little here and there when I had the time, but the biggest chunk happened over this weekend. I’m pretty happy with the results so far.

Azure Storage test drive

For the last year and a half I’ve been taking one class a semester at MATC in Madison. Having been trained as a Technical Writer, I’ve basically learned all this sysadmin stuff “on the job,” so I figured it would be a good idea to fill in the blanks for the stuff I haven’t learned yet. The classes require an external hard drive to house and manage the VMs you use during labs and tests. Being 30 and having a full-time job allows me to buy really cool, really fast hardware to satisfy this class requirement. I opted for a 128GB Vertex 4. This thing SCREAMS; I get labs done in record time.

So how am I supposed to get my homework done if a spaghetti and meatball tornado comes through and wipes out the lower half of Wisconsin, taking my external hard drive with it?

TO THE CLOUD!

I’ve been using Azure at work for a variety of things, so I figured I’d give this a try. I have three VMs, and with each of them zipped up individually I have about 16 GB total to upload to Azure.

There are 3 Azure storage basics you need to know about: storage accounts, containers, and blobs. A storage account is the first thing you need in order to get started.

The storage account sets up the subdomain you’ll use to communicate with your storage objects: yourstorage.blob.core.windows.net (with matching endpoints for tables and queues). You also set the affinity group (location) where your content will be stored.
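If you’d rather script the storage account instead of clicking through the portal, here’s a minimal sketch using the Windows Azure PowerShell cmdlets. It assumes you’ve already installed the module and imported your subscription; the affinity group name is just a placeholder.

```powershell
# Create the storage account. The name must be globally unique and lowercase.
# "MyAffinityGroup" is a placeholder - use your own affinity group, or -Location instead.
New-AzureStorageAccount -StorageAccountName "inhifistereo" `
    -AffinityGroup "MyAffinityGroup"
```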

Once your storage account is up, you’ll need a container. Think of a container as a folder (only it’s not a folder, it’s a container). It holds your blobs, i.e. binary large objects, a.k.a. your files. More on that in a bit.

  • Click Storage in the left-hand navigation.
  • Click the storage account name (my account is called “inhifistereo”; you can call yours whatever you like).
  • Click Containers at the top of the page, then click New at the bottom of the page.
  • Give the container a name and choose Public or Private.

Private is just that: private, meaning you have to be logged on (or have a shared access signature, but that’s fodder for another blog post) to access your stuff. A public container is cool because you can access it from anywhere as long as you have the URL. Click the checkmark and we’re good to go.
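If you’d rather create the container from PowerShell instead of the portal, here’s a minimal sketch. It assumes the Windows Azure PowerShell module’s storage cmdlets are available; the container name is just an example.

```powershell
# Grab the account key and build a storage context to authenticate the calls.
$key = (Get-AzureStorageKey -StorageAccountName "inhifistereo").Primary
$ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey $key

# Permission "Off" = private; "Blob" or "Container" = public read access via URL.
New-AzureStorageContainer -Name "vmbackups" -Permission Off -Context $ctx
```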

So a container is a container (like a folder, only it’s a container), and a blob is a file. The part that took me a second to understand is that this storage isn’t a file share up in the cloud; it’s the basic building block of storage in the cloud. A container dictates the access method, and a blob is the big ol’ file that sits within the container.

Now to get content up to Azure. You could write a console app, use PowerShell, or use a third-party tool. For this exercise, I opted for a third-party tool; there are several others out there as well.
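If you go the PowerShell route instead, a single upload looks roughly like this. It reuses the $ctx storage context from the container example above; the file path and blob name are placeholders.

```powershell
# Upload one of the zipped VMs to the container as a block blob.
Set-AzureStorageBlobContent -File "D:\Backups\LabVM1.zip" `
    -Container "vmbackups" `
    -Blob "LabVM1.zip" `
    -Context $ctx
```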

The files took basically all day to upload, for several reasons. For one, I have the most basic broadband package Charter offers. But I’m not doing this for a living or every day, so the time is no big deal. I’m charged by the GB, not the minute, so even if it took several days, no biggie. But I’m not getting any younger…

Following this blog post, I did learn that the tools above do not upload in parallel, which is why it took so long.
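For what it’s worth, the PowerShell cmdlet does expose a concurrency knob. I haven’t benchmarked it, so treat this as something to experiment with rather than a guaranteed speedup; the file names are placeholders.

```powershell
# -ConcurrentTaskCount controls how many concurrent network calls the cmdlet
# makes while it uploads the blocks of a single blob.
Set-AzureStorageBlobContent -File "D:\Backups\LabVM2.zip" `
    -Container "vmbackups" `
    -Blob "LabVM2.zip" `
    -Context $ctx `
    -ConcurrentTaskCount 20
```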

“But David, you have SkyDrive and Dropbox accounts along with a hosting account. Why use Azure Storage?” Why not!? The real beauty of Azure Storage is that I only pay for what I use, and I pay pennies at that. SkyDrive and Dropbox require a yearly commitment, and college classes only last 18 weeks, so when the class is over I can blow the container away and I don’t get charged anymore. I don’t plan on ever using these backups, so they’re cheap insurance. Now, having said that, Azure Storage (and Amazon, and Google, etc.) isn’t really set up for consumer usage. But I’m not your typical consumer.

I’ll give PowerShell a shot next time, and I’ll probably try Amazon as well to see if there are any performance differences. If I’m feeling really ambitious I may try writing a console app.

Price-wise, I’ve been charged a total of 15 US cents so far. I may have to go raid the couch cushions…

SharePoint on Azure: several lessons learned

We got back from Convergence on Friday. I had a good time overall: good food, good times, crazy things to see. Never did make it to Acme Oyster House (sorry, Dad!).

Meanwhile, we’ve migrated a public SharePoint site to Azure. I should say we’ve migrated a SharePoint 2010 internet site to SharePoint 2013 running in the 14-hive. Many of the colleagues I’ve spoken to throughout the SharePoint community had plenty to say about that.

In the last few weeks we’ve learned some valuable lessons. Here they are (in no particular order):

  • Search scopes – I chose not to migrate the Search databases because migrating the content DB was hard enough. Plus, when we complete the upgrade to 2013 I’ll have just one DB to focus on. In doing so, I don’t have scopes anymore since they’ve been deprecated in SharePoint 2013; you can’t even use PowerShell to add them. The fix was to revert to the All Sites scope. It isn’t the end of the world, though, because the Search service application is smart enough to see which variation you’re searching from and serve up that site’s content. For example, if you search for bikes on the German site, you’ll get German site content back rather than UK or US content.
  • SharePoint Designer – Someone wanted a quick change to a page layout. The good news is that only one page in the overall site uses that page layout. SharePoint Designer 2010 works, but I was unable to add another web part zone to the page layout. As a workaround I added the HTML directly to the page layout. Again, not the end of the world, but it definitely isn’t what I would want to do. Adding a web part zone would have allowed me to drop in additional content in the future, or replace it altogether, from Edit Page rather than by editing the page layout.
  • compat.browser config – We never bothered with mobile sites in SharePoint 2010 on this site. All we did was turn off the mobile browsers in the compat.browser config. I did the same for SharePoint 2013 (set all isMobileDevice values to false, as in the snippet after this list) and then reset IIS. However, this did not result in success; I got hit with a vague SharePoint error. I started looking on the interwebs for help and ran across this: LINK. I followed Option 2 and hit pay dirt. It does seem out there to have to drop a statement into the overall web.config, but I had to get things working. Once we migrate to SharePoint 2013 we’ll remove the statement and make use of device channels, but since we’re still running in the 14-hive I don’t get that functionality quite yet.
  • Azure IaaS growing pains – IaaS is still in preview, so you’re subject to wonkiness and issues beyond your control. A few days ago we experienced an outage on the site: SharePoint couldn’t talk to SQL for some reason. I logged on to the SQL box and couldn’t even connect to the SQL instance. The service was running, but still no dice. Time for a restart and, Yahtzee!, everything was better. I went through the logs as best I could but had no idea what I was looking for. Come to find out, Azure had pushed down restarts, and SQL came back before the AD instance did, so nothing was authenticating properly. HUGE lesson learned there. The best way to overcome this is to start using availability sets: link to documentation. There’s a quick PowerShell sketch after this list.
  • Calculated Column issues – A user came to me with this one. They noticed that a calculated column was throwing a string of characters into the column. I went and checked the column settings and, miraculously, the issue was gone. Then I edited another item and got the exact same string again, so it would appear it wasn’t the formula; rather, something was going on in the DB. Luckily I was at Convergence, and there were a handful of SharePoint support folks in the expo hall. I ran by their booth and showed them the issue. Thankfully this is a known issue, and installing the March PU will fix everything: link to PU. One other note: according to the KB, you MUST install this PU if you ever hope to install a future CU.
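For reference, the compat.browser change mentioned above just flips the isMobileDevice capability on each mobile browser entry in the web application’s App_Browsers\compat.browser file. The browser id below is only an example of what those entries look like:

```xml
<browser id="iPhoneSafari" parentID="AppleSafari">
  <capabilities>
    <!-- false = serve the full desktop rendering instead of the mobile view -->
    <capability name="isMobileDevice" value="false" />
  </capabilities>
</browser>
```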
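And here’s the availability set sketch I mentioned. If I’m reading the docs right, an existing IaaS VM can be dropped into an availability set with the service management cmdlets; the service, VM, and set names below are made up, and the update restarts the VM, so plan accordingly.

```powershell
# Put the SQL VM into an availability set so Azure staggers host maintenance
# restarts instead of taking the whole farm down at once.
Get-AzureVM -ServiceName "ContosoFarm" -Name "SQL01" |
    Set-AzureAvailabilitySet -AvailabilitySetName "SqlAvailabilitySet" |
    Update-AzureVM
```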

That’s about it for now.