Skytap vs. CloudShare

Since I got to Trek a year ago, we’ve been using cloud providers for our dev and test environments. It’s very slick: I don’t have to sit around and wait for a new box. I just log in, spin up my VMs, and off I go. Consider these providers your own personal Easy button for dev/test infrastructure.

Based on my experiences, I felt compelled to provide a comparison of the two. Please bear in mind that this is my opinion and your experiences may differ from mine. Let the cage match begin!

 

CloudShare

CloudShare’s bread and butter is SharePoint: pre-configured, fully licensed VMs complete with all the necessary service apps, accounts, and site collections. CloudShare also offers many other VMs to choose from, including Windows Server, Exchange, Oracle, CentOS, Ubuntu, and Windows 7. There are two account types: CloudShare Pro ($49/month) and CloudShare Enterprise (negotiable based on the number of users). Each image is fully built and ready to use.

Pros

  • Pre-configured VMs with absolutely no administration needed to get started. This gives CloudShare much more of a PaaS feel; however, you are free to manipulate the servers any way you like, making it a full IaaS option
  • Fully licensed. No need for an MSDN or TechNet subscription, as licensing is already factored into the monthly fee
  • Easy to add VMs to the localized domain, though there’s not much control beyond that
  • Friendliest UI available
  • Flat cost for Pro plan at $49 per month / $490 per year

Cons

  • Little freedom to manipulate the network/domain. You cannot set up multiple subnets or add a firewall, so CloudShare is geared much more toward application development than architecture testing
  • Zero support (yet) for Red Hat or SUSE, which makes Hadoop deployment more difficult to accomplish
  • Limited run time. On the Pro plan, images can only be on for 180 minutes at a time. There is an option to upgrade to "Always On," but the pricing appears to be exponentially higher per month
  • No easy way to snapshot an environment
  • Difficult to import your own images. You must contact sales to do so, and it’s more trouble than it’s worth

 

Skytap

Skytap offers complete flexibility, giving you everything from raw images to pre-built, multi-VM configurations. Monthly pricing varies based on the base package of resources as well as the utilization rate. Your own OS and app licenses are required.

Pros

  • Full, granular control of the environment including (but not limited to): networking, firewalls, group policy, domain administration, server administration, application administration, & user administration
  • Easy to import your own images, although it can take a considerable amount of time due to file sizes
  • Straightforward to snap a copy of all or part of an environment and rebuild it within a matter of minutes
  • Big advantages in build scripting and the ability to schedule usage via Google Calendar
  • Nice selection of public Skytap templates to choose from (Red Hat, SUSE, Windows 7, Windows Server, etc.)
  • Convenient point-to-point VPN connections are available, and Skytap lets you federate VMs to your own domain

Cons

  • Minor nuisance – must possess your own licenses
  • Limited shelf-life for images – environments are not meant to be permanent or "Always on"
  • Difficult to manage large environments due to the web-based management console
  • Zero system-level alerting available

Both providers work off a resource pool model, meaning you pay for a specific amount of RAM that is shared amongst your VMs. CloudShare Pro starts you at 10 GB of RAM, 300 GB of disk, and 10 virtual CPUs; CloudShare Enterprise and Skytap allocations are both negotiable (Skytap pools start at 20 GB).

 

Winner

Winner: Skytap. The flexibility offered makes this provider king. CloudShare is too one-dimensional, as it is 90% focused on SharePoint and SharePoint developers. While building Skytap images is far more time-consuming than CloudShare, the ability to create templates means you only need to set up your environment once. Improvements I’d like to see from Skytap include fully licensed VMs, more granular storage control, and more pre-built application and/or service configurations (Visual Studio, Exchange, Oracle, System Center, etc.).

Anyone using these services or others? Would love to hear about your experiences in the comments below.

Auto post to Yammer from SharePoint

I haven’t talked much about Yammer here, but we’re using it pretty extensively at Trek. It’s definitely a game changer if used correctly and often. The best part is that Yammer integrates with SharePoint, but one missing feature is auto-posting to Yammer. Manual posting is there, and it works pretty slick, but the absence of any programmatic posting is a barrier to adoption for some of our groups.

Originally we attempted to accomplish this via a custom SharePoint Designer action, which seemed like the most reasonable approach to try first. We scoped it as a sandbox solution, which forced us to build a full-trust proxy since we were using email as our delivery vehicle. Needless to say, the attempt failed – partly because of Yammer’s issues with its email feature. Things got so bad that Yammer had to re-architect its whole approach to e-mailing. So we shelved the idea.

Fast forward a few months: email to Yammer was working, and a few groups were pushing for the functionality we had tried earlier. I sat down with Tim and tried to architect this logically. What to do? Object model, SSIS package, event receiver, OData? Lots of options, but which one is best? I kept to my old path of using SharePoint tools to solve this, so I traveled down the client object model road. Why? Because it sounds cool.

Long story short: Epic. Fail.

The client object model – in this case, anyway – just isn’t very good at grabbing the Created By user’s email address and plugging it into an email. The bigger issue, though, was that I was focused on the solution and not the problem. Lesson learned: when developing code, the goal should be to remove as many external dependencies as possible. The object model puts a big, fat dependency on SharePoint’s code that could come back to haunt me in the future. Enter OData. Developed by Microsoft, it’s a standard data access method that isn’t going anywhere for the foreseeable future. Sweet.

Next I had to figure out how to flag items for processing and then – once processed – how to flag them so the app knows never to process them again. Enter custom columns and content types. Enable management of content types in your list and add two Choice columns, SendToYammer and Processed, each with the choices Yes and No. (I originally tried a Yes/No column, but it appears the data types differ between the two column types.) I set the default on both columns to No, then hid the Processed column in the content type so that only the app would touch it.
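For what it’s worth, I did the column setup through the browser. If you’d rather script it, the client object model (fine for one-time setup, whatever I said about it above) can do it. Here’s a rough sketch – the site URL and list name are hypothetical:

using System;
using Microsoft.SharePoint.Client;

class AddYammerColumns
{
    static void Main()
    {
        // Hypothetical site and list - swap in your own
        var ctx = new ClientContext("http://site");
        List list = ctx.Web.Lists.GetByTitle("Announcements");

        // Two Choice columns (not Yes/No columns - the data types differ),
        // both defaulting to "No"
        string[] fieldsXml =
        {
            "<Field Type='Choice' DisplayName='SendToYammer'><Default>No</Default>" +
                "<CHOICES><CHOICE>Yes</CHOICE><CHOICE>No</CHOICE></CHOICES></Field>",
            "<Field Type='Choice' DisplayName='Processed'><Default>No</Default>" +
                "<CHOICES><CHOICE>Yes</CHOICE><CHOICE>No</CHOICE></CHOICES></Field>"
        };

        foreach (string xml in fieldsXml)
            list.Fields.AddFieldAsXml(xml, true, AddFieldOptions.DefaultValue);

        ctx.ExecuteQuery();

        // Hiding Processed so only the app touches it is still a
        // content type tweak (I did that part in the UI)
    }
}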

Now that the SharePoint list is where I want it, it’s time to open up Visual Studio. I referenced Eric White’s blog to get me started: http://blogs.msdn.com/b/ericwhite/archive/2010/12/09/getting-started-using-the-odata-rest-api-to-query-a-sharepoint-list.aspx

  1. Create a new project. Click File -> New -> Project. Select a directory for the project and set the name of the project to PostToYammerOData.
  2. Right-click the References node in the Solution Explorer window and click Add Service Reference. Enter http://site/_vti_bin/listdata.svc for the address and change the namespace to PostToYammer.

Now for the code – apologies in advance for not posting the “cleanest” code, but hey, it works:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Mail;
using System.Configuration;
using PostToYammerOData.PostToYammer;

namespace PostToYammerOData
{
    class Program
    {
        static void Main(string[] args)
        {
            // You get the DataContext type when you set up the Service Reference
            [Title of your site]DataContext dc = new [Title of your site]DataContext(new Uri("http://site/_vti_bin/listdata.svc"));
            dc.Credentials = CredentialCache.DefaultNetworkCredentials;

            // Grab every item flagged for Yammer that hasn't been processed yet
            var result = from d in dc.[List Title]
                         where d.SendToYammerValue == "Yes" && d.ProcessedValue == "No"
                         select new
                         {
                             // Define your columns
                             Id = d.Id,
                             Subject = d.Title,
                             Body = d.Body,
                             // May be .Email or .WorkEmail depending on your AD attribute mappings
                             CreatedByEmail = d.CreatedBy.EMail
                         };

            foreach (var d in result)
            {
                // For troubleshooting purposes
                //Console.WriteLine(d);

                // Email to Yammer, sent "from" the item's creator
                string to = "[Yammer group address]";
                string from = d.CreatedByEmail;
                MailMessage message = new MailMessage(from, to);
                message.Subject = d.Subject;
                message.Body = d.Body;

                SmtpClient smtp = new SmtpClient();
                smtp.Host = "[Exchange IP address]";
                smtp.Port = [port];
                smtp.Send(message);

                // Update the item by Id: set Processed to Yes so it's never sent again
                var item = dc.[List Title]
                    .Where(i => i.Id == d.Id)
                    .FirstOrDefault();
                item.ProcessedValue = "Yes";

                dc.UpdateObject(item);
                dc.SaveChanges();
            }
        }
    }
}

Basically the code opens the list, iterates through its items, and – for each item where SendToYammer equals “Yes” – e-mails Yammer and then sets the Processed column to “Yes.”

I can take my newly built and tested code and hand it to someone to run on their desktop, or I can deploy it to a Windows Server in my environment and use Task Scheduler to run the app on a schedule. We chose the latter. Just make sure whoever runs the app (or the service account, if you use one like we did) has Contribute rights to the site.

All the lists I’ve attempted this with have had fewer than a few hundred items. I haven’t tried it with 1,000 or more, and we know that in some instances OData will only return 1,000 items. Check out Tim’s post on how he overcame that issue (Link).
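If you do hit that limit, here’s a minimal sketch of the standard WCF Data Services workaround – server-driven paging with continuation tokens. This is my own illustration, not necessarily Tim’s approach; [List Title]Item stands in for the entity type your service reference generates, and you’ll need a using System.Data.Services.Client; directive:

// Pull flagged items page by page instead of assuming one result set
var query = (DataServiceQuery<[List Title]Item>)
    dc.[List Title].Where(d => d.SendToYammerValue == "Yes" && d.ProcessedValue == "No");

var response = (QueryOperationResponse<[List Title]Item>)query.Execute();
while (true)
{
    foreach (var d in response)
    {
        // Process the item exactly as in the loop above
    }

    // GetContinuation() returns null once the last page has been read
    DataServiceQueryContinuation<[List Title]Item> token = response.GetContinuation();
    if (token == null)
        break;

    response = (QueryOperationResponse<[List Title]Item>)dc.Execute(token);
}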

One last thing: why didn’t I use an event receiver? Well, besides the dependencies mentioned above, I can never get event receivers to work. Then again, I try to go sandbox whenever possible, which could be part of my problem – if you want to send e-mail from a sandbox solution you’ll need a full-trust proxy, thereby defeating the purpose of the sandbox in the first place.

Thanks again to Tim and especially Steve for giving me a hand on this. Much appreciated.

June 2012 CU build number discrepancy

So tonight I just finished applying the June 2012 CU. I took a look at Central Admin’s “Servers in Farm” page and noticed that the build number was 14.0.6123.5000. According to a variety of SharePoint heavyweights’ blogs and MSDN, it should be 14.0.6123.5002.

I ran and reran PSConfig 3 times. Still no change in the build numbers. WTF!

Say What!?!

Event Viewer was fine. ULS was fine. The sites were fine. But I was starting to feel a twinge of panic.

Come to find out, the last number in the sequence is the revision number. I ran across this TechNet post that put my mind at ease: http://social.technet.microsoft.com/Forums/en/sharepoint2010setup/thread/f004c2c8-3614-41ab-aca5-cff836d8ac5d

The poster with the answer explains that the format of a build number is Major (14), Minor (0), Build (6123), Revision (5000/5002). From now on I’ll pay attention to the third number in the sequence. … Microsoft shenanigans win again.
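As a sanity check, you can see the same breakdown in code – this is just an illustration of mine using .NET’s System.Version, not anything from the TechNet thread:

using System;

class BuildCheck
{
    static void Main()
    {
        var farm = new Version("14.0.6123.5000");   // what Central Admin shows
        var blogs = new Version("14.0.6123.5002");  // what the blogs list

        Console.WriteLine(farm.Major);    // 14
        Console.WriteLine(farm.Minor);    // 0
        Console.WriteLine(farm.Build);    // 6123 <- the number to watch
        Console.WriteLine(farm.Revision); // 5000

        // Same CU as far as the Build component is concerned
        Console.WriteLine(farm.Build == blogs.Build); // True
    }
}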

Create a standalone report from the SharePoint 2010 Web Analytics Service App

I did it – albeit with a lot of help from a variety of folks. I successfully created an SSRS report that queries the SharePoint Web Analytics database. If you don’t have SSRS, there’s always PowerPivot; the queries will work with both. Here’s how I did it.

First, you’re going to need the site Aggregation ID. Why Microsoft uses this is beyond me. I ran across a blog post whose writer built a console app to extract it (blog post HERE). I tried the app with no luck, then posted THIS on TechNet. Thanks to Guru Karnik for getting me on the right path.

The only catch is that Guru’s query uses CURSORs. According to Tim Laqua this is very bad, and I’m inclined to listen to him (you’ll notice our blogs look eerily similar – yes, I used the same layout). So what to do? First we need the Aggregation ID. I grabbed this portion of Guru’s query and fired away:

SELECT DISTINCT DimensionName AS SiteCollectionID, AggregationId, SM.[Path]
FROM WASiteInventorySnapshot WASIS WITH (NOLOCK)
INNER JOIN [SPConfigDB].[dbo].[SiteMap] SM
    ON WASIS.DimensionName = SM.Id
WHERE WASIS.DimensionType = 0
ORDER BY Path

Take a look at your results: you now have the site collection GUID, the AggregationID, and the URL path. Filter on the site collection you’re after and grab its Aggregation ID. If you’re not sure which site collection you want (as was the case for me), grab the site collection ID and let PowerShell show you the way – running Get-SPSite with the GUID returns the site collection’s URL.


OK, AggregationID in hand, we place it into the following query:

USE [WebAnalytics_ReportingDB]

--Declare everything
DECLARE @SiteId UniqueIdentifier
DECLARE @AggregationId UniqueIdentifier
DECLARE @SitePath NVarchar(255)
DECLARE @StartDate Date
DECLARE @EndDate Date
DECLARE @DateDiff Int

--Set variables; @DateDiff is the number of days you're after
SET @DateDiff = 30
SET @StartDate = DATEADD(d, -1 * @DateDiff, GETDATE())
SET @EndDate = GETDATE()
SET @AggregationId = '[paste aggregationid here]'
SET @SiteId = '[paste siteid here]'
SET @SitePath = '[paste URL here]'

--Make magic
SELECT @SiteId AS 'SiteId', @AggregationId AS 'AggregationId', @SitePath AS 'SitePath', @StartDate 'StartDate', @EndDate 'EndDate', *
FROM [WebAnalytics_ReportingDB].[dbo].[fn_WA_GetSummary]
    (CAST(CONVERT(varchar(8), @EndDate, 112) AS int), CAST(CONVERT(varchar(8), @StartDate, 112) AS int), @DateDiff, @AggregationId, 1)

You’ll notice I took Guru’s query and tweaked it further for my use. A couple of notes here:

  • The GetSummary function the query calls is a little jacked. You need to set a start date, an end date, and a date difference (i.e. the number of days between the two). Why? Because those are the variables the function wants. Like I said… jacked. It gets worse below.
  • I wanted the SSRS report to always display the last 30 days starting from today’s date. Later I can go back and add some parameter/expression magic to make the dates customizable by the user, but for now this works.
  • Notice in the Set variables section that I’ve included the SiteID and SitePath. I did this for posterity only; it has no bearing on the query other than making those values available when I build the report.
  • I first declare the date fields as Date so I can calculate them programmatically, but then I have to convert them to Int for the function to work – e.g. CONVERT(varchar(8), @EndDate, 112) turns June 30, 2012 into '20120630', which casts to the integer 20120630. Very jacked, but hey, it works.

But what about Top users? Got that too:

USE [WebAnalytics_ReportingDB]

--Declare everything
DECLARE @SiteId UniqueIdentifier
DECLARE @AggregationId UniqueIdentifier
DECLARE @SitePath NVarchar(255)
DECLARE @StartDate Date
DECLARE @EndDate Date
DECLARE @DateDiff Int

--Set variables; @DateDiff is the number of days you're after
SET @DateDiff = -30
SET @StartDate = DATEADD(d, @DateDiff, GETDATE())
SET @EndDate = GETDATE()
SET @AggregationId = '[paste aggregationid here]'
SET @SiteId = '[paste siteid here]'
SET @SitePath = '[paste URL here]'

--Make magic
SELECT @SiteId AS 'SiteId', @AggregationId AS 'AggregationId', @SitePath AS 'SitePath', @StartDate 'StartDate', @EndDate 'EndDate', *
FROM [WebAnalytics_ReportingDB].[dbo].[fn_WA_GetTopVisitors]
    (CAST(CONVERT(varchar(8), @StartDate, 112) AS int), CAST(CONVERT(varchar(8), @EndDate, 112) AS int), @AggregationId, 1)

Similar to the query above, only this time I call a different function: GetTopVisitors. Again, I had to do some magic with the start and end dates to calculate them programmatically.

This is only the beginning – lots of possibilities here. The downfall to this approach is that it’s site specific. There is a function that will show top pages, but you can’t get down to a specific web (i.e. sub-site).

The next iteration will let users input custom dates, and we’ll get creative with the graphs and charts in SSRS (think dancing kittens and rainbows). Thanks to everyone who gave me a hand with this.

Solving the double authentication prompt with document libraries in SharePoint for Internet Sites

Ran into this issue while I was on vacation. Dawn just loved that while we were in paradise, I broke out my laptop to diagnose it. That took some serious explaining.

Anyway, I have an internet site deployed on SharePoint for Internet Sites (FIS). Although the site is set up for anonymous users, we have a secured document library that uses a custom claims provider. The unfortunate thing I’ve found is that Office docs are not claims-aware. What this means is that a user can log in to the site through the custom claims provider, navigate to the doc library, click on a document, and get hit with another authentication prompt:

[Screenshot: the second authentication prompt]

But they’re already logged in. What gives!

One thing to keep in mind with the double auth: the user is only prompted once, as it appears a cookie gets cached in Office. But on this particular site, users typically open one document at a time and then go on their merry way, so the double authentication is not gonna fly.

Workaround: use _layouts/download.aspx

I created a separate links list with a link to each file. The syntax to use is as follows:

http://site.com/_layouts/download.aspx?SourceUrl=[library name]/[file name.extension]

Using the download.aspx convention causes SharePoint to use a different mechanism to deliver the file to the user. It adds a few steps for the admins, but I’d rather make my life harder than the site users’. If anyone knows a better way to solve this problem, I’m all ears.
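And since building those links by hand gets old fast, here’s a rough console sketch of generating them. This is my own illustration with hypothetical site, library, and folder names, using a local folder listing as a stand-in for enumerating the library:

using System;
using System.IO;

class DownloadLinkBuilder
{
    static void Main()
    {
        // Hypothetical values - swap in your own site and library
        string siteUrl = "http://site.com";
        string library = "Secured Documents";

        // A local folder stands in for the document library contents here
        foreach (string path in Directory.GetFiles(@"C:\temp\secured-docs"))
        {
            string fileName = Path.GetFileName(path);

            // Same _layouts/download.aspx convention as above, URL-escaped
            string link = string.Format("{0}/_layouts/download.aspx?SourceUrl={1}/{2}",
                siteUrl, Uri.EscapeDataString(library), Uri.EscapeDataString(fileName));

            Console.WriteLine(link);
        }
    }
}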

SQL Query Basics

Needless to say, I’m a SQL n00b. For the longest time I drank the Microsoft Kool-Aid and believed “you shouldn’t touch the SQL side of SharePoint” – and by “shouldn’t touch” I mean absolutely no touching or querying of any kind. Well, I’ve wised up. It’s not that you can’t touch anything; just be mindful of what you can and can’t do. Querying – as long as you do it with NOLOCK – won’t adversely impact anything. Above all else, have a healthy respect for what you’re doing, lest you anger the SharePoint gods. This post is for all my peers out there who do a little SP admin work on the side but – for a variety of reasons – don’t get into the SQL side of things much.

SQL Server Management Studio is the best (and only?) tool for querying SQL Server. It’s a pretty straightforward, easy-to-use product. Open it up, connect to your server, and click the “New Query” button:

[Screenshot: the New Query button in Management Studio]

Pretty simple. From here you can query SQL (i.e. “ask” SQL things):

USE [database name]

select * from [database table]


The USE statement dictates which database you want to query. There is a database dropdown near the “New Query” button, but the USE statement is so much easier – I’ve forgotten to select the correct database from that dropdown on many occasions, so now I just use USE. “SELECT *” means you want to return all the columns in the table, which is helpful when you just want to poke around and learn about SharePoint’s innards.

Use [database name]

select * from [database table] with(nolock)

 

“with(nolock)” is important because it doesn’t lock up the table while you’re querying it. I learned this one the hard way, too. Locking up a table is a bad thing – no one else can use it until your query is done. Don’t do what I did and lock up a content database; it does not make for good job security. Generally speaking, run just about any query against SharePoint with NOLOCK.

One last nugget of query wisdom: replace the * in the SELECT statement with column names (comma-separated). For example:


Use [database name]

select LastName,FirstName,Address from [database table] with(nolock)


Your query will return results with just the LastName, FirstName, and Address columns. This clears up the query results and removes extraneous data.

These 3 basic queries – while n00b’ish – are very valuable to know and will hopefully help you when working with Management Studio.

#TechOnTap review

So that whole promise I made earlier in the week to write every day? Yeah, it didn’t work out so well. Sorry…

Yesterday I gave my PowerPivot presentation at #TechOnTap up in Appleton, WI. This is a very cool IT speaker series that takes place at the Stone Cellar Brewpub. I highly recommend both #TechOnTap and the Stone Cellar Brewpub. Together they’re an out-of-this-world experience.

Chuck Heinzelman started things off with a nice overview of Kerberos in SharePoint. The use case was SharePoint 2010 & SQL 2012 SSRS… something I’ve been testing off and on for the last 4 months or so. I’ve got it 90% of the way there, and (I’m hoping) Chuck filled in the remaining 10%. When I get a free minute in the next week or two (yeah, right) I’ll see if I can finally close the remaining gaps and get Kerberos going all the way (blog post to follow).

Rob Bogue was up next with an overview of forms-based authentication. I’ve been busy at work putting together an internet site on SharePoint, so I’ve had to blaze this trail before. Let’s just say it’s a somewhat confusing prospect. Rob is always an entertaining presenter.

Rick Fischer rounded things out with a talk on InfoPath – a nicely done overview of the product’s functionality. There’s a lot of good stuff in it, and it can open up a lot of possibilities for an organization. In my mind, InfoPath is to SharePoint what Outlook is to Exchange.

I was the day’s headliner (i.e. I went last). Had some awesome questions from the audience. The demo always seems to blow minds. It’s that type of reaction that makes me enjoy speaking about PowerPivot.

Lunch was plentiful (sandwich bar) and the beer was even better. Did I forget to mention it was all you could drink? The next event will be in October and you can bet I’ll be attending. I hope to bring some of my Trek cohorts up there with me too.

Thanks to the 3 “brewmasters” for throwing #TechOnTap and inviting me to speak: Derek Schauland, Jes Borland, and Mark Cyrulik. Very cool people indeed. Went to dinner afterwards with Derek, Mark, and Tim Florek. Had a good time and learned that the Appleton area has an extremely tight-knit group of IT practitioners. It’s so refreshing to see a group root for each other so much. Madison has similar groups, but nothing as consolidated and cohesive as what they have up in Appleton. Awesome sauce.

#SPSSTL recap

I made a promise to myself to start writing a little every day – just 20 minutes a day to get something down on “paper.” I’m long out of writing practice, and doing this will help a lot of things. But enough about that; let’s talk about SharePoint Saturday St. Louis!

My wife and I had originally planned to drive down Friday, but we came up with the wild idea to drive down Thursday night instead. So with the car packed and two little kids in tow, we left at 5:30 PM. Word to the wise: I don’t recommend starting a 7-hour car ride with two children under 2 that late. Everything went fine until hour 5, then all hell broke loose. I’ll spare you the gory details, but let’s just say my 11-month-old had had enough of riding in the car. Family drama aside, we made it to my father-in-law’s just fine.

Friday night was the speakers’ dinner at the Moonrise Hotel. VERY nice: the food was fantastic, there were lots of new faces for me, and they even had an open bar. The more I go to these things, the more I notice that SharePoint Saturdays are very much a local affair. You’ll have the occasional speaker come from far away, but for the most part the folks speaking and attending are local. Most of the people I spoke to on both days thought it was crazy that I would come all the way from Wisconsin to attend #SPSSTL, but when I told them my father-in-law lives close by, they didn’t think I was so crazy.

On Saturday morning I noticed that PowerPivot in my CloudShare environment wasn’t refreshing correctly. I spent 95% of Danny Jessee’s presentation on Facebook, the cloud, and SharePoint in the back of the room trying to fix things. By the one-hour mark I decided to go with my backup environment. It wasn’t difficult to get up and running, but the whole ordeal was a HUGE pain. From what I did see of Danny’s presentation, it was pretty slick.

Next was Virgil Carroll’s presentation on information architecture in SharePoint. Honestly, this guy could enthrall me by reading the phone book. I saw him last year at the Twin Cities SharePoint Saturday and he was awesome. If you get a chance to see him, I highly recommend it, if for nothing else than his presentation style.

Lunch. I was so busy talking to vendors that the catering vendor (St. Louis Bread Company – Panera to those who live outside the area) ran out of box lunches before I got one. So I was stuck with a bagel for lunch. Oh well.

Next up was Enrique Lima’s presentation on SQL best practices. SQL is still very much black magic to me in some ways and Enrique removed some of that mysticism for me. You can check out his presentation HERE.

The last session I saw before mine was Todd Kitta’s BI presentation. Now I’d be lying if I told you I wasn’t nervous that he would cover every point in my presentation. Thankfully he did not (whew!). Probably the best part of the weekend for me was when he covered Power View from a 100,000-foot level. I haven’t had a chance to test it yet in my own dev environments, so it was nice to get a look at it. I didn’t know that Power View and PowerPivot work so closely with one another (i.e. you can take a working PowerPivot workbook and turn it into a Power View report). I was also curious how you build Power View reports. I was somewhat disappointed to learn that yet another app is needed to build them; however, it’s all browser-based and pretty simple to use, so I’d call that a wash. Good job, Todd!

Finally it was time for my presentation, and I learned a valuable lesson right from the start: the title “Demystifying PowerPivot from the SharePoint Admin’s Perspective” means something entirely different to a SQL Saturday audience than it does to a SharePoint Saturday crowd. Had about 12 or so people come – not bad, but definitely not standing room only like it was at SQL Saturday back in April. My presentation caters to just about anyone, and the folks who did come appeared to get a fair amount out of it. At the #SharePint afterwards I talked to a couple of folks who – once I gave them the background on my presentation – were bummed they missed it. I definitely violated the first rule: anticipate your audience. I guarantee I won’t make that mistake again. Any ideas for a new title?

Overall, I had a good time in St. Louis. #SharePint was held at the Moonrise again, up on their rooftop bar – certainly a hidden gem that I’d recommend to anyone visiting STL this summer. You can find the slides from my presentation here: PowerPivotSPSSTL.pptx. Thanks!

SQL Saturday #118 was a success

So I admit that I’ve been slacking on my blog and Twitter posts, but I swear it was for a good reason. I was preparing for my presentation at SQL Saturday #118 here in Madison this past Saturday.

My presentation was entitled “Demystifying PowerPivot from the SharePoint Admin’s Perspective.” Feedback was overwhelmingly positive. Had a great crowd and they asked some fantastic questions.

I’ve had several inquiries about my slidedeck. Ask and ye shall receive! PowerPivotSQLSAT118.pptx

I’ll be posting the slides to the PASS site as well.

I’ve been asked to give this (or a similar) presentation at Tech on Tap & MADPASS, and there are a whole handful of other possibilities too (SharePoint Saturday Chicago and/or St. Louis, SPUGs, and maybe even another SQL Saturday). I’m all about getting out there and spreading the PowerPivot love.

Thanks again to everyone that came out.