PowerPivot using large SSRS ATOM feed fix

I’ve talked about, blogged about, and tweeted about PowerPivot for a while now. By itself, PowerPivot is pretty cool, but add it to SharePoint and you have the Voltron of BI solutions.

I can pull from Oracle, Excel, a CSV, and DB2 all in the same file. CRAZY! What’s even cooler is that I can pull an SSRS ATOM feed into PowerPivot . . . sometimes.

By default – if you’re using SQL Server 2012 Reporting Services in SharePoint Integrated mode – the largest SSRS ATOM feed you can pull is 110 MB. We have some pretty ridiculously large reports at Trek, and a user was attempting to pull one of these monsters into PowerPivot and it was choking. To make matters worse, PowerPivot was just throwing a 500 error (Unknown error). Not really helpful…

I opened a case with Microsoft and began troubleshooting. We went back and forth and back and forth (6 weeks!), but we finally found the solution. So, to get PowerPivot working with large SSRS ATOM feeds, do the following:

  1. Open the file client.config on your front-end servers. The file is located here: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config
  2. Search for “httpStreaming” and “httpsStreaming”
  3. Within both of these bindings, change the values of the following – this will increase the data size limit from 110 MB to 1.1 GB (see the snippet after these steps):
    1. maxReceivedMessageSize from “115343360” to “1153433600”
    2. maxStringContentLength from “115343360” to “1153433600”
    3. maxArrayLength from “115343360” to “1153433600”
  4. Save the file
  5. Do an IISRESET across all SharePoint servers.
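
For reference, here’s a trimmed sketch of what the httpStreaming binding looks like after the change (repeat the same edits in the httpsStreaming binding). Attribute placement follows standard WCF configuration – maxReceivedMessageSize sits on the binding element itself, while maxStringContentLength and maxArrayLength live in its readerQuotas element – so your actual file will carry more attributes than shown here:

    <!-- client.config (trimmed sketch): the httpStreaming binding after the change -->
    <binding name="httpStreaming" maxReceivedMessageSize="1153433600">
      <readerQuotas maxStringContentLength="1153433600"
                    maxArrayLength="1153433600" />
    </binding>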

Happy PowerPivot’ing.

Share to Yammer button

Had an interesting request a few weeks ago: HR wanted to be able to share pages out to Yammer. I instantly ruled out a workflow because I couldn’t post as the user, then ruled out a console app for two reasons: the logic could get complicated, and it didn’t allow for much flexibility. I went over to the Yammer Customer Network and started poking around, and I ran across a few threads that talked about using the bookmarklet, so I went and took a look at it: https://www.yammer.com/company/bookmarklet

The app – as designed – is aimed at users who want to share pages via their browser, not necessarily from a web page itself. I opened the developer toolbar and hashed through the code, and I managed to find the JavaScript that snags the URL and opens the bookmarklet window. As Steve would say, “Talk is cheap, show me the code.” Here’s the gist of what I ended up with – the endpoint and the u/t query parameters mirror what the bookmarklet itself does, and the icon path is a placeholder you’ll want to swap for your own:
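
    <h2>Share this page . . . </h2>
    <a href="#" onclick="shareToYammer(); return false;">
      <!-- Placeholder icon path: point this at wherever you keep your images -->
      <img src="/SiteAssets/yammer.png" alt="Share it with Yammer" />
    </a>
    <script type="text/javascript">
      // Open the Yammer bookmarklet in a popup, handing it the current page's
      // URL (u) and title (t) - the same thing the bookmarklet does from the
      // browser toolbar. The popup dimensions are just what looked right to me.
      function shareToYammer() {
        var url = 'https://www.yammer.com/home/bookmarklet' +
                  '?u=' + encodeURIComponent(window.location.href) +
                  '&t=' + encodeURIComponent(document.title);
        window.open(url, 'yammer', 'width=620,height=500,toolbar=0,status=0');
      }
    </script>

You can drop this markup into a Content Editor Web Part (or straight into a page layout) wherever you want the button to show up.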

This code gives you the Share to Yammer button right on the page.

YAHTZEE!

Once the user clicks the icon, a new window will open and – as long as they’re logged in to Yammer – they’ll be able to create a new Yam with the URL of the page they want to share in the update’s body.

Bookmarklet Window example

Give it a shot and let me know what you think.

Azure Storage test drive

For the last year and a half I’ve been taking one class a semester at MATC in Madison. Having been trained as a Technical Writer, I’ve basically learned all this sysadmin stuff “on the job,” so I figured it would be a good idea to fill in the blanks for the stuff I hadn’t learned yet. The classes require an external hard drive to house and manage the VMs you use during labs and tests. Being 30 and having a full-time job allows me to buy really cool, really fast hardware to satisfy this class requirement. I opted for a 128 GB Vertex 4. This thing SCREAMS. I get labs done in record time.

So how am I supposed to get my homework done if a spaghetti and meatball tornado comes through and wipes out the lower half of Wisconsin, taking my external hard drive with it?

TO THE CLOUD!

I’ve been using Azure at work for a variety of things, so I figured I’d give this a try. I have 3 VMs, and with them all zipped up (individually) I have about 16 GB total to upload to Azure.

There are 3 Azure storage basics you need to know about: storage accounts, containers, and blobs. A storage account is the first thing you need in order to get started.

The storage account sets up the subdomain you’ll use to communicate with your storage objects: yourstorage.*.core.windows.net (where * is blob, table, or queue, depending on the service). You also set the affinity group (location) where your content will be stored.

Once your storage account is up, you’ll need a container. Think of a container as a folder – only it’s not a folder – it’s a container. It holds your blobs – binary large objects (i.e., your files). More on that in a bit.

  1. Click Storage in the left-hand navigation
  2. Click on the storage account name (my account is called “inhifistereo,” you can call yours whatever you like)
  3. Click Containers at the top of the page, then click New at the bottom of the page
  4. Give the container a name and choose Public or Private

Private is just that: private. Meaning you have to be logged on (or have a shared access key, but that’s fodder for another blog post) to access your stuff. A public container is cool because you can access it from anywhere as long as you have the URL. Click the checkmark and we’re good to go.
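
For example, say I create a public container called “backups” and upload a blob named server1.zip to it (both names are made up) – it would then be reachable from anywhere at a URL like this:

    http://inhifistereo.blob.core.windows.net/backups/server1.zip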

So a container is a container – like a folder, only it’s a container. And a blob is a file. The part that took me a second to understand is that this storage isn’t like a file share up in the cloud; it’s the basic building block of storage in the cloud. A container dictates the access method, and a blob is the big ol’ file that sits within the container.

Now to get content up to Azure. You could write a console app, use PowerShell, or use a third-party tool. For this exercise, I opted for a third-party tool; there are plenty of others out there, too.

The files took me basically all day to upload, and there were several reasons for that. For one, I have the most basic broadband package Charter offers. But I’m not doing this for a living or every day, so the time is no big deal. I’m charged by the GB, not the minute, so even if it took several days, no biggie. But I’m not getting any younger…

Following this blog post, I learned that the tools above do not upload in parallel, which is why it took so long.
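
If you wanted to roll your own parallel uploads, the Blob service’s REST API makes that pretty approachable. Here’s a quick sketch in modern JavaScript (Node 18+ for the built-in fetch) – the container name and SAS token are hypothetical, and really big files should go up in chunks via Put Block/Put Block List rather than a single PUT like this:

    // Sketch: upload several zipped VMs in parallel with the Blob REST API.
    // Assumes a container named "vmbackups" and a SAS token granting write
    // access - both hypothetical. Save as upload.mjs and run: node upload.mjs
    import { readFile } from 'node:fs/promises';

    const files = ['vm1.zip', 'vm2.zip', 'vm3.zip'];
    const sas = '?sv=...'; // hypothetical shared access signature

    await Promise.all(files.map(async (name) => {
      const body = await readFile(name);
      const res = await fetch(
        `https://inhifistereo.blob.core.windows.net/vmbackups/${name}${sas}`,
        { method: 'PUT', headers: { 'x-ms-blob-type': 'BlockBlob' }, body }
      );
      if (!res.ok) throw new Error(`${name} failed: ${res.status}`);
      console.log(`${name} uploaded`);
    }));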

“But David, you have a SkyDrive and Dropbox account along with a hosting account. Why use Azure Storage?” Why not!? The real beauty of Azure storage is that I only pay for what I use, and I pay pennies at that. SkyDrive and Dropbox require a yearly commitment, and college classes only last 18 weeks. So when the class is over I can blow the container away and I don’t get charged anymore. I don’t plan on ever using these backups, so they’re cheap insurance. Now, having said that, Azure storage (and Amazon, and Google, etc.) isn’t really set up for consumer usage. But I’m not your typical consumer.

I’ll give PowerShell a shot next time and probably try Amazon as well to see if there are any performance differences. If I’m feeling really ambitious I may try doing a console app.

Price-wise, I’ve been charged a total of 15 US cents so far. I may have to go raid the couch cushions…