Get count of objects in a directory with PowerShell

A quick post!

Today I started migrating content from an on-premises source to SharePoint Online using ShareGate's Migration tool (AWESOME TOOL by the way!). I'm about two hours into the migration and I was curious how much content I had left to go. I needed a count of all the objects in the directory. PowerShell to the rescue:

# Count every file and folder under the path
$path = "[paste directory path here]"
$files = Get-ChildItem $path -Recurse | Measure-Object
$files.Count

You'll get a count of all the objects in the directory. This number will help me gauge how long the migration will take.
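Note that Measure-Object counts folders as well as files. If you only want files, PowerShell 3.0 and later has a -File switch on Get-ChildItem – a minimal variant of the same one-liner:

# Count files only, skipping the folders themselves (requires PowerShell 3.0+)
$files = Get-ChildItem $path -Recurse -File | Measure-Object
$files.Count

Happy migration!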

Configure PowerPhone with Cisco phones

We've started rolling out PowerObjects' PowerPhone. Users love the functionality and it's led to more CRM use overall.

We researched a few different CTI solutions. Getting it set up in CRM is a breeze, but configuring things on user computers is harder than I originally thought. It took us about two months and too many hours to figure out, so I'm sharing the necessary steps so you don't have to feel the same pain.


There are assumptions we have to make before we proceed:

Assumptions

  • You are on Cisco’s UCM (Unified Communications Manager) platform 9.0 or above.
    • UCM 9 supports Windows 7
    • UCM 10 and 10.5 support Windows 8.1
  • You have already downloaded the TAPI install files from UCM. (Note: there is no other way I know of to get the install files.)
  • You have already uploaded the PowerPhone solution in your CRM instance and enabled it – found HERE.

Steps for installing the PowerPhone agent on user computers

  1. Double-click the TAPI installer (be sure to choose the correct version – x86 vs. x64). Note: just “next” the whole way through
  2. If not installed, install .NET Framework 4.5 (see the registry check sketch after this list)
  3. A restart may be required at this point
  4. Download the PowerPhone Agent from HERE
  5. Open the zip file and copy the entire “PowerPhone_1_3_2_5_agent” directory to the user’s C: drive
  6. Once copied, open the directory and pin the PowerPhone.exe file to the taskbar. This will help the user when logging in to the phone.
  7. Log in to Unified Communications Manager
  8. Go to User Management > End User
  9. Search for the User. Once found open the end user configuration by clicking on their name
  10. Scroll to the bottom and click “Add to Access Control Group”
  11. A pop-up will open; search for “Begins with ‘Standard CTI’”
  12. Assign the user the “Standard CTI Enabled” role
  13. Hit Save on the End User Configuration record
  14. Go to Start > Launch TSP
  15. Right click the TAPI driver in the notification area and choose “Cisco TAPI configuration”
  16. Once open, make sure the instance is selected and choose “Configure”
  17. On the User tab, have the user enter their Cisco username and password – if you have LDAP you would enter your AD credentials here
  18. On the CTI Manager tab, enter the IP address(es) of your Cisco servers
  19. Make sure the user has PowerPhone rights in CRM before proceeding
  20. Open PowerPhone and configure the connection:
  21. Select the phone line and add in the connection info to your CRM instance. If the phone line doesn’t appear in the dropdown then something hasn’t been configured properly with the TAPI driver.
  22. Once connected, click on Settings and set the outgoing number (9 in our case)
  23. Last step (at least for us): we had to make an outgoing call from PowerPhone so the TAPI driver would make the proper connection to the phone system. YMMV on this.
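One sketch for step 2: you can check whether .NET Framework 4.5 is already on a machine by reading the framework's Release value from the registry (378389 is the documented marker for 4.5):

# Check the .NET 4.x "Release" registry value; 378389 or higher means 4.5+
$release = (Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -ErrorAction SilentlyContinue).Release
if ($release -ge 378389) { '.NET 4.5 or later is installed' } else { 'Install .NET 4.5' }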

PowerShell hack with IE

I wrote this because I wondered if it could be done. I figured a PowerShell hack with IE would be pretty difficult to pull off, but as it turned out it took just a little bingling, and this is what I came up with:

# Open the page (start is an alias for Start-Process; this assumes IE is the default browser)
start 'http://www.inhifistereo.com'

# Give the page time to load
Start-Sleep -s 5

# Close any open IE windows gracefully
Get-Process iexplore | Foreach-Object { $_.CloseMainWindow() }

exit

The script opens the browser, waits 5 seconds, then closes the page. Pretty simple, I know, but I'll be expanding on this script more in the future. Some possibilities/uses could be:

  • Check a web page for content, then send an email depending on what's found (see the sketch after this list)
  • Keep VPN connection open
  • Check if a website is up
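Here's a rough sketch of that first idea – fetch a page, look for some text, and send an email if it's missing. The addresses and SMTP server below are placeholders you'd swap for your own:

# Fetch the page and alert if the expected text isn't found
# (the addresses and SMTP server are placeholders)
$response = Invoke-WebRequest -Uri 'http://www.inhifistereo.com' -UseBasicParsing
if ($response.Content -notmatch 'expected text') {
    Send-MailMessage -From 'alerts@example.com' -To 'me@example.com' `
        -Subject 'Site check failed' -Body 'Expected text not found on the page' -SmtpServer 'mail.example.com'
}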

I went even further and added it to Task Scheduler to run every 30 minutes. There are plenty of blog posts out there outlining how to create a task in Task Scheduler, but the small hiccup I ran into was how to run PowerShell from the Scheduler.

When you get to the Actions tab, add the path to PowerShell:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

The “Add arguments” section is where you add the command and path to your .ps1 file:

-Command "& [path to your file without brackets]"
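Put together, a task action for a script saved at a hypothetical C:\Scripts\OpenSite.ps1 would look like this:

Program/script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments: -Command "& 'C:\Scripts\OpenSite.ps1'"

Quoting the path inside the & call keeps paths with spaces from tripping you up.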

The possibilities are pretty endless here. I don’t know if anyone else will find this useful but I know I will use it.

Add a clock to your web page

Had an interesting request to add a clock to a web page yesterday. It turned out to be a bit more difficult than we thought. We used moment.js for this project – a VERY cool JavaScript library that helps you work with dates and times. If you've never worked with JavaScript and time, consider yourself lucky; it's downright hard. Moment.js takes the guesswork out, although there's a bit of a learning curve. Let's get started.

First, add a div to your page and give it an id. The script below expects <div id="clock"></div>.

Now, we add our script in the footer of the HTML.

<!-- call the jQuery and moment.js libraries -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script src="http://cdnjs.cloudflare.com/ajax/libs/moment.js/2.5.1/moment.min.js"></script>

<script>
// function to create clock functionality; updates every second
function timedUpdate() {
    showClock();
    setTimeout(timedUpdate, 1000);
}

// moment.js function: get current time > pass to moment > format it
function showClock() {
    var now = new Date().getTime();
    var test = moment(now).format('MMMM Do YYYY, h:mm:ss a');

    // write the formatted time into the div
    $("#clock").html(test);
}

// start the update loop
timedUpdate();
</script>

And then you get a nice clock to work with.
http://www.inhifistereo.com/wp-content/uploads/2014/02/Clock.html

PowerPivot using large SSRS ATOM feed fix

I've presented on, blogged about, and tweeted about PowerPivot for a while now. By itself, PowerPivot is pretty cool, but add it on to SharePoint and you have the Voltron of BI solutions.


I can pull from Oracle, Excel, a CSV, and DB2 all in the same file. CRAZY! What's even cooler is that I can pull an SSRS ATOM feed into PowerPivot . . . sometimes.

By default – if you're using SQL 2012 Integrated mode – the largest SSRS ATOM feed you can pull is 110 MB. We have some pretty ridiculously large reports at Trek, and a user attempting to pull one of these monsters into PowerPivot found it choking. To make matters worse, PowerPivot was just throwing a 500 error (Unknown error). Not really helpful…

I opened a Microsoft case and began troubleshooting. We went back and forth and back and forth (six weeks!), but we finally found the solution. So, to get PowerPivot working with large SSRS ATOM feeds, do the following:

  1. Open the file client.config on your front-end servers. The file is located here: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config
  2. Search for “httpStreaming” and “httpsStreaming”
  3. Within both of these bindings, change the following values – this will increase the maximum data size from 110 MB to 1.1 GB:
    1. maxReceivedMessageSize from "115343360" to "1153433600"
    2. maxStringContentLength from "115343360" to "1153433600"
    3. maxArrayLength from "115343360" to "1153433600"
  4. Save the file
  5. Do an IISRESET across all SharePoint servers.
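If you want to confirm the current values before (and after) editing, a quick read-only check from PowerShell works – a minimal sketch, assuming the default hive path from step 1:

# List the lines in client.config that carry the size limits
$config = 'C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config'
Select-String -Path $config -Pattern 'maxReceivedMessageSize|maxStringContentLength|maxArrayLength'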

Happy PowerPivot’ing.

Share to Yammer button

Had an interesting request a few weeks ago: HR wanted to be able to share pages out to Yammer. I instantly ruled out a workflow because I couldn't post as the user, then ruled out a console app for two reasons: the logic could get complicated, and it didn't allow for much flexibility. I went over to the Yammer Customer Network and started poking around. I ran across a few threads about using the bookmarklet, so I went and took a look at it: https://www.yammer.com/company/bookmarklet

The app – as designed – is aimed at users who want to share pages via their browser, not necessarily from a web page itself. I opened the developer toolbar and hashed through the code, and I managed to find the JavaScript that snags the URL and opens the bookmarklet window. As Steve would say, "Talk is cheap, show me the code":

<h2>Share this page . . . </h2>

The code gives you the Share to Yammer button, which renders as a "Share it with Yammer" link on the page:

YAHTZEE!


Once the user clicks the icon, a new window will open and – as long as they're logged in to Yammer – they'll be able to create a new Yam with the URL of the page they want to share in the update's body.


Give it a shot and let me know what you think.

Azure Storage test drive

For the last year and a half I've been taking one class a semester at MATC in Madison. Having been trained as a Technical Writer, I've basically learned all this sysadmin stuff "on the job," so I figured it would be a good idea to fill in the blanks for the stuff I hadn't learned yet. The classes require an external hard drive to house and manage the VMs you use during labs and tests. Being 30 and having a full-time job allows me to buy really cool, really fast hardware to satisfy this class requirement. I opted for a 128GB Vertex 4. This thing SCREAMS. I get labs done in record time.

So how am I supposed to get my homework done if a spaghetti and meatball tornado comes through and wipes out the lower half of Wisconsin, taking my external hard drive with it?

TO THE CLOUD!

I’ve been using Azure at work for a variety of things so I figured I’d give this a try. I have 3 VMs and with them all zipped up (individually) I have about 16GB total to upload to Azure.

There are 3 Azure storage basics you need to know about: storage accounts, containers, and blobs. A storage account is the first thing you need in order to get started.

The storage account sets up the subdomain you'll use to communicate with your storage objects: yourstorage.*.core.windows.net (the * being blob, queue, or table, depending on the service). You also set the affinity group (location) where your content will be stored.

Once your storage account is up, you'll need a container. Think of a container as a folder – only it's not a folder, it's a container. It holds your blobs – binary large objects (i.e., your files). More on that in a bit.

  1. Click Storage in the left-hand navigation
  2. Click on the storage account name (my account is called "inhifistereo"; you can call yours whatever you like)
  3. Click Containers at the top of the page, then click New at the bottom of the page
  4. Give the container a name and choose Public or Private

Private is just that: private. Meaning you have to be logged on (or have a shared access key, but that's fodder for another blog post) to access your stuff. A public container is cool because you can access it from anywhere as long as you have the URL. Click the checkmark and we're good to go.

So a container is a container – like a folder, only it's a container. And a blob is a file. The part that took me a second to understand is that this storage isn't a fileshare up in the cloud; it's the basic building block of storage in the cloud. A container dictates the access method, and a blob is the big ol' file that sits within the container.
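To make that concrete: a blob in a public container on my account would be reachable at a URL like this (the container and file names here are made up):

http://inhifistereo.blob.core.windows.net/backups/lab-vm.zip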

Now to get content up to Azure. You could write a console app, use PowerShell, or use a third-party tool. For this exercise I opted for a third-party tool; there are several others out there, too.

The files took me basically all day to upload, and there were several reasons for this. For one, I have the most basic broadband package Charter offers. But I'm not doing this for a living or every day, so the time is no big deal. I'm charged by the GB, not the minute, so if it took several days, no biggie. But I'm not getting any younger…

Following this blog post, I did learn that the tools above do not upload in parallel, which is why it took so long.

"But David, you have SkyDrive and Dropbox accounts along with a hosting account. Why use Azure Storage?" Why not!? The real beauty of Azure Storage is that I only pay for what I use, and I pay pennies at that. SkyDrive and Dropbox require a yearly commitment, and college classes only last 18 weeks, so when the class is over I can blow the container away and stop getting charged. I don't plan on ever using these backups, so they're cheap insurance. Now, having said that, Azure Storage (and Amazon, Google, etc.) isn't really set up for consumer usage. But I'm not your typical consumer.

I'll give PowerShell a shot next time, and I'll probably try Amazon as well to see if there are any performance differences. If I'm feeling really ambitious, I may try writing a console app.
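For reference, here's roughly what that PowerShell attempt might look like using the Azure module's storage cmdlets – a sketch only, with the account key, container name, and file path as placeholders:

# Sketch: upload a zipped VM to private blob storage
# (the key, container name, and file path below are placeholders)
$ctx = New-AzureStorageContext -StorageAccountName 'inhifistereo' -StorageAccountKey '<storage key>'
New-AzureStorageContainer -Name 'vmbackups' -Permission Off -Context $ctx
Set-AzureStorageBlobContent -File 'C:\VMs\Lab1.zip' -Container 'vmbackups' -Context $ctx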

Price-wise, I’ve been charged a total of 15 US cents so far. I may have to go raid the couch cushions…