Add a clock to your web page

Had an interesting request to add a clock to a web page yesterday. It turned out to be a bit more difficult than we thought. We used moment.js for this project. It's a VERY cool JavaScript library that helps you work with time. If you've never done work with JavaScript and time, consider yourself lucky; it's downright hard. Moment.js takes the guesswork out, although there's a bit of a learning curve. Let's get started.

First, add a div to your page and give it an id.

<div id="clock"></div>

Now we add our script in the footer of the HTML.

<!-- call the jQuery and moment.js libraries -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script src="http://cdnjs.cloudflare.com/ajax/libs/moment.js/2.5.1/moment.min.js"></script>

<script>
    // update the clock every second
    function timedUpdate() {
        showClock();
        setTimeout(timedUpdate, 1000);
    }

    // grab the current time, pass it to moment(), and format it
    function showClock() {
        var now = new Date().getTime();
        var formatted = moment(now).format('MMMM Do YYYY, h:mm:ss a');

        // write the formatted string into the clock div
        $("#clock").html(formatted);
    }

    // kick things off
    timedUpdate();
</script>

And then you get a nice clock to work with.

sharepointwookiee.com is now inhifistereo.com

Living in Wisconsin we get snowed in from time to time. It’s nowhere near as bad as this Russian town (Real Russian Winter) but this weekend we got about 6 inches. With ample time on my hands I figured it was time to do some internet housecleaning.

Consolidated Twitter handles

Managing @spwookiee and @inhifistereo was a lot. I've used inhifistereo as a handle since the days of AOL, so I feel a pretty tight attachment to it. But spwookiee had the following. I did a swap similar to THIS and was off and running.

Renamed Blog

Along the same lines as the Twitter handle swap, I just wasn't feeling sharepointwookiee.com. I had always planned to do something Star Wars-esque with the site but never got around to it. Additionally, I'm doing more BI and CRM work these days on top of my SharePoint work, and it felt kind of weird posting CRM stuff on a blog that had SharePoint in the title. Plus, choosing another domain made the next change a little easier.

Goodbye GoDaddy, Hello WordPress on Azure

I use Windows Azure a lot at work. I fall in love a little more every time I log in to the portal. So why not use WordPress on Azure for my blog hosting as well? I get way more control and it's a little cheaper.

I wanted completely out of GoDaddy's grip, so I transferred my domains to gandi.net. I'm way happier because everything that GoDaddy charges à la carte for is included in the purchase price with gandi.net. So take that, GoDaddy.

Pros and Cons

As with any decision, there are pros and cons. I listed most of the pros above. I can think of two cons. 1) A dip in viewership/following: a new URL means folks will have to update their RSS readers, though hopefully I can find a way to redirect. 2) New tech: I do use Azure fairly often, but I don't know everything about it, so I'll assume there will be a learning curve of some sort.

The whole switch took me about 8 hours over the course of 2 weeks. I did a little here and there when I had the time. But the biggest chunk happened over this weekend. Pretty happy with the results so far.

PowerPivot using large SSRS ATOM feed fix

I've talked about, blogged about, and tweeted about PowerPivot for a while now. By itself, PowerPivot is pretty cool, but add it to SharePoint and you have the Voltron of BI solutions.

PowerPivot Voltron

I can pull from Oracle, Excel, a CSV, and DB2 all in the same file. CRAZY! What's even cooler is that I can pull an SSRS ATOM feed into PowerPivot... sometimes.

By default, if you're using SQL 2012 in SharePoint Integrated mode, the largest SSRS ATOM feed you can pull is 110 MB. We have some pretty ridiculously large reports at Trek, and a user was attempting to pull one of these monsters into PowerPivot and it was choking. To make matters worse, PowerPivot was just throwing a 500 error (Unknown error). Not really helpful…

I opened an MS case and began troubleshooting. We went back and forth and back and forth (6 weeks!), but we finally found the solution. So, to get PowerPivot using large SSRS ATOM feeds, do the following (there's a PowerShell sketch of the same edits after the list):

  1. Open the file client.config on your front-end servers. The file is located here: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config
  2. Search for "httpStreaming" and "httpsStreaming"
  3. Within both of these bindings, change the values of the following (this increases the data size limit from 110 MB to 1.1 GB):
    1. maxReceivedMessageSize from "115343360" to "1153433600"
    2. maxStringContentLength from "115343360" to "1153433600"
    3. maxArrayLength from "115343360" to "1153433600"
  4. Save the file
  5. Do an IISRESET across all SharePoint servers.
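If you have more than one or two servers, hand-editing that file gets old fast. Here's a rough PowerShell sketch of the same change (my own shortcut, not something Microsoft handed us): it just swaps the three 115343360 values for 1153433600 and bounces IIS, so keep the backup it makes and test it somewhere safe first.

#Run elevated on each SharePoint server; the path and values come from the steps above
$config = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config"

#Keep a backup before touching anything
Copy-Item $config "$config.bak"

#Bump every 110 MB (115343360) limit to 1.1 GB (1153433600)
(Get-Content $config -Raw) -replace "115343360", "1153433600" | Set-Content $config

#Reset IIS on this server
iisreset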

Happy PowerPivot’ing.

Share to Yammer button

Had an interesting request a few weeks ago. HR wanted to be able to share pages out to Yammer. I instantly ruled out a workflow because I couldn't post as the user. Then I ruled out a console app for two reasons: the logic could get complicated and it didn't allow for much flexibility. I went over to the Yammer Customer Network and started poking around. I ran across a few threads that talked about using the bookmarklet, so I went and took a look at it: https://www.yammer.com/company/bookmarklet

The bookmarklet, as designed, is aimed at users who want to share pages from their browser, not necessarily from a web page itself. I opened the developer toolbar and dug through the code, and managed to find the JavaScript that snags the URL and opens the bookmarklet window. As Steve would say, "Talk is cheap, show me the code":

<h2>Share this page . . . </h2>

<div align="center">

<a href="javascript:var d=document,w=window,e=w.getSelection,k=d.getSelection,x=d.selection,s=(e?e():(k)?k():(x?x.createRange().text:0)),f= 'https://www.yammer.com/home/bookmarklet',l=d.location,e=encodeURIComponent,p='?bookmarklet_pop=1&v=1&u='+e(l.href)%20+'&t='+e(d.title.replace(/^ *| *$/g,''))%20+'&s='+e(s),u=f+p;a=function()%20{if%20(!window.open(u,'sharer','toolbar=0,status=0,resizeable=1,width=650,height=550'))l.href=f+p};if%20(/Firefox/.test(navigator.userAgent))setTimeout(a,0);else{a()}void(0);">

<img src="https://c64.assets-yammer.com/images/clients/bookmarklet_icon.png" alt="Share to Yammer" border="0" />

</a>

</div>

This code gives you the Share to Yammer button:

Share it with Yammer

YAHTZEE!


Once the user clicks the icon, a new window opens and, as long as they're logged in to Yammer, they'll be able to create a new Yam with the URL of the page they want to share in the body of the update.

Bookmarklet Window example

Give it a shot and let me know what you think.

Azure Storage test drive

For the last year and a half I've been taking one class a semester at MATC in Madison. Having been trained as a Technical Writer, I've basically learned all this sysadmin stuff "on the job," so I figured it would be a good idea to fill in the blanks for the stuff I haven't learned yet. The classes require an external hard drive to house and manage the VMs you use during labs and tests. Being 30 and having a full-time job allows me to buy really cool, really fast hardware to satisfy this class requirement. I opted for a 128GB Vertex 4. This thing SCREAMS. I get labs done in record time.

So how am I supposed to get my homework done if a spaghetti and meatball tornado comes through and wipes out the lower half of Wisconsin, taking my external hard drive with it?

TO THE CLOUD!

I've been using Azure at work for a variety of things, so I figured I'd give this a try. I have three VMs, and with each of them zipped up individually I have about 16 GB total to upload to Azure.

There are three Azure Storage basics you need to know about: storage accounts, containers, and blobs. A storage account is the first thing you need in order to get started.

The storage account sets up the subdomain you'll use to communicate with your storage objects (yourstorage.*.core.windows.net). You also set the affinity group (location) where your content will be stored.
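If you'd rather script it than click through the portal, the classic Azure PowerShell cmdlets can create the account too. A hedged sketch; the account name and affinity group below are just examples:

#Assumes the Azure PowerShell module is installed and your subscription is already connected
#(for example via Import-AzurePublishSettingsFile)
New-AzureStorageAccount -StorageAccountName "inhifistereo" -AffinityGroup "my-affinity-group"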

Once your storage account is up, you'll need a container. Think of a container as a folder – only it's not a folder – it's a container. It holds your blobs – binary large objects (i.e., your files). More on that in a bit.

Click Storage in the left-hand navigation

Click on the storage account name (my account is called "inhifistereo," you can call yours whatever you like)


Click containers at the top of the page > then click New at the bottom of the page


Now give the container a name and choose Public or Private

Private is just that: private, meaning you have to be logged in (or have a shared access key, but that's fodder for another blog post) to access your stuff. A public container is cool because you can access it from anywhere as long as you have the URL. Click the checkmark and we're good to go.

So a container is a container – like a folder, only it's a container. And a blob is a file. The part that took me a second to understand is that this storage isn't like a file share up in the cloud; it's the basic building block of storage in the cloud. A container dictates the access method, and a blob is the big ol' file that sits within the container.
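For the scripters out there, you can skip the portal clicks and create the container with the same Azure PowerShell cmdlets. Another hedged sketch; the container name is a placeholder, and -Permission Off means private while -Permission Blob allows public read access:

#Build a storage context from the account name and one of its access keys
$key = (Get-AzureStorageKey -StorageAccountName "inhifistereo").Primary
$ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey $key

#Create a private container; swap -Permission Off for Blob if you want it public
New-AzureStorageContainer -Name "homework" -Permission Off -Context $ctx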

Now to get content up to Azure. You could write a console app, use PowerShell, or use a third-party tool. For this exercise, I opted for a third-party tool: http://azurestorageexplorer.codeplex.com/. There are other tools out there too.

The files took me basically all day to upload, and there were several reasons for that. For one, I have the most basic broadband package Charter offers. But I'm not doing this for a living or every day, so the time is no big deal; I'm charged by the GB, not the minute, so if it took several days, no biggie. Then again, I'm not getting any younger…

From this blog post, I did learn that the tools above do not upload in parallel, which is why it took so long.

"But David, you have a SkyDrive and Dropbox account along with a hosting account. Why use Azure Storage?" Why not!? The real beauty of Azure Storage is that I only pay for what I use, and I pay pennies at that. SkyDrive and Dropbox require a yearly commitment, and college classes only last 18 weeks, so when the class is over I can blow the container away and I don't get charged anymore. I don't plan on ever using these backups, so they're cheap insurance. Now, having said that, Azure Storage (and Amazon, and Google, etc.) isn't really set up for consumer usage. But I'm not your typical consumer.

I’ll give PowerShell a shot next time and probably try Amazon as well to see if there are any performance differences. If I’m feeling really ambitious I may try doing a console app.
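For what it's worth, here's roughly what that PowerShell upload should look like, using the same storage cmdlets as above. It's an untested sketch on my part, and the local path and container name are made up:

#Same context as before: storage account name plus an access key
$key = (Get-AzureStorageKey -StorageAccountName "inhifistereo").Primary
$ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey $key

#Push each zipped-up VM into the "homework" container as a blob
Get-ChildItem "D:\VMBackups\*.zip" | ForEach-Object {
    Set-AzureStorageBlobContent -File $_.FullName -Container "homework" -Context $ctx
}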

Price-wise, I’ve been charged a total of 15 US cents so far. I may have to go raid the couch cushions…

CRM + SharePoint * Excel Services = Epic Awesomeness

What’s the best way to get someone to eat their vegetables? Force feed them ;)

All kidding aside, CRM can be a pretty powerful tool, and when people don't want to use it, we have to find creative ways to get them to use CRM. In addition to CRM usage, we have people who are absolutely married to Excel. And these aren't your typical Excel files; these are files that legends are made of. Crazy formulas, VLOOKUPs galore; you name it, we use it. To make matters worse, they e-mail these Excel reports around all day, every day as attachments. So let's kill two birds with one stone: we stop a big group of people from e-mailing Excel attachments and we get them to use CRM. Win-win for everyone (or at least that's the hope).

  1. Save the Excel file in an easy-to-find, easy-to-access place in SharePoint – doing this in SharePoint gives us all the document management benefits we've come to know and love
  2. Configure your Excel REST API URL – I've made it pretty clear in the past that I love SharePoint's REST APIs, and the Excel API is no exception. You can read more about it here: http://msdn.microsoft.com/en-us/library/ee556413(v=office.14).aspx
  3. Create a new Web Resource in CRM – we're going to iframe our Excel REST API URL call
  4. Choose Web Page (HTML) as the Type and then click "Text Editor"
  5. Click the Source tab
  6. Paste in your iframe code between the <body> tags. Your code should look like this:
    <iframe src="https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html" frameborder="0" width="4000" height="1300"></iframe>
    • Note the frameborder, height, and width attributes. These are needed to eliminate the nasty border and to make scrolling work correctly. iframes aren't perfect and getting them to work feels "hacky," but the user won't know the difference and it should perform relatively seamlessly in all browsers.
  7. Click Publish
  8. Now, navigate to the desired dashboard and add your new Web Resource, click Save, and Publish.

Users should now see the Excel spreadsheet in their dashboard:

If users do not have access to the spreadsheet, they should encounter an "Error: Access Denied" prompt or a blank screen, depending on the browser they use.

Extra Credit

In our case, the Excel spreadsheet scrolled FOREVER. I wanted to give users a pleasurable experience but I also didn’t necessarily want them resorting to Excel on the client right away. I added a “Click to View in a separate Window” link in the iframe Web resource. Here’s what my code looked like:

<p><a href="https://dude.crm.dynamics.com/WebResources/new_iframe2">Click to View in separate Window</a></p><iframe src="https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html" frameborder="0" width="4000" height="1300"></iframe>

All HTML Web Resources are web pages, so I linked directly to the web page. But notice that I linked to new_iframe2? I didn't want users seeing "Click to View" on every page, so I made an identical Web Resource, except I removed the hyperlink from the top, making for a seamless experience for the user. There are all sorts of other things I could have done on the new_iframe2 page. I could have linked to Excel Web Access or even directly to Excel itself, but we'll leave it like that for now.

Ultimately, I’ve gotten the report builders to stop e-mailing this specific report as an attachment, and now the audience of the spreadsheet has to go to CRM to view it rather than getting it e-mailed to them. Awesome.

Obligatory SharePoint 2013 Search PowerShell post

Maybe you're still kicking the tires on SharePoint 2013, installing it for the first time. Maybe you're in the throes of planning your migration. Maybe you're a consultant who's been stuck on a SharePoint 2010 project for the last 18 months. Whichever it is, we all have to face the music sooner or later and upgrade to SharePoint 2013. When you do, you'll have to set up a Search service. It's not as bad as you'd think. And if you have at least three servers in your farm (one app server and two WFEs), then this script will work for you. Without further delay:

#Config Section
$APP1 = "App1"
$WFE1 = "WFE1"
$WFE2 = "WFE2"
$SearchAppPoolName = "SearchServiceAppPool"
$SearchAppPoolAccountName = "domain\SearchSvc"
$SearchServiceName = "SharePoint Search Service"
$SearchServiceProxyName = "SharePoint Search Service Proxy"
$DatabaseServer = "DBserver"
$DatabaseName = "SP_Search_AdminDB" 

#Create a Search Service Application Pool
$spAppPool = New-SPServiceApplicationPool -Name $SearchAppPoolName -Account $SearchAppPoolAccountName -Verbose 

#Start Search Service Instance on the Application Server
Start-SPEnterpriseSearchServiceInstance $App1 -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $App1 -ErrorAction SilentlyContinue

#Start Search Service Instance on WFEs
Start-SPEnterpriseSearchServiceInstance $WFE1 -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchServiceInstance $WFE2 -ErrorAction SilentlyContinue

#Create Search Service Application
$ServiceApplication = New-SPEnterpriseSearchServiceApplication -Partitioned -Name $SearchServiceName -ApplicationPool $spAppPool.Name -DatabaseServer $DatabaseServer -DatabaseName $DatabaseName 

#Create Search Service Proxy
New-SPEnterpriseSearchServiceApplicationProxy -Partitioned -Name $SearchServiceProxyName -SearchApplication $ServiceApplication
$clone = $ServiceApplication.ActiveTopology.Clone()

#Set variables for component creation
$App1SSI = Get-SPEnterpriseSearchServiceInstance -Identity $App1
$WFE1SSI = Get-SPEnterpriseSearchServiceInstance -Identity $WFE1
$WFE2SSI = Get-SPEnterpriseSearchServiceInstance -Identity $WFE2

#Create Admin component
New-SPEnterpriseSearchAdminComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Processing component
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Analytics Processing component
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create Crawl component
New-SPEnterpriseSearchCrawlComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Create query processing component
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Set the primary and replica index locations; make sure these drives and folders exist on the WFEs before running the script
$PrimaryIndexLocation = "C:\SPSearch"
$ReplicaIndexLocation = "C:\SPSearchReplica"

#We need two index partitions and replicas for each partition. Follow the sequence.
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE1SSI -RootDirectory $PrimaryIndexLocation -IndexPartition 0
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE2SSI -RootDirectory $ReplicaIndexLocation -IndexPartition 0
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE1SSI -RootDirectory $PrimaryIndexLocation -IndexPartition 1
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE2SSI -RootDirectory $ReplicaIndexLocation -IndexPartition 1

$clone.Activate()

#Verify Search Topology
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchTopology -Active -SearchApplication $ssa

This script is actually pretty basic. 90% of the services end up on your App box, while the Index partitions live on your WFEs. You could provision the service on one box or any combination you see fit. That’s the beauty of this model. For me, I like having the Index partition closest to where people will be searching (i.e. the WFEs).

The script should take anywhere from 10-30 minutes to run, maybe longer depending on your hardware. Once done, navigate to your Search Service in Central Admin and this is what you should see in the Search Topology.

The most important thing to remember when using this script is to create the C:\SPSearch and C:\SPSearchReplica directories on your WFEs PRIOR to running it. The script will fail if you don't, and it's a pain to clean up afterward, so create the directories first. I'll probably write in a check to see if the directories exist (and create them if they don't) the next time I use this script to set up an environment.
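In the meantime, here's a rough sketch of what that pre-flight check might look like. It's an assumption on my part that PowerShell remoting is enabled on the WFEs; the server and path variables are the ones from the script above:

#Create the index directories on both WFEs if they don't already exist
foreach ($server in @($WFE1, $WFE2)) {
    Invoke-Command -ComputerName $server -ScriptBlock {
        param($primary, $replica)
        foreach ($path in @($primary, $replica)) {
            if (-not (Test-Path $path)) {
                New-Item -ItemType Directory -Path $path | Out-Null
            }
        }
    } -ArgumentList $PrimaryIndexLocation, $ReplicaIndexLocation
}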

Delete a user in the ASP.NET membership provider

I have a SharePoint farm where I use the ASP.NET membership provider. Now and again I need to remove users, due to separations, job changes, etc., so they can no longer access SharePoint.

Like all of you, I have to research this anew every time I do it. But no more! My future self will thank me.

Using the aspnet_Users_DeleteUser stored proc we can remove users via SSMS.

USE [database name]

EXEC [dbo].[aspnet_Users_DeleteUser]
@ApplicationName = '[Application Name]',
@UserName = '[Username]',
@TablesToDeleteFrom = 15,
@NumTablesDeletedFrom = 0

GO

@TablesToDeleteFrom can be a bit confusing when you open the stored proc up. It's actually a bit mask: each bit corresponds to one of the membership tables, and 15 deletes the user from all of them. I typically have to remove users entirely, so I use 15, but this blog post details some additional options for that parameter: http://vsproblemssolved.blogspot.com/2007/01/using-sqlmembershipprovider.html

@NumTablesDeletedFrom is an OUTPUT parameter: the proc writes back the number of tables it actually deleted the user from, so whatever you pass in is just a placeholder. You can capture that value to inspect the result (see the sketch below), but I'm usually not that ambitious.
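If you do get ambitious, here's a hedged sketch of running the same thing from PowerShell and capturing that output. It assumes the sqlps/SQL Server cmdlets are available, and the server, database, application name, and username below are all placeholders:

#Call the proc with a real OUTPUT variable and read back how many tables were touched
Invoke-Sqlcmd -ServerInstance "DBserver" -Database "aspnetdb" -Query @"
DECLARE @count int;
EXEC [dbo].[aspnet_Users_DeleteUser]
    @ApplicationName      = '/',
    @UserName             = 'jdoe',
    @TablesToDeleteFrom   = 15,
    @NumTablesDeletedFrom = @count OUTPUT;
SELECT @count AS NumTablesDeletedFrom;
"@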

PerformancePoint 2013 and Tabular data sources #fail

I spin up my spiffy new SharePoint 2013 environment, migrate my PerformancePoint databases, and then try to hit the dashboards. I’m greeted by all kinds of errors. Take note: we’re using more and more tabular SSAS data sources at Trek.

Fast forward a few days. I open a ticket with Microsoft and begin troubleshooting. The engineer was a helpful chap. He had an idea from the get-go of what the problem was, but wanted to check a few things in the environment before we went and started installing stuff.

Long story short: you need to install ADOMD.NET from the SQL Server 2008 R2 feature pack if you want to hit tabular data sources (LINK - you'll find the correct package towards the bottom of the Install Instructions section). #lamesville #sql2012hasbeenoutforalmostayear

I asked the engineer to send me the TechNet article stating this, to which he replied, "Wish I could." Nowhere in any of Microsoft's documentation does it state that you need to install this feature pack in order to use tabular data sources in SP 2013. He did, however, send me this blog post, so kudos to him for that: http://blogs.technet.com/b/microsoft_in_education/archive/2013/04/29/configuring-performancepoint-in-sharepoint-2013.aspx