AIIM’s 2015 SharePoint Survey Published: Adoption, Executive Commitment and Governance Still Major Challenges

AIIM has just published its annual white paper based on a survey of SharePoint users, collecting responses from more than 400 organizations.  The key findings echo last year’s survey and show that successful implementation of SharePoint continues to be a challenge for many organizations.

You can download the white paper here.


Key findings include:

  • Many organizations have not upgraded to the latest version of SharePoint (either 2013 on-premises or Office 365).  The most popular implemented version currently is SharePoint 2010.
  • An increasing number of organizations are now supporting multiple versions of SharePoint: over 50% of respondents were supporting multiple versions.  In many cases this is caused by organizations not committing to a full migration from older versions.  For example, an organization with SharePoint 2007 might leave all its existing collaboration sites running in 2007 while implementing 2010 or 2013 for a new intranet.
  • Key challenges for stalled or challenged implementations are lack of senior management commitment, lack of training and lack of information management planning.
  • Most organizations are still struggling with moving to the cloud, with 33% being undecided on whether they will move SharePoint to the cloud.

The findings line up with our experience – most of our engagements focus on helping organizations define their SharePoint program strategy, information management practices and basic user adoption approaches, as these continue to be challenges in adopting the platform whether on-premises or in the cloud.

Read More

Microsoft Now Has Two Streaming Analytics Solutions: Apache Storm and Stream Analytics

Microsoft has launched its new Apache Storm service as part of HDInsight.  In addition, Microsoft also has its own Stream Analytics service that you can use for similar purposes, e.g. processing incoming unstructured data in real time for analytics purposes.

Which service should you use for your real-time analytics processing?  Here is a high-level comparison between the two services based on their current incarnations (both are expected to evolve over the coming months).

                       HDInsight Apache Storm                Stream Analytics
  Service Status       General Availability                  Preview only
  Deployment Model     PaaS                                  PaaS
  Pricing Model        Priced per node instance              Priced per volume and “streaming units”
  Product Ownership    Open source (Apache)                  Microsoft
  Language Support     Java, C#, Python                      SQL-like language specific to Stream Analytics
  REST API             Available, but only for monitoring    Available for development
  Maintenance          Patching and maintenance included     Patching and maintenance included
  Interoperability     Azure Event Hub, Azure Service Bus,   Event Hub, Azure Blob Storage
                       Apache Kafka, Apache Cassandra,       and Azure SQL
                       HDFS, SQL Azure Database

Read More

Extended Retention of Deleted Email Messages in Office 365

Microsoft has just announced that it is extending the period for which it retains deleted emails in Office 365.  Previously, if you moved an email to the Deleted Items folder, it would disappear after 30 days.  Retention is now indefinite by default, and the period can be configured by the Office 365 administrator.  Office 365 administrators can also create custom retention policies for email.

From Microsoft’s announcement:

“If you are an Office 365 administrator, this means we’ll be updating the Default MRM Policy for everyone using Exchange Online over the next month. As an administrator, you also have control over this behavior. If you want to keep the 30-day policy or set a custom retention period, that can be done as well and you don’t even need to wait for the change. Also, if you have already created a custom MRM policy (as long as it has a name other than “Default MRM Policy”), you don’t need to do anything and the change will not impact you.”


Read More

New Power BI Designer Preview Released with Updates

Microsoft has just released a new version of Power BI Designer Preview.  There is a video here that explains the new improvements.

Key improvements include:

  • Performance improvements to query loading and Excel import
  • Addition of a Dynamics CRM Online Connector
  • Navigator dialog improvements for previewing queries
  • Ability to add calculations for date/time columns
  • Usability improvements to field list display
  • Improvements to keyboard support
  • Various bug fixes

Read More

Shrink Your WordPress Azure Web Site Memory Usage by Switching Comments Form to Disqus

As traffic on this web site has increased, memory usage on Azure Web Sites has been creeping up, overwhelming the quotas on my Shared plan.  A Shared instance of Azure Web Sites is allocated 512 MB of “memory usage” per hour, which can be exhausted quickly if you get a reasonable amount of traffic.


One simple way to increase your quota is to scale up the number of instances – each new instance also increases your quota.  My current configuration runs on 2 shared instances, which provides a quota of 1024 MB / hour.  I have scaled this up to 3 instances when there has been noticeably higher traffic.

One of the ways I have found to keep this usage contained is pretty simple – I switched my comments form to Disqus.  I was receiving approximately 500-1000 comments a day, all of them spam, which I’m convinced was increasing the load on my web site by at least 50-100%.  Switching to Disqus has eliminated the spam entirely and, more importantly, completely offloads comments to another service.

Turning on Disqus is pretty easy in WordPress – just sign up for an account and install the Disqus plugin, and it will replace your default WordPress comment form.

Read More

Pushing Data into Power BI Preview Using the New REST API – Part 2

In my previous blog post, I described how to push some basic test data into Power BI Preview.  I have also uploaded the sample code to GitHub.  The previous article covers the basics of creating a dataset and adding rows – this article expands on that scenario by adding a Web API layer and pushing data from an AngularJS web site.

Pushing Data from a Public Web Site

Imagine a public web site with a set of voting buttons:


This little widget uses Bootstrap and AngularJS to manage the user interface.  In the backend, we created a basic ASP.NET Web API project to receive the votes triggered by clicking one of these buttons.

AngularJS is a great framework for integrating with ASP.NET Web API because both support JSON.  The AngularJS controller simply creates a vote entity, serializes it to JSON and sends it to a custom built vote controller.


Authenticating with a Service Account

There are two ways we can get a token from Azure AD – we can redirect the user to a Microsoft authentication page and have them log in, or we can supply a username and password as a service account.  We used the second option in this scenario to simulate an anonymous public web site experience.

NOTE: I have not put in anything to encrypt username and password in the sample code – this is something you would want to do in a production scenario in your web.config.

Processing the Incoming Data and Adding Rows

When the controller receives the JSON object, it authenticates against Azure AD with a preconfigured service account.  This allows us to authenticate within the Web API controller and act as a proxy for public access.  The configuration parameters (username, password, dataset name) are stored in the web.config and/or the Azure web site configuration when you publish the web site to Azure.

The controller uses the PowerBITransferService class to manage all the interactions with Power BI Preview – see the previous post for a more detailed explanation of what is going on under the covers.

For our scenario, we take the incoming vote and add it to the data by creating a VoteRow that has: 1) a vote count equal to 1 (so we can count votes); 2) a vote date, which is now; 3) a vote name (so we can group and filter by name); and 4) the value of the vote (from 1 as very unsatisfied to 5 as very satisfied).  The vote value allows us to create an average vote from 1 to 5 based on the rows provided.
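As a rough sketch, the receiving controller described above might look something like the following.  VoteController, the Vote DTO and the configuration key name are illustrative; PowerBITransferService is the class mentioned in this post, but its exact method signatures are assumptions based on the earlier console sample:

```csharp
using System;
using System.Collections;
using System.Configuration;
using System.Web.Http;

// Illustrative DTO matching the JSON the AngularJS controller posts.
public class Vote
{
    public string Name { get; set; }   // which widget/question was voted on
    public int Value { get; set; }     // 1 (very unsatisfied) to 5 (very satisfied)
}

// Row shape pushed into the Power BI dataset, as described above.
public class VoteRow
{
    public int VoteCount { get; set; }      // always 1, so rows can be summed
    public DateTime VoteDate { get; set; }  // when the vote was received
    public string VoteName { get; set; }    // used for grouping and filtering
    public int VoteValue { get; set; }      // 1-5, used for averaging
}

public class VoteController : ApiController
{
    public void Post(Vote vote)
    {
        // Authenticate with the preconfigured service account; the
        // credentials live in web.config / the Azure web site configuration.
        var service = new PowerBITransferService();
        service.Login();

        var rows = new ArrayList();
        rows.Add(new VoteRow
        {
            VoteCount = 1,
            VoteDate = DateTime.Now,
            VoteName = vote.Name,
            VoteValue = vote.Value
        });
        service.AddRow(ConfigurationManager.AppSettings["DatasetName"], "votes", rows);
    }
}
```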

Testing the Results

Here is a sample of the resulting Power BI Preview Dashboard.   This dashboard is updated in real time every time I click one of the voting buttons.


Read More

Pushing Data into Power BI Preview Using the New REST API – Part 1

Microsoft has just published a new REST API for Power BI Preview that allows you to push data into datasets sitting in their cloud environment.  Over the weekend, I put together some test code to explore the possibilities of the new API.  (You can find the sample code here on GitHub.)  The REST API is still quite limited in its abilities but supports a key scenario for the new Power BI Preview service – pushing data into Power BI Preview in real time.

Getting Started

In order to access the Power BI Preview REST API, you will need to authenticate your application and your user identity through Azure Active Directory.  The way you do this is to set up an Azure AD and create a profile for your application under configuration.


The client ID is a key supplied by Azure that you include when you pass in credentials from your application.  The Redirect URI is the page for logging into Azure AD – for a console app this can be any URI you register, and for a web application it should be the page you create to receive the token that Azure AD generates when you authenticate.  The page needs to be registered here and needs to match what is passed in along with the authentication request.

You also need to grant permissions to the Power BI Service in order to use the REST API to push data.

Now that this is configured you can start building an application.

Scenario #1: Building a Basic Console Application

My first attempt was to build a basic console application that created a test dataset and pushed in some data.  In building this application, I also built a PowerBIDataTransferService class that manages the various interactions with the REST API.  I can use the same class with the second scenario below.

The Power BI Preview API is all JSON and HTTP based – you send commands with JSON data as part of your HTTP call, and Power BI Preview responds with an HTTP response, typically with JSON data included.
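As a sketch of this request/response pattern (the dataset URI is the preview endpoint at the time of writing, and `token` and `json` stand in for the Azure AD token and the serialized payload – this is not the post’s exact helper code):

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

// Assumed inputs: an Azure AD access token and a serialized JSON payload.
string token = "<azure-ad-access-token>";
string json = "{ \"name\": \"testDataset\", \"tables\": [] }";

// JSON goes in the request body; the token goes in the Authorization header.
var request = (HttpWebRequest)WebRequest.Create("https://api.powerbi.com/beta/myorg/datasets");
request.Method = "POST";
request.ContentType = "application/json";
request.Headers.Add("Authorization", "Bearer " + token);

byte[] body = Encoding.UTF8.GetBytes(json);
using (Stream stream = request.GetRequestStream())
{
    stream.Write(body, 0, body.Length);
}

// Power BI Preview replies with an HTTP status code and, typically, JSON.
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(response.StatusCode);
    Console.WriteLine(reader.ReadToEnd());
}
```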

Logging into Power BI REST API

The first step is to login.  The login method looks like this:

public void Login()
{
    //Create a new AuthenticationContext, passing an Authority.
    AuthenticationContext authContext = new AuthenticationContext(authority);

    //Get an Azure Active Directory token by calling AcquireToken
    if (!string.IsNullOrEmpty(Username))
    {
        UserCredential user = new UserCredential(Username, Password);
        token = authContext.AcquireToken(resourceUri, clientID, user).AccessToken.ToString();
    }
    else
    {
        token = authContext.AcquireToken(resourceUri, clientID, new Uri(redirectURI)).AccessToken.ToString();
    }
}

The class supports two different scenarios: 1) you use Microsoft’s login URI for console apps, and when you run the app it will prompt you to log in; or 2) you supply a username and password directly.  In either scenario, the key thing you get back is a token from Azure AD that you pass into each of your Power BI Preview REST API calls.

Creating a Dataset

The next step is to create a dataset.  A dataset is a collection of tables, and tables have columns that can be one of the following types: int64, bool, DateTime, string and double.  Creating a dataset involves structuring JSON data to represent this schema.  I used this web site to model C# classes that could be easily serialized to the required JSON.  I then used JSON.NET to serialize the dataset schema to JSON and send it off to Power BI Preview.
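A minimal sketch of what such serializable classes might look like – the class and property names here are assumptions modeled on the JSON shape the preview API expects ({ name, tables: [ { name, columns: [...] } ] }), not necessarily the sample code’s exact design:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

public class Column
{
    public string name { get; set; }
    public string dataType { get; set; }  // int64, bool, DateTime, string or double
}

public class Table
{
    public string name { get; set; }
    public List<Column> columns { get; set; }
}

public class DatasetSchema
{
    public string name { get; set; }
    public List<Table> tables { get; set; }
}

public static class SchemaExample
{
    public static string Build()
    {
        var schema = new DatasetSchema
        {
            name = "testDataset",
            tables = new List<Table>
            {
                new Table
                {
                    name = "testTable",
                    columns = new List<Column>
                    {
                        new Column { name = "TestColumnInt", dataType = "int64" },
                        new Column { name = "TestColumnString", dataType = "string" }
                    }
                }
            }
        };
        // JSON.NET turns the object graph into the JSON body for the POST.
        return JsonConvert.SerializeObject(schema);
    }
}
```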

For sending basic DatasetRequests and HTTPRequests, I borrowed some code from Microsoft’s sample code – you can find the original code here.

Using this approach, our CreateDataset method is quite simple:

/// <summary>
/// Creates a dataset based on a DatasetSchema.
/// </summary>
/// <param name="Schema">Dataset schema representing the definition of the dataset, including the dataset name, tables and columns for each table.</param>
/// <returns>Created dataset as a .NET object.</returns>
public Dataset CreateDataset(DatasetSchema Schema)
{
    //Create a POST web request to create the dataset
    HttpWebRequest request = DatasetRequest(datasetsURI, "POST", token);
    PostRequest(request, JsonConvert.SerializeObject(Schema));

    //Look up the newly created dataset by name
    return FindDataset(Schema.name);
}

When you create the dataset in Power BI Preview successfully, you will see an empty dataset in Power BI Preview with your table structure.  In my test console application, I created a table with a test int, test date, test bool, test double and test string column.

NOTE: There doesn’t yet seem to be a REST API method for deleting a dataset, nor methods for altering the dataset schema.

Adding Rows

Once you have a dataset created, you can now add rows to the table.  I created a simple test that pushed a row with a random int every 5 seconds.

public static void AddRows(Object myObject, EventArgs myEventArgs)
{
    Random random = new Random();
    double randomDouble = random.NextDouble() * 5;
    int randomInt = random.Next(1, 5);

    ArrayList rows = new ArrayList();
    rows.Add(new TestRow()
    {
        TestColumnBool = true,
        TestColumnDateTime = DateTime.Now,
        TestColumnDouble = randomDouble,
        TestColumnInt = randomInt,
        TestColumnString = "test"
    });

    PowerBI.AddRow(dataset, "testTable", rows);
}

Again, in this case I have encapsulated the raw JSON by allowing you to use a basic value object defined in C# that is then serialized dynamically into JSON when the request is sent.  The translation works by inspecting the object and translating the public properties into the appropriate JSON values.
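One way such a translation could be implemented is with reflection – enumerate the public properties of each row object and emit them as JSON values.  This is a sketch of the general approach, not necessarily the sample code’s exact implementation:

```csharp
using System.Collections;
using System.Collections.Generic;
using System.Reflection;
using Newtonsoft.Json;

public static class RowSerializer
{
    // Builds the { "rows": [ { "Column": value, ... } ] } payload that the
    // AddRows call sends, by reflecting over each row object's public properties.
    public static string SerializeRows(ArrayList rows)
    {
        var rowList = new List<Dictionary<string, object>>();
        foreach (object row in rows)
        {
            var values = new Dictionary<string, object>();
            foreach (PropertyInfo property in row.GetType().GetProperties())
            {
                values[property.Name] = property.GetValue(row, null);
            }
            rowList.Add(values);
        }
        return JsonConvert.SerializeObject(new { rows = rowList });
    }
}
```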

Testing It Out

If you are successful in sending the right REST API calls, you will see a new dataset created and the table being populated with rows.  The cool thing about the new Power BI Preview is that dashboards are updated in real time, so if you have created a graph you will see it update as data is added.  Here is an example of the data I added from my test application.


Read More

Pushing Data into Power BI Preview Changes the Game

One of the key improvements in the new Power BI Preview is the ability to push data into datasets instead of pulling data from data sources such as Excel, SQL, etc.  This is a big change because it creates the possibility of incrementally updating dashboards in real time and providing the ability to integrate Power BI Preview with analytics engines that push data such as Hadoop, Azure Streaming Analytics or sensors generating data in real time.

A good example of this is the demo just published that shows the integration of Azure Stream Analytics and Power BI Preview – Stream Analytics pushes data into Power BI so that dashboards can be updated in near real time.

Another example is the new REST API – Microsoft has published a new API that allows you to push data into a dataset instead of requiring you to preconfigure a dataset to pull data.  This API will allow programmers to send data into Power BI incrementally and in real time.

Imagine the scenarios that this will start to enable in the near future:

  • Pushing data from Hadoop into Power BI to provide a dashboarding layer on top of analytics processing jobs.
  • Pushing data in real time through Azure Streaming Analytics
  • Pushing data from web sites, custom applications, etc.

The current REST API is quite limited but the potential is massive…stay tuned as it will inevitably improve over the coming months.

Read More

Power View, Power BI and Power BI Preview User Interface Compared

I built a demo dashboard for managing tickets for an IT department using Power View, and then uploaded it to both the current Power BI in SharePoint and the new Power BI Preview environment.  Here are the different user interfaces for the same dashboard.

Excel 2013 Power View

Here is what the dashboard looks like in Excel 2013…


SharePoint 2013 Power BI (Silverlight)

Pretty close to the original, and fonts are the same as the original.  Everything is a little bit smaller because of the extra chrome.  KPI checkmarks are also quite a bit smaller.


SharePoint 2013 Power BI (HTML 5)

Fonts aren’t quite right…


Power BI “App”

Microsoft also has a Power BI App in SharePoint.  The dashboards are the same but the Office Online chrome is removed.


Power BI Preview

Same dashboard now in the new Power BI Preview environment.  The grid layout has shifted (notice how the chart no longer spans the entire bottom of the screen), the fonts are not the same as the original, and everything is a little bit smaller on the screen.


Power BI Preview Dashboard

Power BI Preview also has an interactive dashboard that you can pin visualizations to in order to combine data from multiple reports.  Here is what the same report looks like as a new Power BI Preview dashboard.  You can see that the colors have shifted and the fonts are now controlled by Power BI.  In general, fonts are quite small, especially on a tablet or mobile device.

Note: You cannot pin tables to a dashboard yet…only charts.


Read More

SharePoint Online Audience Compilation Only Occurs Once a Week!

I was building a demo for one of our clients and decided to experiment with content targeting through audiences.  I defined an audience called “Executive Management” with the department set to Executive Management.


I waited for a few hours but no compilation of the audience occurred.  I did some research and found this page that states that SharePoint Online compiles audiences once a week!

The Audience Compilation Timer Job for the User Profile Service Application runs weekly on Saturdays at 1 am in the time zone of your data center.  You can check the Timer Job Schedule report in the Service Administration portal to see the job’s last run time.

While on-premises you can access Central Admin and run this job manually, there is no such option in SharePoint Online, so plan your audiences carefully and be prepared for slow updates to this feature.

Read More