Microsoft has published a Visio stencil and PowerPoint template that provides symbols for all the common Azure services. You can download the stencil and PowerPoint template here.
We have a folder in our intranet for all our sales pursuit documents. This document library contains all our pitches, presentations, proposals, etc. that we use to pitch work to our prospects and clients. We added some columns using SharePoint Managed Metadata Term Sets so that we could tag these files and we created a view that allows for filtering against these columns.
Here is an example of our Industry column:
We love the idea of using these columns to filter a master list of documents. In general, the interface works well: you can click on any of the terms and it will filter your list view down to the documents tagged with that term.
Unfortunately, we found the rendering of the page was very slow – e.g. about 15–20 seconds to load a single page. After spending some time adjusting the view by adding and subtracting columns, our tests produced some interesting results:
We have removed this term set from our view for the moment to speed up the page rendering and we’re investigating…
Recognizing that there is a great deal of concern about storing sensitive data in the cloud and having it accessible to hackers, the NSA, industrial espionage, etc., Microsoft has invested significant effort in a completely encrypted storage offering (to be launched in July) for Office 365 and OneDrive. This is in addition to the platform-level security features already in Office 365.
Fort Knox is Microsoft’s encrypted storage offering for Office 365. Microsoft has published some additional technical details in one of their SharePoint conference sessions. The video for this session is here. Here are the key technical details.
SharePoint has supported storage of content outside the default SQL Server-based content database since SharePoint 2010. Fort Knox takes a similar approach but improves on the architecture.
Files are stored encrypted in Azure Blob storage transparently to the end user accessing the file from SharePoint.
When a Fort Knox file is stored in Azure, it is split into several fragments. Each fragment is encrypted (using 256-bit AES encryption) with its own key. Each of these fragments is stored in a separate Azure container that is generated on demand.
This shredding architecture allows for massive scalability of storage and more importantly, very strong security at the file level. Imagine the challenge of having to reconstruct a set of fragments spread across dozens of containers, each encrypted with its own key.
These keys are also regenerated every day, making it even more difficult to gain access to the raw storage.
A master key is used to encrypt the keys that encrypt each of the fragments. These encrypted keys are stored in the content database, and the master key is stored in a separate key store.
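The shred-encrypt-wrap flow described above can be sketched in a few lines. This is a minimal, dependency-free illustration of the structure only: a random XOR pad stands in for AES-256, and in-memory lists stand in for Azure containers and the content database. None of the names below come from Microsoft's implementation.

```python
import os

FRAGMENT_SIZE = 4  # tiny for illustration; real fragments are far larger

def xor_bytes(data, key):
    # Stand-in cipher: XOR with a random key the same length as the data.
    # Fort Knox uses AES-256; XOR keeps this sketch dependency-free.
    return bytes(b ^ k for b, k in zip(data, key))

def shred_and_encrypt(blob, master_key):
    """Split a file into fragments, encrypt each with its own key,
    and wrap every fragment key with the master key."""
    containers = []    # each fragment lands in its own (simulated) container
    wrapped_keys = []  # stored in the content database alongside metadata
    for i in range(0, len(blob), FRAGMENT_SIZE):
        fragment = blob[i:i + FRAGMENT_SIZE]
        frag_key = os.urandom(len(fragment))  # a unique key per fragment
        containers.append(xor_bytes(fragment, frag_key))
        # Toy key wrap; the real service wraps keys properly, and the
        # fragment keys are regenerated daily.
        wrapped_keys.append(xor_bytes(frag_key, master_key[:len(frag_key)]))
    return containers, wrapped_keys

def reassemble(containers, wrapped_keys, master_key):
    """Only a holder of the master key can unwrap the fragment keys
    and stitch the fragments back together."""
    parts = []
    for ciphertext, wrapped in zip(containers, wrapped_keys):
        frag_key = xor_bytes(wrapped, master_key[:len(wrapped)])
        parts.append(xor_bytes(ciphertext, frag_key))
    return b"".join(parts)

master_key = os.urandom(FRAGMENT_SIZE)
doc = b"quarterly-report"
fragments, keys = shred_and_encrypt(doc, master_key)
assert reassemble(fragments, keys, master_key) == doc
```

The sketch makes the security argument concrete: an attacker who obtains one container has one encrypted fragment and no key, while the content database holds only wrapped keys that are useless without the master key.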
Because the master key is stored online in Microsoft’s key store, someone with access to that key could still decrypt all the fragment keys and then use them to decrypt the underlying storage. This is less of an issue in a hacker scenario (possible, but tougher to accomplish given the fragmentation across tiers) and more of an issue for an NSA-style “request” for your data. Were Microsoft to comply with such a request, they could ultimately still provide access to your master key and decrypt the information.
The only real solution is to have master keys generated off the grid, so that they could not be requested at all and would not be in your cloud provider’s hands to hand over on request. However, this would be difficult to implement while keeping a usable business productivity portal, because you would still need the master key to decrypt the files.
Visual Studio 2013 was released late last year, but there have been no business intelligence tools available for it. If you wanted to build cubes, SSIS packages, etc., you were stuck using Visual Studio 2012.
As with the previous version, when you install the add-in, select the option for “Perform a new installation of SQL 2014” even if you have an existing database.
Microsoft has announced that Power Map will now be available to all Office 365 subscribers whose plans include Office. This is a change: Power Map was previously only available through the Power BI add-on subscription.
In addition, the latest Power Map update adds the ability to loop tours continuously – apparently people are using tours for kiosk displays and demos, and there was no way to repeat a tour in the previous version.
BizTalk 2013 R2 has just been released. As the cloud grows, the need for more complex integration scenarios means that BizTalk and other integration technologies may receive renewed focus from enterprises trying to integrate a combination of external organizations (suppliers, vendors, etc.), custom applications, legacy applications, and various data feeds, whether those feeds come through a traditional internal network, a cloud network, or a hybrid of the two working together to optimize the integration flow.
BizTalk has taken a bit of a back seat in the Microsoft product family as parts of BizTalk, such as workflow and communications, have been moved into the .NET Framework itself. Around 2011, there were a lot of “Is BizTalk Dead?” conversations as Microsoft moved to the cloud and started promoting Azure services, while there was little improvement to the existing BizTalk engine.
In addition, the concept of a message bus has been introduced into Azure as a specific service (Azure Service Bus) that can run without BizTalk. Microsoft has now also launched BizTalk Services, a PaaS offering for BizTalk, and you can also run BizTalk on Azure as an IaaS virtual machine.
Microsoft has at least three key integration technologies – here is how they compare in terms of functionality, features and pricing:
| | BizTalk Server 2013 R2 | BizTalk Services | Azure Service Bus |
|---|---|---|---|
| Deployment model | On-premises or IaaS | PaaS | PaaS |
| High availability, backup/restore, DR | Yes | Yes | Yes |
| Custom code, scripting | Yes | Yes | No |
| Long-running processing | Yes | No | No |
| Business Activity Monitoring (BAM) | Yes | No | No |
| Service-oriented architecture / ESB | Yes | No | No |
| Pricing | Per core (4-core minimum) on-premises, or per hour in IaaS | Per hour | Per transaction or per relay hour |
As you can see, the Azure Service Bus is more of a developer tool than an enterprise integration bus. It provides basic queues, topics and relays, but the rest is up to you – it lacks the mapping, adapters and enterprise monitoring needed to compare with BizTalk. BizTalk Services has come a long way and can compete well with the on-premises product, with two big exceptions: 1) lack of HL7/HIPAA support for healthcare organizations; 2) a number of functoids do not exist in BizTalk Services.
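To make the "basic queues and topics, the rest is up to you" point concrete, here is a toy model of topic/subscription semantics. This is not the Azure SDK – just a hypothetical sketch of what Service Bus gives you (fan-out of each message to every subscription, FIFO delivery per subscription) versus the mapping, adapters and monitoring you would have to build yourself.

```python
from collections import defaultdict, deque

class Topic:
    """Toy model of a Service Bus topic (not the Azure SDK): every
    subscription receives its own copy of each published message."""
    def __init__(self):
        self._subs = defaultdict(deque)

    def subscribe(self, name):
        self._subs[name]  # create the subscription's queue if missing

    def publish(self, message):
        # Fan out: each subscription gets an independent copy.
        for queue in self._subs.values():
            queue.append(message)

    def receive(self, name):
        # FIFO receive from one subscription; None when it is empty.
        return self._subs[name].popleft() if self._subs[name] else None

orders = Topic()
orders.subscribe("billing")
orders.subscribe("shipping")
orders.publish({"order_id": 42})
print(orders.receive("billing"))   # -> {'order_id': 42}
print(orders.receive("shipping"))  # -> {'order_id': 42}
```

Everything beyond this – message transformation, protocol adapters, BAM-style tracking – is exactly what the table shows Service Bus leaving to you.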
The following are key new features in BizTalk 2013:
Is this enough to have a BizTalk conversation in your enterprise?
Microsoft and SAP have announced that they will be expanding their long partnership in the following ways:
In SharePoint 2013, there are three types of application hosting models:
The goal of the Autohosted app model was to provide an easy way for developers to provision Azure resources such as web sites and SQL databases when deploying their custom apps.
As of Friday, Microsoft has discontinued the Autohosted app model. The program will officially end June 30, 2014, and if you have developed an Autohosted app, you are encouraged to move it to the provider-hosted app model instead.
Incidentally, if you look for information on Autohosted apps on MSDN, you’ll see the following error:
Microsoft has introduced a new Microsoft Azure File Service, which provides a cloud-based file share that is accessible across all of your virtual machines running in Azure. Unlike Azure’s blob storage, the Microsoft Azure File Service provides access via the standard SMB 2.1 protocol used by on-premises file shares. Whether your VM is running Windows or Linux, you can copy, create, move or delete files just like on a traditional network file share.
This capability is key for moving legacy applications to the cloud. Many of these applications use traditional file shares for storing files, and without the Microsoft Azure File Service you were limited to file shares local to a single VM rather than shared across a number of VMs. With the new File Service, your legacy apps can run within a VM and manipulate files in exactly the same way as if they were running on premises.
In addition to the traditional file share protocol, a REST API is also provided for managing files from your client-side or server-side custom applications.
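Because the share speaks standard SMB, legacy code needs no changes at all: ordinary file APIs just work against the mounted share. The sketch below uses plain Python file operations; the share path is an assumption (a temp directory stands in for a mounted Azure file share so the example is runnable anywhere).

```python
import os
import shutil
import tempfile

# In production this would be the mounted Azure file share, e.g. a UNC
# path like \\<account>.file.core.windows.net\<share> on Windows or a
# CIFS mount point on Linux. A temp directory stands in for it here.
share = tempfile.mkdtemp()

# Ordinary file operations work unchanged over SMB.
src = os.path.join(share, "report.txt")
with open(src, "w") as f:
    f.write("uploaded via SMB")

# Move the file into an archive folder on the same share.
dst = os.path.join(share, "archive", "report.txt")
os.makedirs(os.path.dirname(dst), exist_ok=True)
shutil.move(src, dst)

with open(dst) as f:
    print(f.read())  # -> uploaded via SMB
```

The point is that nothing in the code knows it is talking to cloud storage – which is exactly why legacy apps can be lifted into a VM without modification.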
Until now, Azure has been strictly for servers, not desktop clients: running Windows 7 or 8 in an Azure VM is technically a violation of the Windows license.
Microsoft has recognized that, as developers, we sometimes need a client for testing purposes. Starting this week, MSDN subscribers can spin up a Windows 7 or 8.1 client VM and use it for testing.