Philly .net Code Camp Follow-up

A quick post to list a couple of resources mentioned in the session today.

First, if you are starting Office development for Office 365, Office Add-ins, or SharePoint Add-ins, you need to start with dev.office.com.

Second, the code presented is up on my GitHub repository.  This contains the final code I showed today.  You do need to update the web.config with the values for your tenant.

Finally, keep an eye out for video of the presentation.  I will post a note here, so subscribe if you are interested.  The video will actually be a series of smaller videos, each focusing on one aspect of the presentation.  This will allow me to go into more depth on some of the more technical content.

There was a lot to cover in this session.  If you have any questions, please let me know.

SharePoint Saturday: Building Applications with Office 365 Unified API

In October 2014, the Office 365 team released a set of APIs to communicate with the Office 365 platform. The API was great for reading and writing content.  It also integrated nicely with Azure AD and the Active Directory Authentication Library (ADAL).  The API exposed specific features, mainly Mail, Calendar, Contacts and Files.  In addition, these same APIs provided a connection into the SharePoint sites on your tenant.  Communication was mainly through REST calls, but the URL for these services differed: Mail, Calendar and Contacts all pointed to the https://office365.outlook.com/ URL, while Files and SharePoint used the URL of the authenticated user's tenant.  Helping to sort all of the URLs out was the Discovery Service which, given the authenticated user and the specific resource in question, would return the appropriate base URL.
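For reference, the Discovery Service call itself is just an authenticated GET.  Here is a minimal C# sketch, assuming an access token for the Discovery Service resource has already been acquired with ADAL; the endpoint shown is the v1.0 discovery contract.

// Minimal sketch: ask the Discovery Service where each capability lives.
// Assumes an ADAL-acquired access token for the Discovery Service resource.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class DiscoveryExample
{
    static async Task<string> GetServicesJsonAsync(string accessToken)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // Returns the service entries (Mail, Calendar, Contacts, MyFiles, ...)
            // for the signed-in user; each entry carries the resource id and
            // endpoint URI to use for the follow-up calls.
            var response = await client.GetAsync(
                "https://api.office.com/discovery/v1.0/me/services");
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}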

At Build 2015, the Office Dev team announced a preview of the next version of this API, called the Unified API.  The Unified API serves all API calls from a common URL, https://graph.microsoft.com/, and provides one token that is valid for all of the resources defined in the delegated permissions of the application configuration.
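To make the difference concrete, here is a minimal sketch of a call against the Unified API, assuming a token acquired for the https://graph.microsoft.com/ resource and the preview /beta endpoint that was available at the time.

// Minimal sketch: one token, one base URL for the Unified API preview.
// Assumes the token was issued for "https://graph.microsoft.com/" with the
// delegated permissions configured on the Azure AD application.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class UnifiedApiExample
{
    static async Task<string> GetMeAsync(string accessToken)
    {
        using (var client = new HttpClient())
        {
            client.BaseAddress = new Uri("https://graph.microsoft.com/");
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // The same token works for /me, /me/files, /me/messages, and so on,
            // instead of a different resource URL per workload.
            var response = await client.GetAsync("beta/me");
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}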

I am giving a session on this topic at two SharePoint Saturday events in July: one on July 18 in Baltimore and the other on July 25 in NYC.

The slides for the session are available on my OneDrive.  They started from the slides for the Build session, edited for the content I wanted to present.

Connecting OneNote to WordPress

I’m a huge OneNote fan.  I’ve actively used the product since it was first released with Office 2003.  Even with the local storage of OneNote files, it was easy to pick things up and move to a new machine or rebuild when needed.  Take this to the current day and all of this content is stored in my OneDrive, so it follows me from device to device or from a machine rebuild and a login with my Microsoft Account.

I’ve also been a huge fan of LiveWriter.  It works with my WordPress.com site as well as SharePoint and other blogs.  There is a rich supply of add-ins created by a passionate community.  However, the last release is from 2012.  While it does continue to work well (I am using it to create and publish this post), people within Microsoft are trying to get this upgraded or at least moved to an open source project so the community can take over this product.  If you are interested, go to the link and retweet.

Note: if you are looking for LiveWriter, this blog post has the correct links.  Use this to get the latest version and not your favorite search engine.

So when the OneNote team announced that you ‘could connect OneNote to WordPress’, I was excited that this would provide an updated solution to posting content to WordPress.  I’ll be honest, I was truly disappointed in the solution. 

The solution is actually a plug-in for WordPress.  It contains files you deploy to your WordPress server, and these files add a toolbar button to the WordPress post creation window.  This button looks only at the OneNote notebooks stored in your personal OneDrive.  OneDrive for Business, SharePoint, and local OneNote notebooks are not supported in this plug-in because of OneNote API limitations.  This may evolve in the future, but it is the current state.  Select the page that contains your blog post content and it appears in the WordPress text box for you to continue editing.  It also looks like this solution is not available if your blog is hosted on WordPress.com.

While it is great that this plug-in exists, this is not how I write posts.  I avoid the WordPress editor except when absolutely necessary.  Is it a bad editor?  Probably not.  It just does not fit into my workflow.  I jot down lots of ideas for blog posts in OneNote, in sections for both work and personal topics, and even use it to outline the initial content.  What I was hoping for was a OneNote ribbon button or side panel where I could push the content directly into WordPress once it is completed, much like LiveWriter does.  Yes, there is a method of publishing content from OneNote into WordPress, but it needs Word as the intermediary: it creates a Word document, pushes the content, and keeps track of your blog settings.  This may be because OneNote is very flat from a formatting perspective, and it needs to be.  Word can provide some interesting formatting options required for some blog posts.  However, I don't need a Word file in addition to the OneNote content.

OneNote has a rich set of APIs, as does WordPress.  I think it may be time to take a look and see what can be done to push instead of pull this content.
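As a rough sketch of what that push could look like, the following C# pulls a page's HTML from the OneNote API and creates a draft post through the WordPress REST API.  The OneNote endpoint is the consumer v1.0 API; the wp-json/wp/v2 route assumes the WordPress REST API is enabled on the target site, and the pageId, wpSiteUrl, and wpAuthHeader parameters are placeholders rather than a finished design.

// Rough sketch of the "push" idea: read a page from the OneNote API and create
// a WordPress draft post from it. Parameters are placeholders for illustration.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class OneNoteToWordPress
{
    static async Task PushPageAsync(string oneNoteToken, string pageId,
                                    string wpSiteUrl, string wpAuthHeader)
    {
        using (var client = new HttpClient())
        {
            // 1. Pull the page HTML from OneNote.
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", oneNoteToken);
            string html = await client.GetStringAsync(
                $"https://www.onenote.com/api/v1.0/pages/{pageId}/content");

            // 2. Push it to WordPress as a draft post.
            string body = Newtonsoft.Json.JsonConvert.SerializeObject(
                new { title = "From OneNote", content = html, status = "draft" });
            var post = new StringContent(body, Encoding.UTF8, "application/json");

            client.DefaultRequestHeaders.Clear();
            client.DefaultRequestHeaders.Add("Authorization", wpAuthHeader);
            var response = await client.PostAsync(
                $"{wpSiteUrl}/wp-json/wp/v2/posts", post);
            response.EnsureSuccessStatusCode();
        }
    }
}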

Microsoft HoloLens

One of the activities I anticipated in coming out to Build 2015 was experiencing the HoloLens.  I saw the announcement in January and watched the related video a number of times leading up to Build.  Of course, the hope was to have this as the developer 'give-away', which would be incredible and allow us to build for the device.  And yes, when Alex stated during the keynote that "I have hundreds of these devices", those around me and I grew very excited at what that might mean.  Then the realization set in that hundreds would not be enough given the thousands of us there.  Alex's plan for these hundreds of devices was to provide sessions where attendees could try the devices in a couple of scenarios.  I chose the Hands-on Developer option and within a couple of hours received an email with my acceptance into the Holographic Academy.

Development Experience

The development experience was a full 4.5-hour immersion into the world of holographic development.  I had my own workstation and a HoloLens device attached by USB.  The session followed a script designed to get you comfortable with both the tools and the device.  After a short sample app to get used to the HoloLens, we moved on to recreating the 'Origami' app.  The development takes place equally in Unity and Visual Studio.

Unity is a development tool that is very popular in game development.  It provides a 3D area for placing objects, lights, and cameras, and for defining the various actions for these items.  The assets we needed for our application were provided on our workstations, and it was a matter of placing them in the correct location in the object tree.  This part reminded me a little of XAML development in Blend.  Actions for the objects were defined by creating C# classes and dragging them onto the objects.

The C# code is modified using Visual Studio.  The code is there to look for events or to periodically get the state of the HoloLens view, which reminds me of game or XNA development: see if my world has changed and do something in response to that change.  The lab provided the code for the classes so we could spend our time seeing the implementation on the device.  However, there was flexibility in how we coded certain components, especially the voice recognition.  The lab code included standard phrases, but we were encouraged to include our own text and make the app respond to that.  This worked quite well without any 'voice training' for the device, even with other people around interacting with their own devices.
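For a sense of what that looks like in code, here is a minimal sketch of a Unity script using the KeywordRecognizer from UnityEngine.Windows.Speech, which is the kind of voice API the lab exercised; the phrases and the OnDrop message below are placeholders, not the lab's actual code.

// Sketch of custom voice commands in a Unity script (UnityEngine.Windows.Speech).
// The phrases and the action they trigger are placeholders for illustration.
using UnityEngine;
using UnityEngine.Windows.Speech;

public class SpeechCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    void Start()
    {
        // Any phrases you like can be registered; no per-user voice training needed.
        recognizer = new KeywordRecognizer(new[] { "Drop sphere", "Reset world" });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        if (args.text == "Drop sphere")
        {
            // React to the command, e.g. tell the object to start falling.
            gameObject.SendMessage("OnDrop", SendMessageOptions.DontRequireReceiver);
        }
    }

    void OnDestroy()
    {
        if (recognizer != null) { recognizer.Dispose(); }
    }
}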

After the objects are bound and the code modified, Unity builds a Visual Studio solution and places it in a folder via a custom add-in to Unity, which is not publicly available at this time.  The solution is opened in Visual Studio and the application deployed to the USB-connected HoloLens device.  It is important that the HoloLens is held where the application will be used while it is launched, because the code places the object a specified number of inches in front of the device.  Once deployed, the USB cable can be removed.  Now untethered, you can view the app in 3D while walking around it, as well as moving closer and farther away.  This type of movement around the visual is accentuated by the spatial sound enabled in the device, making the experience more realistic without any of the nausea or motion sickness sometimes experienced with other devices.  I was also able to see and carry on a conversation with members of the HoloLens team present in the room.

The development experience was simplified for this short session, but I think it was enough to get a sense of the techniques needed to make this work.  I am guessing that this type of development is only needed for the immersive, 3D applications.  Universal applications, which are also holographic applications, would just be created in the current development model in Visual Studio.  I would also think that these would appear for placement from some kind of app store interface instead of the deployment scenario described here.  This also raises the question of the storage capabilities of the device, which we did not discuss.  I look forward to getting my hands on a device and seeing what it can do without a script.

OneDrive for Business Development

April was OneDrive for Business month at TriState SharePoint.  ODFB or OD4B, depending on your preference, is the corporate user's cloud storage solution, available as part of Office 365 SharePoint (Sites) or as a stand-alone product.  This is not to be confused with the personal OneDrive you get as part of your Microsoft Account.

For my part of the discussion, I focused on the development hooks available for ODFB.  This blog post describes the information presented in the meeting.  I pulled this information from available resources on MSDN, dev.office.com, and Microsoft Virtual Academy.  I link to these sources in the content of the blog and don't repeat the specifics here.  You can check out the links for the greater details.

I grouped these into three possible options: Provision, Customize, and Develop.

Provision

Much like a MySite in SharePoint, an ODFB site is not created by default.  It is only instantiated when a user navigates to the site for the first time.  This can cause a slight delay while it is being created.  In one-off situations, the wait is not terrible.  If you are launching a new site to your user community, you may want to create those sites ahead of time and save unnecessary calls to the service desk.  Scripts are available in CSOM and PowerShell, and both follow the same pattern:

  • Connect to the SharePoint instance.
  • Get a collection of users.
  • Connect to the User Profile object.
  • Create the ODFB instances in batches of 200 users or fewer.  There is a TechNet article describing the full process with the PowerShell code; a CSOM sketch of the same pattern follows this list.
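Here is a minimal CSOM sketch of that pattern, assuming a tenant admin URL, an admin account, and a small batch of user email addresses; the PowerShell version in the TechNet article makes the same calls.

// Sketch of the CSOM pattern: queue OneDrive for Business site creation for a
// batch of users (200 or fewer per call). Tenant URL, account, password, and
// the user list are placeholders for your own values.
using System.Security;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.UserProfiles;

class ProvisionOd4b
{
    static void Main()
    {
        var emails = new[] { "user1@contoso.com", "user2@contoso.com" }; // <= 200 per batch

        using (var ctx = new ClientContext("https://contoso-admin.sharepoint.com"))
        {
            var password = new SecureString();
            foreach (char c in "password-here") { password.AppendChar(c); }
            ctx.Credentials = new SharePointOnlineCredentials("admin@contoso.com", password);

            // Connect to the profile service and queue the personal site creation.
            ProfileLoader loader = ProfileLoader.GetProfileLoader(ctx);
            loader.CreatePersonalSiteEnqueueBulk(emails);
            ctx.ExecuteQuery();
        }
    }
}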

Customize

What is the first thing everyone wants to do with their SharePoint environments?  Make them not look like SharePoint.  While you can change the master page, this is not the recommended practice.  However, there are ways to change the appearance and content of the ODFB site.

First, why would you want to do this?  You may want to change the colors used on the ODFB site, add a background image, or do other things that make the ODFB site look more like the rest of your custom tenant sites.  All of this is possible with the patterns described here.  Another reason is governance.  There may be content you want to make sure is included with each ODFB site.  This could be sales contract templates for the field sales team or just standard policy documentation.  These patterns can also be used to check and confirm that the content remains on the user's ODFB site.

There are three design patterns available for enabling these changes: the App Part Model, the Scheduled Job Model, and the Pre-create Model.  All of these patterns are described in more detail in this post.  I'll briefly summarize them here.

App Part Model

You cannot deploy applications directly into the ODFB site like you can on a team site.  One way around this is to create an app part that sits on an Office 365 page that is common to your users, like the home page of your intranet site.  When the home page loads, the app part runs and calls a provider-hosted app, which creates an entry in a queue for the user with the information that needs to change on the ODFB site.  A scheduled job checks the queue for work and performs the customizations described in each entry.  This model is needed because the process can be long running and would not be a good idea to run in the provider-hosted app directly.  There is a great solution included in the Patterns and Practices repository on the OfficeDev GitHub.  It is complex, as it is a complete solution, including console apps to apply and reset the customizations.  There is also a Channel 9 video walking through the project.
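The PnP solution is far richer than this, but a minimal sketch of the queue hand-off step could look like the following, using the Azure Storage SDK; the queue name, connection string, and payload shape are assumptions for illustration, not the solution's actual contract.

// Sketch of the queue hand-off: the provider-hosted app records a work item,
// and the scheduled job dequeues it later and applies the ODFB customizations.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

class Od4bCustomizationQueue
{
    public static void Enqueue(string storageConnectionString, string userSiteUrl)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference("od4b-customization");
        queue.CreateIfNotExists();

        // One message per user; the scheduled job picks these up and applies
        // the branding/content changes to the user's ODFB site.
        string payload = JsonConvert.SerializeObject(new { SiteUrl = userSiteUrl });
        queue.AddMessage(new CloudQueueMessage(payload));
    }
}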

Scheduled Job

A scheduled job is nothing more than some code, usually an executable like a console application, that gets executed on a recurring basis by some process.  In SharePoint, these would be Timer Jobs, which are not available in Office 365.  As an IT Pro, you might think of these as Windows Task Scheduler jobs that run on a server.  In Azure, the equivalent is a WebJob.  The WebJob runs within an Azure website and executes the program on a defined interval.  Regardless of how you architect it, the executable portion of the scheduled job runs code that queries Office 365 for new ODFB sites.  When it finds them, it applies the customizations defined in the code.  The potential limitation with this approach is that people may get to the site before the customization is applied, as search does take time to populate with the new sites.
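A bare-bones sketch of that executable is shown below: it uses SharePoint search to find personal sites (the SPSPERS web template) and applies a customization to each one.  The query text, theme URL, account, and password are placeholders, and the real PnP samples add the state tracking and error handling a production job needs.

// Bare-bones scheduled job: find ODFB (personal) sites via search and apply a
// customization. All URLs and credentials below are placeholders.
using System.Security;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.Search.Query;

class Od4bCustomizerJob
{
    static void Main()
    {
        var password = new SecureString();
        foreach (char c in "password-here") { password.AppendChar(c); }
        var creds = new SharePointOnlineCredentials("admin@contoso.com", password);

        using (var ctx = new ClientContext("https://contoso.sharepoint.com"))
        {
            ctx.Credentials = creds;

            // Personal (ODFB) sites surface in search with the SPSPERS web template.
            var query = new KeywordQuery(ctx)
            {
                QueryText = "contentclass:STS_Site AND WebTemplate:SPSPERS"
            };
            var results = new SearchExecutor(ctx).ExecuteQuery(query);
            ctx.ExecuteQuery();

            foreach (var row in results.Value[0].ResultRows)
            {
                string siteUrl = row["Path"].ToString();
                using (var siteCtx = new ClientContext(siteUrl))
                {
                    siteCtx.Credentials = creds;
                    // Example customization: apply a composed look / color palette.
                    siteCtx.Web.ApplyTheme(
                        "/_catalogs/theme/15/custom.spcolor", null, null, false);
                    siteCtx.ExecuteQuery();
                }
            }
        }
    }
}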

Pre-create Model

I mentioned the provisioning options available in the first part of this post.  It is possible to include the code for customizing the ODFB site at the time it is created.  This provides a simple way of getting the sites created and the customizations applied from the start.  Unlike the other two models, creating sites for additional users and resetting user changes is not easily handled in this model.

Develop

In October 2014, the Office Dev Team RTM'd the Office 365 APIs.  These APIs were designed to make the consumption of Office 365 services easy for the developer and to make them available on every major development platform, including iOS, Android and PHP.  If you are interested in this type of development, start by going to dev.office.com.  It has all the links for training, code samples, and the tool I showed at the end of my talk, the API Sandbox.  This site offers an easy way to try getting data from any of the services exposed directly through the APIs: Calendar, Mail, Contacts and Files.  There are both client library and REST samples, written in C# and JavaScript.
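As an example of the REST flavor shown in the API Sandbox, here is a minimal C# sketch that lists the files in the signed-in user's ODFB through the Office 365 Files API; the contoso-my.sharepoint.com URL is illustrative only, since the real endpoint comes back from the Discovery Service and the token must have been issued for that resource.

// Sketch of a REST call to the Office 365 Files API for the user's ODFB library.
// The hard-coded MyFiles URL is illustrative of the v1.0 shape, not a fixed value.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class FilesApiExample
{
    static async Task<string> ListFilesAsync(string accessToken)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));

            // Returns the items in the root of the signed-in user's OneDrive for Business.
            return await client.GetStringAsync(
                "https://contoso-my.sharepoint.com/_api/v1.0/me/files");
        }
    }
}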

As for the actual project sample, I stepped through the lab project associated with the Microsoft Virtual Academy course, Deep Dive: Integrate Office 365 APIs in Your Web Apps.  Section 3 focuses on ODFB integration and includes a lab in which you build a web site with a page showing the files in the user's ODFB.  The code for this lab is available here.  If you clone this repo, move the project to a local folder and open it from there.  The folder names in the repo are too long and prevent NuGet from updating the packages.  Even with that move, I still had issues with the OWIN package.  I had to remove it and its dependencies using the uninstall-package command and then reinstall them.  This was a little troubling, but it did solve my build issues.

That’s all for the recap.  Ping me on twitter or leave a comment if you have any questions.

Office 365 and Yammer Administrators

I recently completed a Philly MTC Tech Talk on Yammer Administration.  You can see the video here.  This talk was mainly about the Yammer administrative tools available in an Enterprise version of Yammer.  I did start out with some steps on enabling your Yammer Enterprise instance.  These steps are all listed on the Activation pages of http://success.yammer.com site.

Life will be easier for you if you follow the steps and assign a Global Administrator who does not have a generic username prior to activation.  Yammer only assigns named users from the O365 Global Administrator group as Verified Admins in Yammer, so the default MOD Administrator account, with the username 'admin', is not listed as an admin in Yammer.  That is why you need to identify and assign one as part of the steps in provisioning the Enterprise instance.

I have found that you can do this after the fact.  I had a previous demo instance where I did not follow the order exactly, and this resulted in not being able to see the admin menu.  I removed and re-added my named user to the Global Administrator group, and they then had access to the Yammer Admin area.

It is important to note that any Global Administrators you assign in O365 are automatically Verified Administrators in Yammer.  This means they have access to all data in the Yammer environment, public and private.  They can also export data, manage users and manage integrations.  You might expect the GAs to need access to the latter functions, but the viewing of private data can be a concern for some networks.

Often administrators will have two accounts: their normal domain account and an admin account with the additional permissions an admin requires.  Given the current limitation with assigning administrator permissions to accounts beginning with 'admin', you can create Global Administrator accounts in the format 'admin-username'.  Creating an account this way will not add it as a Verified Administrator in Yammer.  This is not a solution, but for some networks it may be enough.  There does need to be a better solution going forward.

If you need to create Network Admins, this is done on the Admin tab of the Admin area.  Enter the email address of the person, select the appropriate permission and click the Submit button.  Network Admins can also be promoted to Verified Admin in the same area.  Yammer admins assigned through the Global Administrator group can only be managed in the O365 Admin area for Users and Groups.  You can see the differences in the screenshot below.

[Screenshot: Current Admins]

As always, put your questions below or find me on twitter.

Yammer: Accounts and External Networks

Typically, access to Yammer comes from the account associated with your company’s Yammer instance.  This is your company email address and the domain of that email address is the name of your Yammer Home Network.  For example, sign in with me@mycompany.com and your home network is mycompany.com.  You can see this in the url as it will be https://www.yammer.com/mycompany.com/.

In addition to the home network, you can participate in external networks.  An external network provides the same features as your home network but includes people outside the mycompany.com domain.  These people can be invited by someone from the home network, or if the network is public, a user can request access.

There are a number of great reasons to participate in external networks.  Microsoft and Yammer teams use these heavily to collect feedback, provide support and share information with the community.  External networks are also a great way to connect with your customers by creating your company's own separate external networks for project engagements or events like tradeshows and conferences.  In general, any use case where you want to securely collaborate with people outside of your corporate home network is a good reason to explore external networks.

Membership in an external network is tied to the account used to request access.  In the typical case, this is your home network account: you are logged into the mycompany.com home network and then request access to an external network.  The approval is associated with your mycompany.com account, and this works well.

A problem arises if you leave mycompany.com.  As part of its offboarding process, mycompany.com disables your domain account, which disables your Yammer account if Directory Synchronization is configured; if it is not, someone manually starts the disable process for your account in Yammer.  You now no longer have access to the home network, mycompany.com.  You also no longer have access to any external networks associated with that account.

You can see this by accessing an external network directly through its URL.  I recently changed companies, and my previous account had access to the Office 365 IT Pro Network.  The direct URL to this network is https://www.yammer.com/itpronetwork/.  Accessing this URL with my previous company's Yammer credentials gives the following notice.

[Screenshot: O365 Network Notice]

So as a Yammer user, what can you do?  What I am now doing when I need access to external networks is to make the request using my personal email, which happens to be associated with my Microsoft Account, although any personal email account seems to work.  This is a good fix for those who are at a company that is not using Yammer (at least not yet!).  I would also only use this for access to external networks that go beyond employment at any one company, like the Office 365 Technical Network or the Yammer Developer Network.

There is a downside to this if you are also using your corporate account to access your company's Yammer network.  You will need to log out and log in to switch between corporate and external networks, as the browser can only hold one Yammer connection at a time.  You can keep multiple Yammer connections open by using normal and private mode in the same browser, or by connecting with two different browsers, like IE and Chrome.

What I would like to see is a change to the Yammer account model where it is based on my Microsoft Account, so it provides access to the external networks I have associated with it regardless of where I am employed.  This new Yammer account would also have one home network, tied to my corporate account.  This way, if I change companies, external access remains, but the home network is blocked until I enter a new set of credentials for a home network.  Just a thought.

Tweet me or post any questions you have here.

June 2014 Schedule

June is Code Camp month! Lots of preparation still underway.

Here is the list of events I plan to attend for June.

Thursday, June 5: Philly Game Works (Microsoft – Malvern, PA) – I have not been able to get out to this new group yet. They meet once a month; you can check out their site and find them on Meetup if you are interested in attending.

Tuesday, June 10: TriState SharePoint User Group (Microsoft – Malvern, PA) – I am taking over this meeting. I am presenting on SharePoint Social. A separate blog post describing the talk will go up before the end of the week. In addition to doing the main talk, I am also doing the 'On SharePoint Development' session, where we will look at how to develop against the Yammer API.

Saturday and Sunday, June 21 & 22: Philly .net Code Camp (Valley Forge Casino – Valley Forge, PA) – My session is scheduled for 1:30 on Saturday, but the topic will likely change. I will be around the rest of the weekend as well. There is still room if you are interested in attending. There is a great list of Microsoft community and local speakers presenting some great content. (How did I get in here?)

It is a light month otherwise, as the Philly .net team is focused on Code Camp, so there are no monthly meetings. They will pick up again in July.

Yammer Analytics – Basic Reports

New post over on Perficient's Microsoft blog about accessing and understanding Yammer's basic reports.  This is part 1 of 3 leading up to my SPSPhilly presentation on Yammer Analytics.

Access Services in SharePoint 2013

This is a brief recap of my Philly .net Code Camp 2013.1 presentation.

Detailed information about the requirements and configuration of Access Services can be found in this TechNet wiki. It is important to note that this functionality requires the following versions of the related applications.

  • Access 2013
  • SharePoint 2013 Enterprise Edition
  • SQL Server 2012

Access 2013 still provides the classic 'Desktop' databases we have built over the past 20 years. The new version also includes a 'Web App' database, which creates a SharePoint-hosted app for the web front-end and a SQL Server database for storing the content. This is a huge change from the 2010 version of Access Services, where the tables were converted to SharePoint lists. SP lists provided a better scalability story than the traditional desktop database, but still had the list throttling limitations inherent in a SP list. Moving the table content to SQL Server addresses both the scalability and size issues by providing a true relational database engine to support the data. The use of SQL Server is completely transparent to the user, as this communication is handled by Access Services communicating directly with SQL Server.

Access 2013 also provides some templates to get the process started. Many of the templates come in both desktop and web app versions. This is important to note, as there is no switching from one version to the other after the template is chosen. There is also no upgrade path from previous versions of Access databases into the Web App database. This may be one reason why SharePoint 2013 still contains a legacy service for hosting Access 2010 web databases that come over in a content migration. Also, desktop databases can contain VBA code-behind. So while the Access database artifacts all have complementary SQL Server objects, there is no analogous object for VBA code in Access 2013 Web Apps. This presents one of the limitations of the current implementation: changes are more configuration than customization via code. I think this is an area that will be improved in future releases, most likely by allowing JavaScript and CSS changes as opposed to C#/VB code-behind files.

Let's talk about the SQL Server component. As mentioned, this must be a SQL Server 2012 database. It is also recommended that Access Services use a SQL Server instance separate from the SharePoint farm instance. This provides isolation for your Access Web App databases in case you need to connect to them through other means. A packaged web app deployed to the SharePoint Apps site creates an instance-specific database each time the app is added to a site, so these databases can multiply quickly. The Configuration Wizard will use the existing SharePoint farm database instance as the location for storing Access Web App databases. This can be managed on the Access Services page under Manage Service Applications in Central Admin. It is important to note that the databases created for the Access Web Apps are not included in any of the SharePoint farm backups. These databases must be included in your disaster recovery scenarios to ensure this data is protected.

On the SharePoint side, the custom web app, once deployed to the app store and added to a site, takes on the properties of the host site. SharePoint permissions work on the Access Web App just like they would on a SharePoint list. Any site-specific branding is also applied when viewing the Access Web App.

There is more coming on this subject as we had some good discussion and interesting questions raised during the session.