Why did Microsoft buy LinkedIn? Because it’s the single best way to understand the modern workplace and how people get their jobs done, and ‘reinventing productivity’ is the first item in Microsoft’s mission statement.

Some of that knowledge about work is informing Microsoft’s vision for the future of Office, and offices generally, along with new devices, sensors, smart speakers, AI and search tools. The latter will apply what the company has learned from building Bing to the age-old question of why it’s easier to find just about anything on the web than the information you need about what your own company is doing.

The vision all came together in the Build 2018 keynote, in a demo Satya Nadella called “the future of modern meetings”. This used the kind of hardware PC makers are launching at Computex as Windows Collaboration Displays, plus smart speakers for the office that Microsoft hopes partners will build using its Speech Devices SDK and recently unified Speech service. Together, they recognised who was in the meeting room, transcribed what everyone said, translated it into Chinese, and showed the ‘action items’ people committed to in a list on the big screen (as well as adding them to everyone’s schedules, with reminders from Cortana).

In the past, Microsoft has talked about self-contained ‘centre of the room’ devices like Skype Room Systems, separate from the interactive whiteboard hanging on the wall (which Microsoft usually positions as a device for the ‘front of the room’ or a hallway conversation). Now they work together, and both rely on Microsoft services like Office 365, Azure Active Directory and the ever-growing list of Cognitive Services that can recognise faces, speech and even the concepts people are talking about.


The prototype smart speaker looked like an updated version of the RoundTable video conferencing phone that Microsoft built and then handed over to Polycom. It uses noise suppression, far-field voice pickup and beamforming to improve speech recognition when multiple people are talking in the room. It turns 360-degree video into a panorama with everyone in the room named, because they were recognised by facial recognition and matched to the Azure AD accounts that received the Outlook meeting invitation.
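Beamforming itself is a well-understood signal-processing technique. A minimal delay-and-sum sketch in Python with NumPy (my illustration, not Microsoft’s implementation) aligns each microphone channel for a chosen look direction and averages the result, so sound arriving from that direction adds up while noise from elsewhere is diluted:

```python
import numpy as np

def delay_and_sum(signals, mic_positions, direction, sample_rate, speed_of_sound=343.0):
    """Steer a microphone array toward `direction` (a unit vector) by
    delaying each channel so the wavefront lines up, then averaging.
    Averaging reinforces sound from that direction and dilutes the rest."""
    signals = np.asarray(signals, dtype=float)           # shape: (mics, samples)
    positions = np.asarray(mic_positions, dtype=float)   # shape: (mics, 3), metres
    # Arrival-time offset of the wavefront at each microphone, in seconds.
    delays = positions @ np.asarray(direction, dtype=float) / speed_of_sound
    shifts = np.round(delays * sample_rate).astype(int)  # offsets in samples
    output = np.zeros(signals.shape[1])
    for channel, shift in zip(signals, shifts):
        output += np.roll(channel, -shift)               # undo each mic's delay
    return output / len(signals)
```

A production device would combine this with adaptive noise suppression and echo cancellation; the delay-and-sum step above is just the core idea.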

The combination of speaker recognition and facial recognition means the transcription shows who said what, which makes it easier for remote participants to keep track of what’s going on and lets anyone who came in late (or missed the meeting completely) catch up. In the Build demo it was also used for live translation into Chinese; you can try translation today in Skype, or by chatting on Microsoft’s free translation site. The PowerPoint Presentation Translator already offers both transcription and translation in multiple languages, using custom speech recognition primed with the text of your slides.
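The attribution step is essentially a join between diarised speech segments and the identities resolved from faces and the meeting invite. A minimal sketch (the names and data structures here are hypothetical, not Microsoft’s API):

```python
def attribute_transcript(segments, identities):
    """Join diarised transcript segments (speaker_id, text) to display
    names resolved by facial recognition and the Azure AD invite list.
    Speakers with no resolved identity get a generic fallback label."""
    return [
        (identities.get(speaker_id, f"Speaker {speaker_id}"), text)
        for speaker_id, text in segments
    ]
```

In a real pipeline, `segments` would come from a speech-to-text service with speaker diarisation, and `identities` from a face-recognition lookup against the people on the Outlook invitation.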

The custom speech recognition in the demo also extended to company-specific product names and acronyms, which machine learning extracts from emails and Office documents and adds to the Microsoft Graph to make recognition more accurate. Word’s new Acronyms feature will start to show acronym definitions specific to your organization in a pane next to your document later this year (if you’re a commercial Office 365 user).
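Mining a corpus for organisation-specific vocabulary can be approximated very crudely. This sketch (a toy heuristic, in no way how the Microsoft Graph builds its model) just counts capitalised tokens that recur across documents:

```python
import re
from collections import Counter

def extract_acronyms(documents, min_count=2):
    """Pull candidate acronyms (runs of 2-6 capital letters) out of a
    document corpus, keeping only terms that recur often enough to look
    organisation-specific rather than incidental."""
    counts = Counter()
    for doc in documents:
        counts.update(re.findall(r"\b[A-Z]{2,6}\b", doc))
    return {term for term, n in counts.items() if n >= min_count}
```

A real system would also learn expansions and context; the point here is only that recurring in-house terms can be harvested and fed to a custom speech model as a phrase list.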

If there are relevant documents for a meeting, Cortana promises to find them automatically. That’s an extension of the Delve intelligent search and discovery in Office 365, which looks across OneDrive, SharePoint and Exchange to find documents created by co-workers that you have permission to view and that are relevant to what you’re working on. Cortana will even look in documents you wrote yourself, because you might have already said what you need.

Today you can see a general list of documents Delve thinks will be useful to you in the mobile SharePoint app, but having that list pop up as part of a meeting request, or being able to search from inside a document you’re working on, saves you from having to remember to go and check whether there’s anything you should include. With so many new tools, though, it’s easy to lose track of what they can do and just keep working the way you always have.

Using offices better

The Build demo started by asking Cortana to schedule a meeting and find a meeting room with the right equipment. You can already sign up for a preview of this, cc’ing Cortana on a group email asking people when they can meet (it uses either the free/busy information in Exchange servers, or a series of emails that Cortana sends out and reads the replies to). Microsoft’s internal version can find rooms with specific hardware, like a Surface Hub. Microsoft will even try to get you there on time: Outlook on iOS (and soon on Android) offers the same ‘time to leave’ reminders for meetings that Cortana can give you on Windows.
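Behind a scheduling assistant like this sits a simple idea: intersect everyone’s free/busy intervals and take the first gap long enough for the meeting. A sketch of that core step (an illustration only, not Cortana’s actual algorithm):

```python
from datetime import datetime, timedelta

def first_common_slot(busy_by_person, day_start, day_end, duration):
    """Scan the working day in 30-minute steps and return the start of the
    first slot of `duration` that overlaps nobody's busy intervals.
    `busy_by_person` maps a name to a list of (start, end) datetimes."""
    step = timedelta(minutes=30)
    start = day_start
    while start + duration <= day_end:
        end = start + duration
        clash = any(
            start < busy_end and end > busy_start  # standard interval overlap test
            for intervals in busy_by_person.values()
            for busy_start, busy_end in intervals
        )
        if not clash:
            return start
        start += step
    return None  # no common slot that day
```

A real assistant would pull the busy intervals from each attendee’s calendar and fall back to negotiating by email, as the article describes, when free/busy data isn’t available.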

Expect multiple systems like this. Office furniture supplier Steelcase used Azure IoT and machine-learning services to build its own Find app, which customers use to book rooms, because the 20 minutes it typically takes to find a meeting room for three or four people is such a waste of time. The app uses passive infrared sensors to find rooms that are the right size, have the right equipment, and are actually empty (even if the calendar says they’re booked). If you’ve put confidential documents in the meeting agenda, it looks for a room that’s private; if there are two possibilities, it books both so you can decide yourself. Instead of making you rush out at the end of your slot because people for the next meeting are queuing up outside, the app warns you 15 minutes before you’re due to leave, so you can book more time or move to another room and carry on. It also asks everyone in the meeting to rate the room (did the technology work? Did you have enough privacy?) and passes that feedback to the facilities team.
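The matching logic described here amounts to filtering rooms on capacity, equipment, live sensor occupancy and privacy, then offering up to two candidates. A sketch under those assumptions (the field names are invented for illustration, not Steelcase’s API):

```python
def candidate_rooms(rooms, attendees, needs_privacy, required_equipment):
    """Filter rooms by capacity, equipment, live sensor occupancy and
    privacy, then return up to two candidates to choose between."""
    matches = [
        room for room in rooms
        if room["capacity"] >= attendees
        and room["sensor_empty"]                        # PIR says nobody is in it
        and required_equipment <= set(room["equipment"])
        and (room["private"] or not needs_privacy)
    ]
    return matches[:2]
```

Trusting the occupancy sensor over the calendar is the interesting design choice: it lets the app offer rooms that are booked on paper but empty in practice.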


If the problem is that there are never any meeting rooms free, Steelcase can put wireless sensors in rooms to help determine whether they’re being used efficiently, or whether everyone just goes to the nearest coffee shop instead because the rooms aren’t set up well. Its Workplace Advisor tool compares sensor data showing how rooms are actually used with the meetings booked in them, combining that with information about energy use and room layout before running machine learning over it.
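Before any machine learning runs, the core comparison is just booked hours versus sensed hours per room. A sketch of that step (my simplification, not the Workplace Advisor’s actual model):

```python
def room_utilisation(booked_hours, sensed_hours, ghost_threshold=0.25):
    """Compare calendar bookings with sensor-detected occupancy per room,
    flagging 'ghost' rooms that are reserved but rarely actually used.
    Both inputs map a room name to total hours over the same period."""
    report = {}
    for room, booked in booked_hours.items():
        sensed = sensed_hours.get(room, 0.0)
        ratio = sensed / booked if booked else 0.0
        report[room] = {
            "booked_hours": booked,
            "sensed_hours": sensed,
            "ghost_booking": booked > 0 and ratio < ghost_threshold,
        }
    return report
```

Flagged rooms are exactly the anomaly the next paragraph describes: two identical-looking rooms where one sits booked but unused.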

One office had two apparently identical rooms; one was used for four to five hours a day, the other for just 24 minutes a week. The lights were the same, the chairs were the same — but one room had the desk placed so your back was to the door and no-one felt comfortable working in it. Another customer found that their big video conference room was mostly being used for video conferences by large groups who booked it in advance, but the sales team were also using it as a very expensive phone booth so they could pace up and down while they made phone calls. (The solution to that was to put a treadmill in a smaller room.)

It’s possible that Microsoft will create a smart speaker to pair with the Surface Hub and deliver AI-powered meeting productivity, but it’s more likely that partners such as Polycom will build the devices. We’ll see individual pieces of this in Office 365 (especially in Microsoft Teams) and Dynamics, because the Microsoft Graph connects users to their jobs, the documents they create, their devices, and all the other pieces of information that AI can mine to be helpful.

But by the time you use these features, they might look less like the grand vision of the keynote and more like the way Office already uses AI to filter email, design presentations, check your spelling and grammar, tell you when figures in a spreadsheet look unusually good or bad and generally help you get things done. Because if Microsoft really wants us to use these new productivity features, they have to show up in the tools people already use.
