James Dartnell reports from Microsoft’s world-renowned Washington state headquarters, where the firm’s 1,000-person research organisation is now turning its attention to artificial intelligence in a bid to transform the ways that computers can understand physical objects and human beings.
Fir trees aren’t all that grows in Washington state. For the last 25 years, the seeds of life-changing technological ideas have been sown at Microsoft’s global headquarters in Redmond, a suburb of neighbouring Seattle. Redmond may not boast the golden sunshine or glamour of Silicon Valley – you’re much more likely to find incessant showers and grey skies – but the brilliant minds behind cutting-edge technological innovation are densely packed in a town surrounded by picturesque lakes and mountains.
Microsoft co-founder Bill Gates supposedly insisted on constructing X-shaped buildings at Redmond’s Microsoft Campus, with the intention of providing all employees with access to windows – the glass variety – and that kind of liberating thinking is what continues to drive the firm’s research and development work today.
Joining Microsoft in 1997, Rico Malvar is now chief scientist of Microsoft Research, and he appreciates the value that can result from allowing over 1,000 highly intelligent minds to investigate the possibilities of what can be achieved with technology. “Philosophically, it’s very important that we can focus on independent research, rather than just doing research that is driven by business,” Malvar says. “The first mission statement of Microsoft is not about doing something for the company, but instead about doing something for technology.”
Microsoft Research has 60 research groups that feature employees who are qualified across a spectrum of specialisations, not all of which are technology-heavy. “We have people with PhDs in areas like sociology, psychology and medicine,” Malvar says. “Understanding these things is crucial to the user experience in technology.”
The flexible culture in Microsoft’s research projects splits them into four main categories: mission-focused, evolutionary, disruptive and ‘blue-sky’, and these quadrants dictate their level of ambition and criticality. “That’s not to say that we spend 25 percent of our time and resources on each level, but you need to put a certain percentage of your investment into blue-sky projects, because good things can come out of them,” Malvar says. “If you spend 0 percent of your time and money on the blue-sky quadrant, you will definitely miss opportunities.”
For a company that retains its status across the world as a powerhouse in platforms and software – last year turning in revenues of over $85 billion – a surprisingly low portion of Microsoft’s research projects actually come to fruition. “One metric of success is not just how successful the projects are,” Malvar says. “We don’t worry too much if a project actually sees the light of day as a new product or service, or as part of another one. The percentage of projects that succeed is very small – most actually fail, as research is a risky business. We do not even reach 10 percent of projects that are successful. However, the few that succeed make a huge difference.”
Malvar adds that the metric that truly counts is how many products Microsoft’s research efforts can actually contribute to. “What really matters is the portion of Microsoft products and services that are the result of Microsoft Research – for that metric, we expect the number to be 100 percent,” he says.
Throughout Steve Ballmer’s 14-year tenure as Microsoft CEO, the firm stepped up its efforts to shift from a “know-it-all” to a “learn-it-all” culture, and that ethos is, in some respects, also reflected in the technology the company produces.
For some time, Microsoft had realised that embedded intelligence would be a revolutionary force for technology in the coming years, and decided to take action. In September 2016, the firm established its AI and Research Group, the new organisation that is led by executive vice president Harry Shum and now employs over 6,000 people. “When we started doing work with machine learning algorithms, it was completely theoretical, but investing in intelligence is a big deal now,” Malvar says. “Soon, there won’t be a space for apps, systems and devices that are not intelligent.”
Speech recognition was one of the first areas that Microsoft Research focused on. “In the early 90s, it didn’t work,” Malvar says. “The error rate started off at around 40 percent, and we hit below the 30 percent mark in the early 2000s. From 2000-2010 there was very little progress. Then deep neural networks came in, and we had the necessary computational power by then to reach an 8 percent level, which effectively allows you to recognise conversations with the use of contextual analysis.”
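The percentages Malvar cites are word error rates, the standard speech-recognition metric: the number of word substitutions, insertions and deletions needed to turn the recogniser’s output into the reference transcript, divided by the length of the reference. A minimal sketch of the computation (the example sentences below are invented, not taken from Microsoft’s benchmarks):

```python
def word_error_rate(reference, hypothesis):
    """Word-level edit (Levenshtein) distance divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming table: d[i][j] is the edit distance between
    # the first i reference words and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# Two substitutions out of five reference words -> 40 percent error rate
print(word_error_rate("book me a haircut tomorrow",
                      "book me hair cut tomorrow"))  # 0.4
```

Driving this figure from roughly 40 percent down to single digits is what made conversational interfaces such as Cortana practical.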
Microsoft has now achieved a 5.9 percent error rate, and speech recognition is set to play a crucial role in its Cortana voice assistant, as well as being deployed in Microsoft’s real-time Skype translator. Voice-driven services are a top priority for research teams, and Microsoft is developing ways for user requests to be fulfilled with minimal friction. “The way computing works is changing,” Malvar says. “If I tell a computer that I want a haircut, the computer passes this information to a service provider, whose bot responds in English and offers a timeslot for the haircut.”
Motion sensors are another key focus within the AI remit. Malvar and Microsoft are currently working with Kinect sensors to discover a range of possibilities within the enterprise. “They’re already good at detecting body parts, and the motion of your fingers,” he says. “This opens the possibility of controlling a computer from a distance. You’ll soon be able to do touch-less touch.”
Quantum computing has become another long-term focus area for the company. “We don’t know if it will work in five or 10 years’ time, and if it does, it’ll be big,” he says. “You can do so many things, like break the cryptography behind the http protocol, so you can basically break the Internet as it is today.”
On a broader scale, the firm is undertaking huge research projects that will allow it to grow its cloud business, both in terms of the capacity it can offer and the intelligence that its cloud services can provide. “With Moore’s law ending due to economic reasons, the world needs new architectures,” Malvar says. “To deliver intelligence in its extreme capabilities, you need enormous computation, which means you need to be in the cloud, and you can’t have a super-intelligent cloud unless it’s very powerful. We spend so much money on data centres – we’re constructing buildings that are full of data centres, and campuses of buildings so that we can design our own computers.”
Enter Project Catapult. Developed at Microsoft Research, Catapult has homed in on a chipset that the company can use within its own data centres to provide computing power for intelligent cloud services. FPGA (field-programmable gate array) chips are the result of Catapult; they can deliver up to a 50-fold computational gain, and currently run on more than 100,000 Azure and Bing server blades. A recent demo showed that Microsoft’s FPGA-powered supercomputers could translate all of Wikipedia from English to Spanish in 0.1 seconds. Malvar realises the potential of what else could be achieved with such technology. “If I have a problem in scientific discovery, or the Internet of Things, I can analyse an enormous volume of data in an instant,” he says.
With the management of an unprecedented volume of data becoming a top priority for the world’s tech giants, Microsoft is looking for ways to deploy undersea data centres en masse. Data centres currently account for 2 percent of total power consumption in the United States each year, and are heavily dependent on water supplies for cooling. Constraints on power grid availability, along with the need to place racks close to the world’s “coastal civilisation”, have led Microsoft to look beneath the surface. Project Natick is the firm’s research project to determine the feasibility of undersea data centres, and saw its first data centre pod deployed in August 2015. The new data centres purport to offer faster and more agile deployment, lower total cost of ownership and higher reliability than land-based data centres. It seems only fitting that Microsoft, housed in rainy Redmond, seeks its next game-changing project underwater.