By AI Trends Staff
At a recent Build for Developers conference, Microsoft unveiled the first features in a commercial product powered by GPT-3, a natural language model developed by OpenAI, a Microsoft-supported artificial intelligence research laboratory.
The beta version of GPT-3, released in June 2020, has 175 billion machine learning parameters, making it the largest deep learning language model built to date. The largest language model available before GPT-3 was Microsoft’s Turing NLG, with 17 billion parameters.
In September 2020, Microsoft entered into an agreement with OpenAI to license GPT-3 for its own products and services. In July 2019, Microsoft and OpenAI had announced a partnership involving a $1 billion investment from Microsoft in work that included building a supercomputer on Microsoft’s Azure cloud platform, according to InfoQ.
Originally a non-profit, OpenAI switched to a hybrid “capped-profit” model in March 2019 with the intention of raising investment capital. Microsoft has called its relationship with OpenAI “exclusive.”
A “low-code” tool for “citizen developers”
At Build, Microsoft announced that GPT-3 is being integrated into Microsoft Power Apps, which it describes as a “low-code” application development platform designed for users with little or no coding experience. These users are also called “citizen developers,” meaning they create application features for themselves or others, work outside the IT department, and often report to a business unit.
For example, new AI-based features allow an employee building an e-commerce application to describe a programming goal in conversational language, such as “find products where the name starts with ‘children’.” The fine-tuned GPT-3 model then offers choices for translating the request into formulas in the open source Power Fx programming language, according to the Microsoft AI blog.
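As a rough sketch of what such a translation might produce (the table and column names here are illustrative, not from Microsoft’s demo), the request could map to a Power Fx formula along these lines:

```
// Hypothetical Power Fx output for the request above.
// "Products" and "Name" are assumed names for the app's data table and column.
// Returns the rows of Products whose Name column starts with "children".
Filter(Products, StartsWith(Name, "children"))
```

Filter and StartsWith are standard Power Fx functions; the point of the feature is that the user gets a formula like this suggested rather than having to compose it by hand.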
Microsoft runs GPT-3 in Azure using Azure Machine Learning. Power Fx itself is based on the formula language of Microsoft Excel.
“Using an advanced AI model like this can help make our low-code tools available to an even larger audience by truly becoming what we call no code,” said Charles Lamanna, corporate vice president for Microsoft’s low-code application platform.
He noted that Microsoft’s agreement with OpenAI allows it to license the code behind the GPT-3 model, which lets it integrate the technology directly into its products. “This allows people to query and explore information in ways they literally couldn’t do before,” Lamanna noted.
The new features tapping GPT-3 allow a user to type a request in plain language, such as “Show 10 orders with stroller in the brand name, sorted by date of purchase with the latest at the top,” and have it translated into a rather complex formula.
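To give a sense of the complexity involved, a Power Fx formula for that request might look something like the following sketch (the table and column names — Orders, BrandName, PurchaseDate — are assumptions for illustration, not Microsoft’s actual demo output):

```
// Hypothetical Power Fx formula for the plain-language request above:
// filter Orders to rows whose BrandName contains "stroller",
// sort by PurchaseDate with the newest first, and keep the top 10.
FirstN(
    Sort(
        Filter(Orders, "stroller" in BrandName),
        PurchaseDate,
        SortOrder.Descending
    ),
    10
)
```

Composing nested Filter, Sort, and FirstN calls like this is exactly the kind of task the feature aims to spare the citizen developer.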
The user still needs an understanding of the logic being implemented; the features are designed to help users of the Power Fx programming language choose the right formulas to get the results they need, according to Microsoft. The new features announced at Microsoft Build will, the company says, be available in preview in English in North America by the end of June.
Microsoft aims to expand the range of users who use the tool. “It’s not about replacing developers at all, it’s about finding the next 100 million developers in the world,” Lamanna said.
Risks of large language models
Large language models learn patterns of language by scraping essentially all the text available on the Internet. In the process, the models pick up sexist, racist, and otherwise offensive language. “The text they produce can be toxic in unexpected ways,” a recent account in The Verge noted.
Microsoft has built in safeguards designed to minimize the risks of GPT-3, “but the core of the program is still based on online language models, which means it retains this potential for toxicity and bias,” The Verge account suggested.
In an interview with The Verge, Microsoft’s Lamanna said the company is working to address this risk, for example by introducing a list of words and phrases to which the system will not respond. Balancing safe use of the program against restrictions on its functionality is a challenging tradeoff. “Like any filter, it’s not perfect,” Lamanna noted.
Users need to validate all formulas written by the AI, he said. “A person decides to inject an expression. We never inject an expression automatically,” he said.
The GPT-3 license was seen as an advantage for Microsoft
The worlds of scientific research and applied AI collide in Microsoft’s effort to commercialize GPT-3, suggests a TechTalks account written by site founder Ben Dickson.
“There is a clear line between academic research and commercial product development. The goal of academic AI research is to push the boundaries of science. That’s exactly what GPT-3 did,” Dickson wrote. In commercial product development, “you have to solve a specific problem, solve it ten times better than the incumbents, and be able to run it at scale and cost-effectively.”
The OpenAI–Microsoft alliance makes sense for both companies. “OpenAI would have a hard time finding a way to enter existing markets or create new markets for GPT-3,” Dickson noted. “Microsoft, on the other hand, already has the channels needed to lead OpenAI to profitability.”
Microsoft’s reach spans industries, thousands of organizations, and millions of users of Office, Teams, Dynamics, and Power Apps. “These applications provide complete platforms for integrating GPT-3,” Dickson noted. Microsoft’s exclusive access to the GPT-3 code and architecture gives it an advantage over its competitors. “Whatever business use case is found for GPT-3, Microsoft will be able to do it faster, cheaper, and with better accuracy because it has exclusive access to the language model,” Dickson noted.