GitHub and OpenAI have launched a technical preview of a new AI tool called Copilot. It lives inside the Visual Studio Code editor and autocompletes code snippets.

According to GitHub, Copilot does much more than parrot back code it has seen before. Instead, it analyzes the code you have already written and generates matching new code, including the specific functions that were previously called. The project's website includes examples of it automatically writing code to import tweets and draw scatterplots. According to GitHub CEO Nat Friedman, it works best with Python and TypeScript.

GPT-3 descendant

GitHub views this as an evolution of pair programming, in which two coders work on the same project to learn from each other and speed up development. With Copilot, one of those coders is virtual.

The project is the first major outcome of Microsoft's $1 billion investment in OpenAI, the research company now headed by Sam Altman, former president of Y Combinator. OpenAI, previously a nonprofit, has since become a capped-profit entity, taken on the Microsoft investment, and licensed its GPT-3 text-generation algorithm.

Copilot is built on an algorithm called OpenAI Codex, which OpenAI CTO Greg Brockman describes as a descendant of GPT-3.

GPT-3, OpenAI's best-known language-generating algorithm, can produce text that is at times nearly indistinguishable from human writing. What makes it so convincing is its 175 billion parameters, the adjustable knobs that allow the algorithm to connect letters, words, and phrases across sentences.

While GPT-3 generates English, OpenAI Codex generates code. OpenAI plans to release Codex through its API this summer so that developers can build their own apps with the technology, an OpenAI representative told The Verge by email.

Codex was trained on terabytes of openly accessible code pulled from GitHub, as well as English-language examples.

Although Copilot is praised for its productivity benefits, GitHub notes that not all of the code used in training was vetted for vulnerabilities, insecure practices, or personal data.
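Because the training corpus was not vetted for personal data, a generation tool of this kind typically pairs the model with an output filter before suggestions reach the editor. The sketch below is purely illustrative, assuming a few simple regex patterns for emails, phone numbers, and API keys; it is not GitHub's actual filtering code.

```python
import re

# Illustrative patterns only -- a real filter would be far more thorough.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
    "api_key": re.compile(r"(?i)(api[_-]?key|secret)\s*[:=]\s*['\"][^'\"]+['\"]"),
}

def flag_sensitive(suggestion):
    """Return the names of sensitive-data patterns found in a suggestion."""
    return [name for name, pat in PATTERNS.items() if pat.search(suggestion)]

print(flag_sensitive('send_report(to="alice@example.com")'))  # flags the email
print(flag_sensitive("total = price * quantity"))             # nothing flagged
```

A production filter would combine many more patterns with entropy checks and allowlists, but the shape is the same: scan each suggestion before it is shown.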
GitHub says it has implemented filters to keep Copilot from generating offensive language. Even so, Copilot's website states that GitHub Copilot can sometimes produce undesired outputs due to the pre-release nature of the underlying technology.

Despite criticism of GPT-3's biases and patterns of abusive language, OpenAI appears to have found a way to keep its algorithms from inheriting the worst elements of their training data. The company does warn that the model may suggest phone numbers, API keys, email addresses, or other contact information, but says this is uncommon, and that such data is usually found to be synthetic or pseudo-randomly generated rather than copied from real people.

The code generated by Copilot is also largely original. GitHub ran a test and found that only about 0.1 percent of generated code could be found verbatim in the training set.

This is not the first project to attempt automatic code generation for programmers. Kite, a startup, offers similar functionality in more than 16 code editors.

Copilot is currently available as a limited technical preview, but you can sign up on the project's website for a chance to access it.
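GitHub's 0.1 percent figure describes how often generated code reappears verbatim in the training set. As a rough illustration of that kind of measurement (the sample data and exact-substring matching rule here are assumptions, not GitHub's methodology), one could check each generated snippet against a corpus and report the hit rate:

```python
def verbatim_overlap_rate(generated_snippets, training_corpus):
    """Fraction of generated snippets that appear verbatim in the corpus.

    A crude exact-substring check; a real study would normalize
    whitespace and compare at a finer granularity.
    """
    if not generated_snippets:
        return 0.0
    hits = sum(1 for s in generated_snippets if s in training_corpus)
    return hits / len(generated_snippets)

# Toy example with made-up snippets and a one-function corpus.
corpus = "def add(a, b):\n    return a + b\n"
snippets = ["return a + b", "print('hello, copilot')"]
print(verbatim_overlap_rate(snippets, corpus))  # 0.5: one of two snippets matches
```

The interesting design question in a real measurement is granularity: matching whole files, lines, or token windows gives very different overlap numbers for the same model.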