Meet Orca, a new 13-billion-parameter AI model from Microsoft
Orca, an AI model with 13 billion parameters, can learn by imitating large language models such as GPT-4.

In addition to developing smaller, more task-specific models, Microsoft has been steadily integrating AI capabilities into its products and services in collaboration with OpenAI. Orca, a new AI model from Microsoft Research, learns by imitating large language models. According to the research paper, Orca is designed to overcome the limitations of smaller models by imitating the reasoning processes of huge foundation models like GPT-4.
Large language models like GPT-4 can be used to train and optimize smaller models like Orca for specific tasks. Thanks to its smaller size, Orca requires less computing power to run, and researchers can run and fine-tune their own copies of the model to suit their needs.
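The training approach described here, sometimes called imitation learning or knowledge distillation, can be sketched in outline. The snippet below is a simplified illustration, not Orca's actual pipeline: the `teacher_explain` function is a hypothetical stand-in for querying a large teacher model such as GPT-4 for step-by-step explanations, which are then paired with prompts to form a dataset for fine-tuning a smaller student model.

```python
def teacher_explain(prompt: str) -> str:
    """Stand-in for querying a large teacher model (e.g. GPT-4) for a
    detailed, step-by-step explanation. A real system would call the
    teacher model's API here."""
    return f"Step-by-step reasoning for: {prompt}"


def build_imitation_dataset(prompts: list[str]) -> list[dict]:
    """Pair each prompt with the teacher's explanation. A smaller
    student model is then fine-tuned on these pairs, learning to
    reproduce the teacher's reasoning process rather than only its
    final answers."""
    return [{"prompt": p, "target": teacher_explain(p)} for p in prompts]


dataset = build_imitation_dataset(
    ["What causes ocean tides?", "Summarize photosynthesis."]
)
print(len(dataset))  # one training example per prompt
```

The key idea is that the targets are rich explanation traces from the teacher, not just short answers, which is what lets the smaller model pick up some of the larger model's reasoning behavior.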