News

Microsoft announced that its Phi-3.5-Mini-Instruct model, the latest update to its Phi-3 model family, is now available. The Phi family is Microsoft's line of compact small language models that can run on ...
Microsoft just released Phi-3.5 mini, MoE and vision with 128K context, multilingual support & an MIT license! MoE beats Gemini Flash, vision is competitive with GPT-4o? Mini, with 3.8B parameters, beats Llama 3 ...
Microsoft has revealed the newest addition to its Phi family of generative AI models. Called Phi-4, the model improves in several areas over its predecessors, Microsoft claims, particularly in ...
Microsoft adds that this weakness, the small model's limited capacity to store factual knowledge, can be mitigated by augmenting Phi-3.5 with a search engine, particularly when the model is used in retrieval-augmented generation (RAG) settings. Microsoft used 512 Nvidia H100-80G GPUs to train the ...
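For illustration, a minimal sketch of that RAG setup, assuming the Hugging Face transformers library and the public microsoft/Phi-3.5-mini-instruct checkpoint; the search_web function is a hypothetical placeholder for whatever search engine or vector store supplies the retrieved context, not part of anything Microsoft ships.

# Minimal RAG sketch: prepend retrieved snippets to the prompt before
# generating with Phi-3.5-mini-instruct. Retrieval is a stub, not a real API.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "microsoft/Phi-3.5-mini-instruct"
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

def search_web(query: str) -> list[str]:
    # Hypothetical retrieval step; swap in any search engine or vector store.
    return ["<snippet 1 about the query>", "<snippet 2 about the query>"]

def answer_with_rag(question: str) -> str:
    # Retrieved text is concatenated into the prompt as context.
    context = "\n".join(search_web(question))
    messages = [
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )

The point of the pattern is that the retrieved text, rather than the model's parameters, carries the factual content, which is how a 3.8B-parameter model can compensate for its limited knowledge capacity.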
Microsoft has published the latest version of its small language model, Phi-3.5. The new version is a significant upgrade over the previous generation, beating models from leading players like ...
Microsoft has launched a series of AI ... science and coding applications. Meanwhile, the Phi-4-mini-reasoning model has 3.8 billion parameters and was trained on around a million synthetic ...
In April, Microsoft announced the Phi-3 family of language models, which compete directly against Google's Gemma family of models. The Phi-3 family has three models: Phi-3-mini is a 3.8B ...
Microsoft Corp. has developed a small language model that can solve certain math problems better than algorithms several times its size. The company revealed the model, Phi-4, on Thursday.