2025-02-27

Microsoft has added two new models to its Phi small language model family: Phi-4-multimodal, which can handle audio, images and text simultaneously, and Phi-4-mini, a streamlined model focused on text processing.
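For context on what a "streamlined model focused on text processing" looks like in practice, here is a minimal sketch of loading a Phi-4-mini-style checkpoint with the Hugging Face transformers text-generation pipeline. The repo id microsoft/Phi-4-mini-instruct, the dtype/device settings, and the prompt are assumptions for illustration, not details taken from the article.

```python
# Minimal sketch: running a small instruct model locally with the Hugging Face
# transformers pipeline. The repo id below is an assumption; check the official
# model card for the actual identifier. Requires `transformers` and `accelerate`.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-4-mini-instruct",  # assumed Hugging Face repo id
    torch_dtype="auto",                     # let transformers pick fp16/bf16 when available
    device_map="auto",                      # place weights on GPU if present
)

messages = [
    {"role": "user", "content": "In one sentence, what is a small language model?"},
]

# Chat-style input: the pipeline applies the model's chat template and returns
# the conversation with the assistant's reply appended as the last message.
result = generator(messages, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"][-1]["content"])
```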
2025-11-17
AI engineers often chase performance by scaling up LLM parameters and data, but the trend toward smaller, more efficient, and better-focused models has accelerated. The Phi-4 fine-tuning methodology [...]
2025-10-16
Microsoft is fundamentally reimagining how people interact with their computers, announcing Thursday a sweeping transformation of Windows 11 that brings voice-activated AI assistants, autonomous softw [...]
2025-10-28
Microsoft is launching a significant expansion of its Copilot AI assistant on Tuesday, introducing tools that let employees build applications, automate workflows, and create specialized AI agents usi [...]
2025-07-12
Microsoft has introduced Phi-4-mini-flash-reasoning, a lightweight AI model built for scenarios with tight computing, memory, or latency limits. Designed for edge devices and mobile apps, the model ai [...]
2025-10-17
AI models are only as good as the data they're trained on. That data generally needs to be labeled, curated and organized before models can learn from it in an effective way. One of the big missin [...]
2025-11-13
Mere hours after OpenAI updated its flagship foundation model GPT-5 to GPT-5.1, promising reduced token usage overall and a more pleasant personality with more preset options, Chinese search giant Bai [...]
2025-05-01
Microsoft is expanding its Phi series of compact language models with three new variants designed for advanced reasoning tasks.
2025-05-27
Microsoft's recent release of Phi-4-reasoning challenges a key assumption in building artificial intelligence systems capable of reasoning. Since the introduction of chain-of-thought reasoning in [...]
2025-01-08
Microsoft Research's new Phi-4 LLM matches the abilities of much larger models while using just 14 billion parameters - about one-fifth the size of similar systems.