Adobe launched its own take on how smartphone cameras should work this week with Project Indigo, a new iPhone camera app from some of the team behind the Pixel camera. The project combines the computational photography techniques that engineers Marc Levoy and Florian Kainz popularized at Google with pro controls and new AI-powered features.

In their announcement of the app, Levoy and Kainz position Project Indigo as a better answer to common smartphone camera complaints: limited controls and over-processing. Rather than applying aggressive tone mapping and sharpening, Project Indigo is meant to use "only mild tone mapping, boosting of color saturation, and sharpening." That's intentionally distinct from the "zero-processing" approach some third [...]