Apple researchers released an open-source artificial intelligence model called MGIE that edits an image in response to natural-language instructions: the user simply types what they want changed. Apple's new AI image editing tool could wind up making it easy for iPhone or Mac owners to tweak photos.
MGIE, short for MLLM-Guided Image Editing, can handle Photoshop-style tweaks, global photo optimization and local editing. But output quality remains limited for now. And only Apple knows what it plans to do with its AI image editor.
Simply tell Apple’s new MGIE AI model how you want to edit an image
MGIE came out of a collaboration between Apple and researchers at the University of California, Santa Barbara, VentureBeat reported Tuesday. The researchers wrote a paper that was accepted to the International Conference on Learning Representations, a top AI research venue.
MGIE uses powerful multimodal large language models (MLLMs), AI models capable of processing both text and images, to improve instruction-based image editing.
“The paper demonstrates the effectiveness of MGIE in improving automatic metrics and human evaluation, all while maintaining competitive inference efficiency,” the publication said.
What can MGIE do?
Here’s a simple example: If you type in “make the sky more blue,” MGIE translates and executes it as “increase the saturation of the sky region by 20%.”
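To make that two-step idea concrete, here is a minimal, hypothetical sketch in Python. It does not use MGIE's actual code or API: the expand_instruction helper is invented to stand in for the MLLM that rewrites the terse request, and the editing step is stood in by the InstructPix2Pix pipeline from Hugging Face's diffusers library, which performs a broadly similar kind of instruction-guided edit.

# Illustrative sketch only -- not MGIE's actual code or API.
# Step 1: a language model rewrites a terse request into an explicit edit
#         instruction (hard-coded here with the article's example).
# Step 2: an instruction-guided editor applies it; InstructPix2Pix from the
#         diffusers library stands in for MGIE's editing model.
import torch
from PIL import Image
from diffusers import StableDiffusionInstructPix2PixPipeline


def expand_instruction(terse_request: str) -> str:
    """Hypothetical stand-in for MGIE's MLLM rewriting step."""
    if terse_request == "make the sky more blue":
        return "increase the saturation of the sky region by 20%"
    return terse_request


pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
    "timbrooks/instruct-pix2pix", torch_dtype=torch.float16
).to("cuda")

image = Image.open("photo.jpg").convert("RGB")
instruction = expand_instruction("make the sky more blue")

# The expanded, explicit instruction is what guides the pixel-level edit.
edited = pipe(instruction, image=image, num_inference_steps=20).images[0]
edited.save("photo_edited.jpg")

The point of the sketch is simply the division of labor the researchers describe: a language model turns a vague request into a clear instruction, and that clearer instruction is what improves the quality of the resulting edit.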
VentureBeat described the following capabilities:
MGIE can produce concise and clear instructions that guide the editing process effectively. This not only improves the quality of the edits but also enhances the overall user experience.
MGIE can perform common Photoshop-style edits, such as cropping, resizing, rotating, flipping and adding filters. The model can also apply more advanced edits, such as changing the background, adding or removing objects, and blending images.
MGIE can optimize the overall quality of a photo, such as brightness, contrast, sharpness and color balance (a hand-coded illustration of these global adjustments follows this list). The model can also apply artistic effects like sketching, painting and cartooning.
MGIE can edit specific regions or objects in an image, such as faces, eyes, hair, clothes and accessories. The model can also modify the attributes of these regions or objects, such as shape, size, color, texture and style.
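As a point of reference for those global adjustments, here is a small, MGIE-agnostic sketch using the Pillow imaging library. It applies brightness, contrast, sharpness and color tweaks by hand, the same kinds of operations MGIE is described as choosing automatically from a plain-language request; the file names and enhancement factors are arbitrary.

# Illustrative only -- ordinary Pillow operations, not MGIE.
from PIL import Image, ImageEnhance

img = Image.open("photo.jpg").convert("RGB")

# Global photo optimization, done by hand: a factor above 1.0 strengthens
# the effect, and 1.0 leaves the image unchanged.
img = ImageEnhance.Brightness(img).enhance(1.10)   # brightness
img = ImageEnhance.Contrast(img).enhance(1.15)     # contrast
img = ImageEnhance.Sharpness(img).enhance(1.25)    # sharpness
img = ImageEnhance.Color(img).enhance(1.05)        # color balance / saturation

img.save("photo_optimized.jpg")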
Can I try MGIE?
You can find the MGIE open-source project on GitHub. To try it out with your own images, a more intuitive web demo is also available.
As for when you can try the functionality within the Apple ecosystem — say, on an iPhone after you snap a shot — who knows? But given the promise here, and Apple’s longstanding commitment to user-friendly functionality, it may not be too long. After all, Apple CEO Tim Cook said just last week that his company’s AI revolution will arrive this year.