Whether you love it or hate it, AI isn't going away anytime soon. In fact, AI is developing rapidly, and it's now in the palms of our hands via our smartphones, as Google, Samsung, and even Apple have fully embraced our AI future.
Although Apple was late to the game with Apple Intelligence, the company hyped it heavily at the iPhone 16 launch in September, even though, surprisingly, it did not roll out until October with the iOS 18.1 update. The release schedule for Apple Intelligence confused many consumers, who wondered why their new iPhone 16 did not come with Apple Intelligence out of the box, and it felt like a big misstep from Apple.
But now that we have all had access to Apple Intelligence for the last few months of 2024, I have to say that it has not made as big an impact on my iPhone use as I originally thought it would.
What Apple got right
There are many features that Apple packs into Apple Intelligence, but so far, I have only found some of them genuinely useful in my day-to-day use.
For one, the Clean Up tool has been very helpful when I need it. It always annoyed me that before iOS 18, iPhone users had to download a third-party photo editing app to get an object-removal tool, which was usually locked behind a paywall. Meanwhile, Google has had Magic Eraser since the Pixel 6 series, and Samsung has its own Object Eraser. Until iOS 18, Apple users were left in the dust.
I do not need to use Clean Up every time I want to share a photo, but it has been very useful when an image needs a touch-up. Removing a piece of garbage on the ground, power lines from a beautiful sky, small scuffs and other blemishes, or strangers passing by in the background – Clean Up does a great job with these things.
Before, if I needed to edit a photo to remove something, I had to do it in Google Photos on my iPhone 16 Pro or even switch to my Pixel 9 Pro. But now that Clean Up is available, I do not have to bounce between different apps or phones to get the job done.
Another Apple Intelligence tool I like is Visual Intelligence. This feature is exclusive to the iPhone 16 line because it requires the Camera Control button, and for me, it has made that button worth using.
This is not a feature I use dozens of times a day, but I have run into situations where it is convenient – for example, identifying plants or animals and translating text. I am surprised it took Apple this long to integrate a feature like this, since it is essentially Google Lens.
What Apple got wrong
When I received the iOS 18.2 update on my iPhone 16 Pro, I was excited to check out more Apple Intelligence features. But apart from what I have already mentioned, the rest is not as exciting.
I already dislike AI art in general, so I was not very excited about Image Playground. However, since it is a new feature, I had to try it at least once. I tried to get Apple Intelligence to generate AI images of me in various scenarios, perhaps to share on social media. But no matter what result I got, I did not like it, and I felt there was no real resemblance to my likeness.
It kept giving me strange-looking teeth in my smile, hair unlike what I actually have, and other flaws that made the results look nothing like me. I was not expecting a perfect picture, but I was hoping for something decent enough to share online – dozens of attempts later, I was not happy with any of them. Maybe my appearance just does not work with Apple's AI art style? Whatever the reason, my experience with it has not been positive.
On the other hand, Genmoji is very fun to use. I send emoji in my chats often, so creating unique ones that I cannot find among the regular emoji is fun to mess around with. And the fact that they show up in your "recently used" emoji means quick access in the future.
I feel similarly about the AI text tools, although summarization is nifty, even if I do not use it much. As a writer, and as someone who enjoys writing in general, I am not a big fan of AI writing tools. Besides, if you have your own writing style, AI-generated text will look out of place anyway, as it usually tries too hard, especially with professional tones.
And while Siri got a little smarter with iOS 18, it is still not great. It still cannot seem to handle multi-step requests, though those are expected to come eventually. But even with some basic tasks, Siri gets easily confused. Compared to the competition, it still has a long way to go. That said, adding ChatGPT support was a good idea.
Much Ado About Nothing
Ultimately, I think the staggered rollout of Apple Intelligence did more harm than good. Many people bought a new iPhone 16 because they wanted these AI features, which Apple had marketed heavily in stores, but the features were not available at launch. So everyone, myself included, continued to use the iPhone 16 and iPhone 16 Pro just like their predecessors.
A month after the launch of the iPhone 16, Apple finally began rolling out Apple Intelligence, but only some of the features, not all of them. In October with iOS 18.1, we only got Clean Up, Writing Tools, summaries, priority messages in Mail, and a slightly better Siri. In December with iOS 18.2, we finally got Image Playground, Genmoji, Visual Intelligence, and ChatGPT integration.
This is a slow rollout of AI features that Apple's biggest competitors have already offered for months. And at this point, aside from a few cool tools, it feels like Apple Intelligence is already losing its shine. Apple Intelligence has not changed my overall use of the iPhone 16 Pro, as I am still using it much like I used my iPhone 15 Pro a year ago. That is not a bad thing for me, but it is not a great look for the future of Apple Intelligence either.