Ever since I learned how to write, I’ve never been without a journal. Six currently sit on my desk in regular rotation, plus the Day One and Daylio apps on my phone. Each has its purpose, but I often wish for a single app or notebook to replace the rest. On paper, I’m exactly the type of person Apple was appealing to when it announced the Journal app during this year’s WWDC keynote.
Instead, my reaction while watching the keynote was a knee-jerk Hell to the No. My hands were sweaty, my heart was racing, and I ended up walking away from my desk to take five deep breaths. The fact that Apple is Sherlocking Day One isn’t what set me off. It was the fact that Apple said it’ll use “on-device machine learning” to create customized journaling prompts based on your contacts, photos, music, workouts, location data, and even podcasts. Essentially, it was pitched as a riff on the Memories feature in the Photos app. An AI-powered scrapbook and digital diary rolled into one, if you will. And that’s worrying considering the AI behind Memories is... let’s just say it ain’t too bright.
The Journal app itself is not an inherently bad idea. According to Apple’s press release, the intention is to help you cultivate gratitude by commemorating positive moments. There’s a growing body of research suggesting gratitude journaling may help boost your mental health — so it’s not as if the premise is complete woo-woo snake oil. My problem, based on the preliminary details Apple’s given us, is that the proposed execution feels half-baked. People don’t only take photos of happy things or moments that spark joy. If your camera roll is like mine, it’s a jumble of happy, serene, infuriating, vain, mundane, and melancholy moments. It’s messy because life is messy. And if the Journal app truly takes a page from Apple’s Memories feature, there’s a good chance it’s going to tactlessly ambush you with memories you either don’t want or aren’t ready to see.
In the past year, Memories has made two slideshows of my mom’s funeral set to an upbeat pop tune. Once around the one-year anniversary of her death — and again the day before her birthday. I wish I could say it was a fluke, but it’s done the same with the last pictures I have of my dad and me together before he lost his battle with Parkinson’s and Alzheimer’s. Photos where he was skeletal, half-lucid, and unable to walk. And again with photos of Daisy, our beloved family dog, who died eight months ago. The first few times were unexpected and ruined my day or week.
If my parents were alive, they’d say this AI-powered feature lacks nunchi. Nunchi is one of those untranslatable words, but it’s a Korean term for quickly sussing out other people’s feelings and adjusting your behavior based largely on nonverbal context clues. It’s kind of like an amplified version of reading the room, mixed with mind reading and emotional intelligence. For example, my spouse might deduce I’m feeling sad because I’m looking at photos of my dead family while lying comatose in bed. Without commenting on it, they’ll get me a bowl of my favorite ice cream or suggest we go out for a walk. My iPhone would probably assume I just really love my family (why else would I look at photos if not to feel happy?!) and suggest two new slideshows featuring them set to spunky tracks. My spouse has nunchi; my phone doesn’t.
So, forgive me for not feeling 100 percent confident in the Journal app’s machine learning. A part of me is terrified that when I download the iOS 17 beta, I’ll open the Journal app, and it’ll recommend that I write about an afternoon visit to Chuncheon, Korea, in June 2022. That it’ll pair D.O’s That’s Okay — a song I listen to whenever I miss my mom — with pictures of her gravestone overlooking picturesque rolling green hills. Or the photos of me and my relatives reunited for the first time since the covid pandemic, red-eyed and trying to put on a brave face for the relatives who couldn’t make it to the burial.
To be clear, I don’t expect nunchi from current AI technology. What I would like, however, is more granular control so I can tell it what not to do. The Memories feature gives you a semblance of it. While you can’t disable it entirely, you can disable notifications, tell it to feature a person less, or remove someone as a significant person in your Photos album. It’s not at all intuitive for someone in the throes of grief, depression, or anxiety, though. Besides, I don’t want to Eternal Sunshine of the Spotless Mind my mom, dad, and Daisy. I want to hold onto every bad, good, and in-between memory of them because we’ll never get to make new ones. I simply want a say in when I revisit those memories. I want Apple to give me the tools to tell its AI what types of photos are off-limits, under what circumstances, and for how long. Without this kind of customizability, the Journal app suggestions can’t be anything but half-baked.
It doesn’t help that all the Journal app screenshots Apple provided only feature Pollyanna-ish entries. I get why. No one wants to dwell on sad or unpleasant things. The problem is it gives the app a phony air — and inauthenticity is poison to any kind of journaling. It’d be surprising (though, in my opinion, comforting) if the example entry said, “I dreamed about surfing last night. Usually when that happens I have a great day on the water, but today wasn’t my day. I had a hard time getting started. When Sarah picked me up around 5AM, the waves were choppy, and I just kept wiping out. Things mellowed out around 7AM, though, and hey, I got one or two good waves in. I wish the conditions had been better, but them’s the breaks. But even if it wasn’t great, at least Sarah was with me.” The thing about cultivating gratitude is it doesn’t only come from commemorating or reliving happy times. In my experience, it comes from sitting with your worst moments and practicing how to reframe that ugliness into a lesson you can learn from.
I could be wrong. Right now, my concerns are based solely on my previous experiences with Memories and other “On This Day” features from social media and journaling apps. The Journal app isn’t out of beta yet, and it’s entirely possible that the final version will include tools that grant users a bigger say in how the app can best work for them. Maybe the iOS 17 developers anticipated this and taught the on-device machine-learning model how to interpret complex context clues. Maybe the AI is capable of recognizing my mom’s portrait next to a grave, cross-referencing that with photos of my mom’s casket, running That’s Okay’s bittersweet lyrics through Google Translate, and coming to the conclusion that this is probably not a suitable journaling prompt for the weeks surrounding my mom’s birthday or holidays like Mother’s Day. Perhaps it’ll put two and two together from the above facts and realize I’ll be more likely to appreciate the prompt during Korean holidays like Chuseok or Parents’ Day. Maybe it’ll note that despite all evidence pointing toward death, I keep her as a pinned contact and in my favorites because she’s my mom, and she’ll always be number one in my phone.
Somehow, I doubt it.
I don’t doubt it because I think Apple is willfully ignoring the problem. I doubt it because I’m not the first to bring up this issue. As this Wired story beautifully illustrates, algorithms unintentionally repackage the worst parts of people’s lives every day in the name of personalization. Social media will suggest you friend your ex’s new squeeze simply because you both run in the same circles. Mothers who have miscarried are haunted by ads for baby products because the algorithms assume the baby was born. I had to delete Timehop because it kept showing me pictures of anniversaries with an ex — I’ve long been over it, but why do I need an AI to prompt me to revisit those photos? If I want to remember those times fondly, I know where to find them. Apple, Meta, Google, and the zillions of other companies that use our data have yet to figure out how to compassionately curate memories. Some controls exist across various platforms, but most lack the nunchi to be helpful when you need them most.
The fundamental problem here is that AI can’t read the digital room. Until it can, I have my doubts about how healing these AI-powered features can really be.