My wife was into colour analysis long before I was. For the past year it's been her thing — a genuinely considered approach to dressing that I admired from a distance but hadn't got around to trying myself. Then ChatGPT Image 2.0 dropped, a prompt went viral showing how to get a full seasonal colour analysis from a single selfie, and suddenly I was on the bandwagon too.
A Trend That Never Really Went Away
Seasonal colour analysis isn't new. It was popularised in the early 1980s by Carole Jackson's book Color Me Beautiful, which sorted people into four seasonal types — Spring, Summer, Autumn, Winter — based on their skin tone, hair colour, and eye colour. The concept: wear colours that harmonise with your natural colouring and you'll look more vibrant, polished, alive. Clash with them and you'll look washed out or jarring, even if you can't quite put your finger on why.
The idea had a big moment in the 80s and 90s, went relatively quiet through the 2000s, and has now roared back to life across TikTok, Instagram Reels, and YouTube. The hashtag #coloranalysis has racked up close to 750 million views. Videos of people "getting their colours done" — draped in fabric swatches by a professional consultant, with dramatic before-and-after reveals — are everywhere. According to Marketplace, the global image consulting market is now worth around $4 billion, and it's growing.
What changed recently is that ChatGPT made the whole thing accessible in about 30 seconds.
Upload a Photo. Get a Palette.
The process is simple. Upload a clear, well-lit photo of your face, paste in a prompt asking for a personal colour analysis, and ChatGPT returns a full seasonal breakdown — your skin undertone, hair and eye classification, contrast level, a curated palette of colours that work for you, and a list of colours to avoid.
No appointment. No $200 consultation fee. No awkward moment where someone drapes mauve chiffon around your shoulders while you try to look decisive.
I uploaded my photo and was diagnosed as a Cool Summer — cool, muted, soft. The analysis read my salt-and-pepper hair, blue-grey eyes, and cool-neutral skin tone, then generated a specific palette: slate blue, navy, charcoal, muted teal, eucalyptus, dusty rose, burgundy, cool grey, soft white. Best metals: silver and gunmetal. Colours to avoid: bright red, orange, mustard, lime green, pure white, black.
I'll be honest — looking at the side-by-side comparisons it produced, the recommended colours genuinely do look better on me. Navy works. Slate blue works. The bright red version looked a bit like I was about to referee something.
Then It Went Shopping With Me
This is where the experience got properly fun. I asked ChatGPT to find me specific items in my palette — shirts, jackets, knitwear — and it came back with product recommendations from actual retailers, with direct purchase links. It had done the research: filtering by colour, style, and occasion, pointing me to specific pieces across a range of stores and price points.
The whole research journey felt less like shopping and more like someone had done the tedious part for me. I didn't buy everything it suggested, but I bookmarked a lot of it — and I've since made a couple of purchases I'm genuinely pleased with. My wardrobe now has opinions. They are better opinions than mine were.
My wife is thrilled. A year of colour evangelism, vindicated in an afternoon.
But What Are You Actually Handing Over?
Here's where it's worth slowing down. All of this is powered by uploading a photo of your face to OpenAI's servers. And that raises a question worth sitting with: is this genuinely about helping you dress better — or is there something else going on?
OpenAI has been publicly cautious about facial recognition, saying it isn't ready to deploy the capability broadly because some jurisdictions require consent before biometric data can be used. But colour analysis still depends on a close reading of your facial features — skin undertone, eye colour, hair contrast. That reading draws on the same underlying capability as facial recognition, even if the output is a palette rather than an identity.
OpenAI's privacy policy indicates that uploaded content may be used to improve their services. Photos can also carry hidden metadata — location, device information, timestamps — that reveal more than you intended. And while OpenAI released a Privacy Filter tool in April 2026 that strips personally identifiable information from text, images are a different matter entirely.
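If you do want to upload a selfie, the metadata at least is something you can strip yourself first. In a JPEG, EXIF data (GPS coordinates, device make, timestamps) lives in an APP1 segment near the start of the file. Here's a minimal stdlib sketch that removes it before upload — a bare-bones illustration of the idea, not a substitute for a proper tool like exiftool, and it only handles straightforward baseline JPEGs:

```python
import struct

EXIF_MARKER = 0xFFE1  # APP1 segment, where EXIF (GPS, device, timestamps) lives

def strip_exif(jpeg: bytes) -> bytes:
    """Return a copy of a JPEG with its EXIF APP1 segment(s) removed.

    Walks the segment headers up to the start-of-scan marker (0xFFDA),
    dropping any APP1 segment whose payload begins with b"Exif".
    """
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        (marker,) = struct.unpack(">H", jpeg[i:i + 2])
        if marker == 0xFFDA:  # start of scan: image data follows, copy verbatim
            out += jpeg[i:]
            break
        (length,) = struct.unpack(">H", jpeg[i + 2:i + 4])
        segment = jpeg[i:i + 2 + length]  # marker (2 bytes) + payload (length bytes)
        if not (marker == EXIF_MARKER and segment[4:8] == b"Exif"):
            out += segment  # keep every non-EXIF segment as-is
        i += 2 + length
    return bytes(out)
```

This removes the location and device trail, but note what it doesn't remove: the pixels themselves. Your face is still in the upload; only the sidecar data is gone.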
The charitable reading: this is a genuinely useful tool that democratises access to something that previously cost hundreds of dollars and required a specialist appointment. The more cynical reading: it's an elegant on-ramp to collecting facial data at scale from people who are happily opting in because the output is flattering and fun.
The truth is probably somewhere in between — and that's not unusual for AI tools. The analysis is useful. The shopping recommendations are legitimately helpful. But it's worth knowing what you're handing over when you upload that selfie, and making the choice consciously rather than because a colour wheel was too pretty to resist.
As for me: I'm wearing more navy. I'm shopping smarter. And apparently I've been a Cool Summer all along — I just needed an AI to tell me.