Applied AI·April 12, 2026·1 min read

A journalist recounts how he used ChatGPT to develop a fitness plan to prepare for the Paris Marathon, resulting in a 20-pound weight loss and faster race times (Derek Wallbank/Bloomberg)


Consumer-grade ChatGPT is already functioning as a personalized coach — structured planning, accountability, and iteration are now default features available to anyone with a prompt. If your product relies on templated plans or generic guidance — fitness, nutrition, education, finance — assume the baseline user experience is now an AI copilot and build for differentiation above that line.

Applied AI

Analysts and researchers say Google's TurboQuant compression algorithm, designed to make LLMs more efficient, is more likely to expand memory chip demand than reduce it (Daniel Tudor/Financial Times)

Compression that makes LLMs cheaper to run just resets the constraint: the savings get spent on larger context windows and more agents, not fewer HBM orders. If you're planning infra or capex on the assumption that efficiency gains will ease memory pressure, update the model: demand for high-bandwidth DRAM and advanced packaging is on a multi-year upward path.

Applied AI

Takeaways from HumanX, one of the AI industry's main events: Claude Code dominated the conversation, while some execs noted China's lead in open-weight models (Ashley Capoot/CNBC)

Claude Code dominating the hallway talk at a 6,500-person exec conference means "AI that writes and maintains your codebase" is now a board topic, not a dev toy. At the same time, execs openly flagging China's lead in open-weight models is a warning shot: if your AI stack is 100% closed and US-centric, your competitive set is already more global than your architecture.