Tokens are a big reason today's generative AI falls short | TechCrunch
techcrunch.com/2024/07/06/tokens-are-a-big-reason-todays-generative-ai-falls-short

Digits are rarely tokenized consistently. Because tokenizers don’t really know what numbers are, they might treat “380” as one token but represent “381” as a pair (“38” and “1”), effectively destroying the relationships between digits that equations and formulas depend on. The result is transformer confusion: a recent paper showed that models struggle to understand repetitive numerical patterns and context, particularly temporal data. (See: GPT-4 thinks 7,735 is greater than 7,926.)
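The splitting behavior described above can be illustrated with a toy example. This is not any real model’s tokenizer; it is a minimal greedy longest-match tokenizer over a hypothetical vocabulary in which “380” happens to be a single entry but “381” is not, which is analogous to how learned BPE merges produce inconsistent digit groupings.

```python
def tokenize(text: str, vocab: set[str]) -> list[str]:
    """Greedy longest-match tokenization: at each position, take the
    longest vocabulary entry that matches, like a simplified BPE decoder."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible substring first, shrinking until a match.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            # Fall back to a single character if nothing matches.
            tokens.append(text[i])
            i += 1
    return tokens


# Hypothetical vocabulary: "380" and "38" were merged during training,
# but "381" never appeared often enough to become one token.
vocab = {"380", "38", "0", "1", "2", "3", "4", "5", "6", "7", "8", "9"}

print(tokenize("380", vocab))  # ['380']        -- one token
print(tokenize("381", vocab))  # ['38', '1']    -- split into two tokens
```

Adjacent numbers end up with completely different token structures, so the model never sees a consistent positional encoding of digits, which is one reason arithmetic comparisons like the 7,735 vs. 7,926 example go wrong.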
Doodling with the Mac’s command icon – alexwlchan
alexwlchan.net/2024/command-icon

The command key (⌘) has been a ubiquitous part of the Mac for over forty years. It was chosen by legendary icon designer Susan Kare, who picked it from a symbol dictionary – this shape was already being used in Sweden to highlight an interesting feature on a map.