- 1 point: The Irreducible Skill (vinniefalco.com)
- 87 points: Windows API is Successful Cross-Platform API (2024) (retrocoding.net)
- 3 points: After Mythos, Nobody Is Safe from Cybersecurity Threats (nytimes.com)
- 2 points: Postfix 1998, Dovecot 2002, Roundcube 2008: Why Email Stack Frozen 25 Years? ()
- 1 point: Fun, open-source AI transparency project (github.com)
- 40 points: AI, Intimacy, and the Data You Never Meant to Share (fshot.org)
- 1 point: What the Benchmark Cannot See (exoskeleton.ghost.io)
- 167 points: Open source does not imply open community (blog.feld.me)
- 31 points: Elon Musk gets an apology from California regulators as a SpaceX lawsuit is settled (apnews.com)
- 2 points: Prompt Engineering Is Permanent (yiblet.com)
- 28 points: Year of the Linux Laptop: Omarchy on XPS (dell.com)
- 1 point: A PQC Almanac (2025) [pdf] (downloads.bouncycastle.org)
- 1 point: Comparing the best open source TranslateGemma projects (metalglot.com)
- 1 point: The AI workflow I use to build apps (juanmanuelalloron.com)
- 1 point: Show HN: Pantheon – A Path 1 PlayStation 2 game engine (VU1 / EE / DMA) (github.com)
- 1 point: Can a Hand-Built EV Change Mobility in East Africa? [video] (youtube.com)
- 3 points: Farewell, Jeeves: Ask.com shuts down (techcrunch.com)
- 13 points: Original Apollo 11 code open-sourced by NASA (tomshardware.com)
- 1 point: Too Dark? Too Bright? Scientists Need Your Help to Make Reading Easier (news.ncsu.edu)
- 1 point: Intelligence Buying Intelligence (stevekrouse.com)
- 2 points: Running Shoes Have Evolved – From Ancient Greece to Record-Breaking Marathons (nytimes.com)
- 3 points: Why do crabs walk sideways? Scientists trace it back 200M years (sciencedaily.com)
- 2 points: US Navy signs deal with AI firm for training underwater drones (tomshardware.com)
- 2 points: New Netflix documentary reexamines Winnie Mandela's divisive legacy (npr.org)
- 2 points: Reaching for the stars: enduring symbols of Soviet science in pictures (theguardian.com)
- 2 points: I touched a ZX Spectrum for the first time in decades (theguardian.com)
- 1 point: The Wayfinders (longreads.com)
- 1 point: How to prepare to be a startup founder (2021) (letterstoanewdeveloper.com)
- 51 points: Care homes and hotels in Japan shut as expansion strategy unravels (newsonjapan.com)
- 2 points: Training language models to be warm can reduce accuracy and increase sycophancy (nature.com)