- 56 Cohere Transcribe: Speech Recognition (cohere.com)
- 1611 Axios compromised on NPM – Malicious versions drop remote access trojan (stepsecurity.io)
- 171 Open source CAD in the browser (Solvespace) (solvespace.com)
- 22 Show HN: Forkrun – NUMA-aware shell parallelizer (50×–400× faster than parallel) (github.com)
- 56 GitHub Monaspace Case Study (lettermatic.com)
- 556 Ollama is now powered by MLX on Apple Silicon in preview (ollama.com)
- 740 Artemis II is not safe to fly (idlewords.com)
- 599 Oracle slashes 30k jobs (rollingout.com)
- 1234 Claude Code's source code has been leaked via a map file in their NPM registry (twitter.com)
- 94 Combinators (tinyapl.rubenverg.com)
- 143 Audio tapes reveal mass rule-breaking in Milgram's obedience experiments (psypost.org)
- 43 RubyGems Fracture Incident Report (rubycentral.org)
- 86 A Love Letter to 'Girl Games' (aftermath.site)
- 10 From 300KB to 69KB per Token: How LLM Architectures Solve the KV Cache Problem (news.future-shock.ai)
- 17 Good code will still win (greptile.com)
- 12 Securing Elliptic Curve Cryptocurrencies Against Quantum Vulnerabilities [pdf] (quantumai.google)
- 204 Microsoft: Copilot is for entertainment purposes only (microsoft.com)
- 5 Accidentally created my first fork bomb with Claude Code (droppedasbaby.com)
- 190 Tell HN: Chrome says "suspicious download" when trying to download yt-dlp
- 26 Show HN: Loreline, narrative language transpiled via Haxe: C++/C#/JS/Java/Py/Lua (loreline.app)
- 148 Claude Code users hitting usage limits 'way faster than expected' (theregister.com)
- 83 What major works of literature were written after age of 85? 75? 65? (statmodeling.stat.columbia.edu)
- 645 Fedware: Government apps that spy harder than the apps they ban (sambent.com)
- 11 Ask HN: Academic study on AI's impact on software development – want to join?
- 46 Multiple Sclerosis (subfictional.com)
- 420 Universal Claude.md – cut Claude output tokens (github.com)
- RamAIn (YC W26) Is Hiring (ycombinator.com)
- 10 Scotty: A beautiful SSH task runner (freek.dev)
- 264 Google's 200M-parameter time-series foundation model with 16k context (github.com)
- 692 Do your own writing (alexhwoods.com)