📰 NewsMonitor

Steve Jobs in 2007, on Apple’s Pursuit of PC Market Share: ‘We Just Can’t Ship Junk’

In August 2007, Apple held a Mac event in the Infinite Loop Town Hall auditorium. New iMacs, iLife ’08 (major updates to iPhoto and iMovie), and iWork ’08 (including the debut of Numbers 1.0). Back then, believe it or not, at the end of these Town Hall events, Apple executives would sit on stools and take questions from the media. For this one, Steve Jobs was flanked by Tim Cook and Phil Schiller. Molly Wood, then at CNet, asked, “And so, I guess once and for all, is it your goal to overtake the PC in market share?” The audience — along with Cook, Jobs, and Schiller — chuckled. And then Jobs answered. You should watch the video — it’s just two minutes — but here’s what he said:

I can tell you what our goal is. Our goal is to make the best personal computers in the world and to make products we are proud to sell and would recommend to our family and friends. And we want to do that at the lowest prices we can. But I have to tell you, there’s some stuff in our industry that we wouldn’t be proud to ship, that we wouldn’t be proud to recommend to our family and friends. And we can’t do it. We just can’t ship junk. So there are thresholds that we can’t cross because of who we are. But we want to make the best personal computers in the industry. And we think there’s a very significant slice of the industry that wants that too. And what you’ll find is our products are usually not premium priced. You go and price out our competitors’ products, and you add the features that you have to add to make them useful, and you’ll find in some cases they are more expensive than our products. The difference is we don’t offer stripped-down lousy products. We just don’t offer categories of products like that. But if you move those aside and compare us with our competitors, I think we compare pretty favorably. And a lot of people have been doing that and saying that now for the last 18 months.

Steve Jobs would have loved the MacBook Neo.
Everything about it, right down to the fact that Apple is responsible for the silicon.  ★

Package Manager Magic Files

A follow-up to my post on git’s magic files. Most package managers have a manifest and a lockfile, and most developers stop there. But across the ecosystems I track on ecosyste.ms, package managers check for dozens of other files beyond the manifest and lockfile, controlling where packages come from, what gets published, how versions resolve, and what code runs during installation. These files tend to be poorly documented, inconsistently named, and useful once you know they exist.

Configuration

Registry URLs, auth tokens, proxy settings, cache behavior. Every package manager has a way to configure these, and they almost always live outside the manifest.

.npmrc is an INI-format file that can live at the project root, in your home directory, or globally. npm and pnpm both read it. It controls the registry URL, auth tokens for private registries, proxy settings, and dozens of install behaviors like legacy-peer-deps and engine-strict. There’s a footgun here: if an .npmrc ends up inside a published package tarball, npm will silently apply those settings when someone installs your package in their project. Less well known are the shell, script-shell, and git settings, which point at arbitrary executables that npm will invoke during lifecycle scripts and git operations. Research by Snyk and Cider Security showed these as viable attack vectors: a malicious .npmrc committed to a repository can redirect script execution without touching package.json at all.

.yarnrc.yml replaced the INI format of Yarn Classic’s .yarnrc. It configures which linker to use (PnP, pnpm-style, or traditional node_modules), registry auth, and the pnpMode setting that controls how strictly Yarn enforces its dependency resolution. The yarnPath setting is security-sensitive: it points to a JavaScript file that Yarn will execute as its own binary, so a malicious .yarnrc.yml can hijack the entire package manager.
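A project-level .npmrc might look like this — the private registry URL and the @acme scope are made-up placeholders for illustration:

```ini
; .npmrc — read by both npm and pnpm; project settings override user/global ones
registry=https://registry.npmjs.org/
; route one scope to a private registry (npm.example.com is a placeholder)
@acme:registry=https://npm.example.com/
; auth token for that registry, normally supplied via an environment variable
//npm.example.com/:_authToken=${NPM_TOKEN}
; refuse to install under an unsupported Node version
engine-strict=true
```

The scoped-registry and _authToken lines are the ones that most often end up holding credentials, which is why a stray .npmrc in a tarball or repository is worth auditing.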
bunfig.toml is Bun’s config file, covering registry config, install behavior, and the test runner all in one TOML file.

pip.conf on Unix and pip.ini on Windows, searched at ~/.config/pip/pip.conf, ~/.pip/pip.conf, and /etc/pip.conf. The PIP_CONFIG_FILE environment variable can override all of these or point to /dev/null to disable config entirely. Malformed config files are silently ignored rather than producing errors, so you can have broken configuration for months without realizing it.

uv reads uv.toml or the [tool.uv] section in pyproject.toml.

.bundle/config stores Bundler’s per-project config, created by bundle config set. RubyGems has its own .gemrc file, which Bundler deliberately ignores because it calls Gem::Installer directly. The credentials file at ~/.gem/credentials must have 0600 permissions or RubyGems refuses to read it.

.cargo/config.toml is the most interesting of the bunch because it’s hierarchical: Cargo walks up the directory tree, merging config files as it goes, so you can have workspace-level settings that individual crates inherit. It controls registries, proxy settings, build targets, and command aliases. A backwards-compatibility quirk means Cargo still reads .cargo/config without the .toml extension, and if both files exist, the extensionless one wins, which is an easy way to have a stale config file shadow your actual settings.

.condarc is searched at six different paths from /etc/conda/.condarc through ~/.condarc to $CONDA_PREFIX/.condarc, plus .d/ directories at each level for drop-in fragments, and you can put one inside a specific conda environment to configure just that environment. Every setting also has a CONDA_UPPER_SNAKE_CASE environment variable equivalent.

~/.m2/settings.xml holds Maven’s repositories and credentials, plus ~/.m2/settings-security.xml stores the master password used to decrypt encrypted passwords in the main settings file. Most developers don’t know settings-security.xml exists.
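To make the Cargo hierarchy concrete, here is a sketch of a workspace-level .cargo/config.toml; the registry name and URL are invented for illustration:

```toml
# .cargo/config.toml — Cargo merges this with any config files found
# in parent directories and in ~/.cargo/ as it walks up the tree
[alias]
b = "build --release"   # `cargo b` now means `cargo build --release`

[registries.internal]   # "internal" and the URL are placeholders
index = "sparse+https://crates.example.com/index/"

[build]
target = "x86_64-unknown-linux-musl"   # default compile target for this tree
```

A crate nested below this directory inherits all of it unless a closer config file overrides a key.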
.mvn/maven.config holds per-project default CLI arguments (since Maven 3.9.0, each arg must be on its own line), and .mvn/jvm.config sets JVM options.

gradle.properties lives at both project and user level. Init scripts in ~/.gradle/init.d/ run before every build, which is how enterprises inject internal repository configurations across all projects.

auth.json keeps Composer credentials separate from composer.json (per-project or at ~/.composer/auth.json) so you can gitignore it.

nuget.config is XML searched hierarchically from the project directory up to the drive root, then at the user level. Like pip, malformed XML is silently ignored.

deno.json is both configuration and import map, controlling formatting, linting, test config, lock file behavior, and dependency imports in a single file. If you have a separate import_map.json, Deno reads that too, though the trend is toward folding everything into deno.json.

Brewfile is generated by brew bundle dump and consumed by brew bundle install, turning a system package manager into something you can commit to a repository. Brewfile.lock.json pins exact versions and records the macOS version the bundle was resolved on. I built the original version of brew bundle years ago, and it’s grown well beyond what I first put together.

Publishing

What gets included or excluded when you publish a package. People accidentally ship secrets and accidentally omit files they need in roughly equal measure.

.npmignore works like .gitignore but for npm pack and npm publish. If it doesn’t exist, npm falls back to .gitignore. But if you create an .npmignore, it completely replaces .gitignore for packaging purposes; they are not merged. This means patterns you had in .gitignore to keep .env files or credentials out of version control no longer protect you from publishing them.

npm-shrinkwrap.json is identical in format to package-lock.json but gets included inside published tarballs.
It’s the only npm lock file that travels with a published package, intended for CLI tools and daemons that want locked transitive dependencies for their consumers rather than letting the consumer’s resolver pick versions.

MANIFEST.in controls what goes into a Python source distribution using directives like include, exclude, recursive-include, graft, and prune. It only matters for sdists, not wheels.

.helmignore controls what gets excluded when packaging a Helm chart, following .gitignore syntax.

Workspaces

Monorepo topology and inter-package relationships. The JavaScript ecosystem has the most options here, which probably says something about the JavaScript ecosystem.

pnpm-workspace.yaml defines workspace membership with a packages: field. Where npm and Yarn put this in a workspaces field in package.json, pnpm requires a separate file.

lerna.json handles versioning and publishing across workspace packages, though Lerna’s remaining value is mostly the publishing workflow (changelogs, version bumps).

nx.json and turbo.json configure task pipelines and caching for Nx and Turborepo monorepo builds.

go.work (added in Go 1.18) lists use directives pointing to local module directories so you can develop across multiple modules without replace directives scattered through your go.mod files. It generates a companion go.work.sum checksum file.

settings.gradle / settings.gradle.kts declares all Gradle subprojects with include statements and is mandatory for multi-project builds. Maven uses a modules list in a parent pom.xml.

Overrides and resolution

When a transitive dependency has a bug or a security vulnerability and you can’t wait for every package in the chain to release an update, override files let you force a specific version or patch a package in place. Most developers don’t know these mechanisms exist and spend hours working around dependency conflicts that a single config line would fix.
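In npm’s case, that single config line is an overrides field in package.json; the package names and versions below are purely illustrative:

```json
{
  "name": "my-app",
  "dependencies": {
    "some-framework": "^2.0.0"
  },
  "overrides": {
    "lodash": "4.17.21"
  }
}
```

This forces every copy of lodash in the tree, however deep, to resolve to 4.17.21.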
In the JavaScript ecosystem, npm has overrides, Yarn has resolutions, and pnpm has pnpm.overrides, all fields in package.json that force specific versions of transitive dependencies. Yarn Berry and pnpm also support patching dependencies in place: Yarn’s patch: protocol stores diff files in .yarn/patches/, and pnpm’s pnpm.patchedDependencies references diffs in a patches/ directory, built into the workflow via pnpm patch and pnpm patch-commit.

.pnpmfile.cjs goes further than any of these: the readPackage hook lets you programmatically rewrite any package’s package.json at install time, and afterAllResolved can modify the lockfile after resolution. It’s the nuclear option for dependency problems, living next to the lockfile and running before anything gets installed.

constraints.txt is used via pip install -c constraints.txt to pin versions of packages without triggering their installation. It’s been available since pip 7.1, yet almost nobody uses it despite being exactly what large organizations need for base image management and reproducible environments. uv has override-dependencies in [tool.uv] for the same purpose with better ergonomics.

Directory.Packages.props is worth knowing about if you work in .NET. NuGet’s Central Package Management (6.4+) lets you put a single file at the repo root that sets PackageVersion items for all projects, so individual .csproj files use PackageReference entries without version numbers. It eliminates version drift across large solutions and is one of the better implementations of centralized version management I’ve seen. Directory.Build.props can inject shared package references into all projects too.

gradle/libs.versions.toml is Gradle’s version catalog, with sections for [versions], [libraries], [bundles], and [plugins], referenced in build files as typed accessors like libs.someLibrary.

cabal.project supports constraints: stanzas for pinning transitive Haskell deps, and cabal.project.freeze locks everything down.
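To make the constraints.txt mechanism concrete: it’s a pip requirements-style file whose pins apply only when something else requests the package (the versions below are arbitrary examples):

```text
# constraints.txt — applied with: pip install -r requirements.txt -c constraints.txt
# nothing listed here is installed outright; a pin binds only if the package
# is pulled in by a requirement or a transitive dependency
urllib3==2.2.3
certifi==2024.8.30
```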
Vendoring and integrity

Beyond lockfiles, some package managers support vendoring all dependency source code into the repository and tracking its integrity.

.cargo-checksum.json lives in each vendored crate directory after running cargo vendor, containing the SHA256 of the original tarball and per-file checksums. If you need to patch vendored source (which you sometimes do for air-gapped builds), setting "files": {} in the checksum file disables integrity checking for that crate, which is the known workaround and also completely defeats the purpose of the checksums.

GONOPROXY and GONOSUMDB (usually set together via the GOPRIVATE shorthand) are Go environment variables that bypass the module proxy and checksum database for private modules, which is how enterprises use Go modules without leaking internal module paths to Google’s infrastructure.

Go’s vendor/modules.txt (generated by go mod vendor) lists vendored packages and their module versions, and the Go toolchain verifies it matches go.mod. If your repo has a vendor/ directory and go.mod specifies Go 1.14+, vendoring is automatically enabled without any flag, which surprises people who have a stale vendor directory they forgot about.

.yarn/cache/ and .pnp.cjs make up Yarn Berry’s zero-install setup: compressed zip archives of every dependency and the Plug’n’Play loader mapping package names to zip locations, both committed to version control. After git clone, the project works without running yarn install, though your repository size will grow substantially.

.terraform.lock.hcl records Terraform provider version locks with platform-specific hashes, which means a lock file generated on macOS may fail verification on Linux CI unless you’ve run terraform providers lock for multiple platforms.

Hooks and scripts

Lifecycle scripts that run during install, build, or publish. Supply chain attacks often hide here, but so does a lot of useful automation.

.pnpmfile.cjs isn’t just for overrides.
pnpm’s hooks API includes readPackage for rewriting manifests, afterAllResolved for modifying the resolved lockfile, and custom fetchers for alternative package fetching logic.

.yarn/plugins/ contains committed plugin files that hook into Yarn Berry’s lifecycle. .yarn/sdks/ holds editor integration files generated by @yarnpkg/sdks to make PnP work with IDEs.

.mvn/extensions.xml loads Maven extensions that hook into the build lifecycle before anything else runs. Gradle’s init scripts in ~/.gradle/init.d/ execute before every build and can inject repositories, apply plugins, or configure all projects.

Cargo’s build.rs is a build script that runs before compilation, generating code, linking native libraries, or setting cfg flags. Go’s //go:generate directives in source files run via go generate for code generation, though they’re not part of the build itself.

I’ll keep updating this post as I find more. If you know of package manager magic files I’ve missed or have corrections, reach out on Mastodon or submit a pull request on GitHub.

AI And The Ship of Theseus

Because code gets cheaper and cheaper to write, this includes re-implementations. I mentioned recently that I had an AI port one of my libraries to another language, and it ended up choosing a different design for that implementation. In many ways the functionality was the same, but the path it took to get there was different. The way that port worked was by going via the test suite.

Something related, but different, happened with chardet. The current maintainer reimplemented it from scratch by pointing an agent at only the API and the test suite. The motivation: enabling relicensing from LGPL to MIT. I personally have a horse in the race here because I too wanted chardet to be under a non-GPL license for many years. So consider me a very biased person in that regard.

Unsurprisingly, that new implementation caused a stir. In particular, Mark Pilgrim, the original author of the library, objects to the new implementation and considers it a derived work. The new maintainer, who has maintained it for the last 12 years, considers it a new work, and instructed his coding agent to create precisely that. According to the author, who validated the result with JPlag, the new implementation is distinct. If you actually consider how it works, that’s not too surprising. It’s significantly faster than the original implementation, supports multiple cores, and uses a fundamentally different design.

What I think is more interesting about this question is the consequences of where we are. Copyleft licenses like the GPL depend heavily on copyright and friction for enforcement. But because the code is fundamentally in the open, with or without tests, you can trivially rewrite it these days. I myself have been intending to do this for a little while now with some other GPL libraries. In particular, I started a re-implementation of readline a while ago for similar reasons, because of its GPL license. There is an obvious moral question here, but that isn’t necessarily what I’m interested in.
Just as GPL software might re-emerge as MIT software, so might proprietary abandonware. For me personally, what is more interesting is that we might not be able to copyright these creations at all. A court might still rule that all AI-generated code is in the public domain, because there was not enough human input in it. That’s quite possible, though probably not very likely. But this all causes some interesting new developments we are not necessarily ready for. Vercel, for instance, happily re-implemented bash with Clankers but got visibly upset when someone re-implemented Next.js in the same way.

There are huge consequences to this. When the cost of generating code goes down that much, and we can re-implement software from test suites alone, what does that mean for the future of software? Will we see a lot of software re-emerging under more permissive licenses? Will we see a lot of proprietary software re-emerging as open source? Will we see a lot of software re-emerging as proprietary? It’s a new world, and we have very little idea of how to navigate it.

In the interim we will have some fights about copyright, but I have the feeling very few of those will go to court, because everyone involved will be somewhat scared of setting a precedent. In the GPL case, though, I think it warms up some old fights about copyleft vs. permissive licenses that we have not seen in a long time. It probably does not feel great to have one’s work rewritten with a Clanker and one’s authorship eradicated.

Unlike the Ship of Theseus, though, this seems more clear-cut: if you throw away all the code and start from scratch, even if the end result behaves the same, it’s a new ship. It only continues to carry the name. Which may be another argument for why authors should hold on to trademarks rather than rely on licenses and contract law. I personally think all of this is exciting.
I’m a strong supporter of putting things in the open with as little license enforcement as possible. I think society is better off when we share, and I consider the GPL to run against that spirit by restricting what can be done with it. This development plays into my worldview. I understand, though, that not everyone shares that view, and I expect more fights over the emergence of slopforks as a result. After all, it combines two very heated topics, licensing and AI, in the worst possible way.

★ Thoughts and Observations on the MacBook Neo

$599. Not a piece of junk. That’s not a marketing slogan from Apple for the new MacBook Neo. But it could be. And it is the underlying message of the product. For a few years now, Apple has quietly dabbled with the sub-$1,000 laptop market, by selling the base configuration of the M1 MacBook Air — a machine that debuted in November 2020 — at retailers like Walmart for under $700. But dabbling is the right word. Apple has never ventured under the magic $999 price point for a MacBook available in its own stores. As of today, they’re not just in the sub-$1,000 laptop market, they’re going in hard. The MacBook Neo is a very compelling $600 laptop, and for just $100 more, you get a configuration with Touch ID and double the storage (512 GB instead of 256). You can argue that all MacBooks should have Touch ID. My first answer to that is “$599”. My second answer is “education”. Touch ID doesn’t really make sense for laptops shared by kids in a school. And with Apple’s $100 education pricing discount, the base MacBook Neo, at $499, is half the price of the base M5 MacBook Air ($1099 retail, $999 education). Half the price. I’m writing this from Apple’s hands-on “experience” in New York, amongst what I’d estimate as a few hundred members of the media. It’s a pretty big event, and a very big space inside some sort of empty warehouse on the western edge of Chelsea. Before playing the four-minute Neo introduction video (which you should watch — it’s embedded in Apple’s Newsroom post), John Ternus took the stage to address the audience. He emphasized that the Mac user base continues to grow, because “nearly half of Mac buyers are new to the platform”. Ternus didn’t say the following aloud, but Apple clearly knows what has kept a lot of would-be switchers from switching, and it’s the price. The Mac Mini is great, but normal people only buy laptops, and aside from the aforementioned dabbling with the five-year-old M1 MacBook Air, Apple just hasn’t ventured under $999. 
“We don’t ship junk,” Steve Jobs said back in 2007. It’s not that Apple never noticed the demand for laptops in the $500–700 range. It’s that they didn’t see how to make one that wasn’t junk. Now they have. And the PC world should take note. One of my briefings today included a side-by-side comparison between a MacBook Neo and an HP 14-inch laptop “in the same price category”. It was something like this one, with an Intel Core 5 chip, which costs $550. The HP’s screen sucks (very dim, way lower resolution), the speakers suck, the keyboard sucks, and the trackpad sucks. It’s a thick, heavy, plasticky piece of junk. I didn’t put my nose to it, but I wouldn’t be surprised if it smells bad. The MacBook Neo looks and feels every bit like a MacBook. Solid aluminum. Good keyboard (no backlighting, but supposedly the same mechanism as in other post-2019 MacBooks — felt great in my quick testing). Good trackpad (no Force Touch — it actually physically clicks, but you can click anywhere, not just the bottom). Good bright display (500 nits max, same as the MacBook Air). Surprisingly good speakers, in a new side-firing configuration. Without even turning either laptop on, you can just see and feel that the MacBook Neo is a vastly superior device. And when you do turn them on, you see the vast difference in display quality and hear the vast difference in speaker quality. And you get MacOS, not Windows, which, even with Tahoe, remains the quintessential glass of ice water in hell for the computer industry. I came into today’s event experience expecting a starting price of $799 for the Neo — $300 less than the new $1,099 price for the base M5 MacBook Air (which, in defense of that price, starts with 512 GB storage). $599 is a fucking statement. Apple is coming after this market. I think they’re going to sell a zillion of these things, and “almost half” of new Mac buyers being new to the platform is going to become “more than half”. 
The MacBook Neo is not a footnote or hobby, or a pricing stunt to get people in the door before upselling them to a MacBook Air. It’s the first major new Mac aimed at the consumer market in the Apple Silicon era. It’s meant to make a dent — perhaps a minuscule dent in the universe, but a big dent in the Mac’s share of the overall PC market. Miscellaneous Observations It’s worth noting that the Neo is aptly named. It really is altogether new. In that way it’s the opposite of the five-year-old M1 MacBook Air that Apple had been selling through retailers like Walmart and Amazon. Rather than selling something old for a lower price, they’ve designed and engineered something new from the ground up to launch at a lower price. It’s an all-new trackpad. It’s a good but different display than the Air’s — slightly smaller (13.0 inches vs. 13.6) and supporting only the sRGB color gamut, not P3. If you know the difference between sRGB and P3, the Neo is not the MacBook you want. What Neo buyers are going to notice is that the display looks good and is just as bright as the Air’s — and it looks way better, way sharper, and way brighter than the criminally ugly displays on PC laptops in this price range. Even the Apple logo on the back of the display lid is different. Rather than make it polished and shiny, it’s simply embossed. Save a few bucks here, a few bucks there, and you eventually grind your way to a new MacBook that deserves the name “MacBook” but starts at just $600. But of course there are trade-offs. You can use Apple’s Compare page to see the differences between the Neo and Air (and, for kicks, the 2020 M1 Air that until now was still being sold at Walmart). Even better, over at 512 Pixels Stephen Hackett has assembled a concise list of the differences between the MacBook Neo and MacBook Air. All of these things matter, but none of these things are dealbreakers for a $500-700 MacBook. These trade-offs are extremely well-considered on Apple’s part. 
I’ll call out one item from Hackett’s 17-item list in particular: One of the two USB-C ports is limited to USB 2.0 speeds of just 480 Mb/s. On the one hand, this stinks. It just does. The two ports look exactly the same — and neither is labeled in any way — but they’re different. But on the other hand, the Neo is the first product with an A-series chip that Apple has ever made that supports two USB ports.¹ It was, I am reliably informed by Apple product marketing folks, a significant engineering achievement to get a second USB port at all on the MacBook Neo while basing it on the A18 Pro SoC. And while the ports aren’t labeled, if you plug an external display into the “wrong” port, you’ll get an on-screen notification suggesting you plug it into the other port. That this second USB-C port is USB 2.0 is not great, but it is fine.

Other notes: I think the “fun-ness” of the Neo colors was overstated in the rumor mill. But the “blush” color is definitely pink, “citrus” is definitely yellow, and “indigo” is definitely blue. No confusing any of them with shades of gray.

The keyboards are color-matched. At a glance it’s easy to think the keyboards are all white, but only on the silver Neo are the key caps actually white. The others are all slightly tinted to match the color of the case. Nice!

8 GB of RAM is not a lot, but with Apple Silicon it really is enough for typical consumer productivity apps. (If they update the Neo annually and next year’s model gets the A19 Pro, it will move not to 16 GB of RAM but 12 GB.)

It’s an interesting coincidence that the base models for the Neo and iPhone 17e both cost $600. For $1,200 you can buy a new iPhone and a new MacBook for just $100 more than the price of the base model M5 MacBook Air. (And the iPhone 17e is the one with the faster CPU.)

To consider the spread of Apple’s market segmentation, and how the Neo expands it, think about the fact that on the premium side, the 13-inch iPad Pro Magic Keyboard costs $350. That’s a keyboard with a trackpad and a hinge. You can now buy a whole damn 13-inch MacBook Neo — which includes a keyboard, trackpad, and hinge, along with a display and speakers and a whole Macintosh computer — for just $250 more.

¹ Perhaps the closest Apple had ever come to an A-series-chip product with two ports was the original iPad from 2010, which in late prototypes had two 30-pin connectors — one on the long side and another on the short side — so that you could orient it either way in the original iPad keyboard dock. ↩︎

Studio Display vs. Studio Display XDR

Not sure if this page was there yesterday, but the main “Displays” page at Apple’s website is a spec-by-spec comparison between the regular and XDR models. Nice. ★

Compatibility Notes on the New Studio Displays

Juli Clover, at MacRumors, notes that neither the new Studio Display nor the Studio Display XDR is compatible with Intel-based Macs. (I’m curious why.) Also, in a separate report, she notes that Macs with any M1 chip, or the base M2 or M3, can only drive the Studio Display XDR at 60 Hz. You need a Pro or better M2/M3, or any M4 or M5 chip, to drive it at 120 Hz.  ★

‘In Other Words, Batman Has Become Superman and Robin Has Become Batman’

Jason Snell, Six Colors: Here’s the backstory: With every new generation of Apple’s Mac-series processors, I’ve gotten the impression from Apple execs that they’ve been a little frustrated with the perception that their “lesser” efficiency cores were weak sauce. I’ve lost count of the number of briefings and conversations I’ve had where they’ve had to go out of their way to point out that, actually, the lesser cores on an M-series chip are quite fast on their own, in addition to being very good at saving power! Clearly they’ve had enough of that, so they’re changing how those cores are marketed to emphasize their performance, rather than their efficiency.  ★

Package Managers Need to Cool Down

This post was requested by Seth Larson, who asked if I could do a breakdown of dependency cooldowns across package managers. His framing: all tools should support a globally-configurable exclude-newer-than= like 7d, to bring the response times for autonomous exploitation back into the realm of human intervention. When an attacker compromises a maintainer’s credentials or takes over a dormant package, they publish a malicious version and wait for automated tooling to pull it into thousands of projects before anyone notices.

William Woodruff made the case for dependency cooldowns in November 2025, then followed up with a redux a month later: don’t install a package version until it’s been on the registry for some minimum period, giving the community and security vendors time to flag problems before your build pulls them in. Of the ten supply chain attacks he examined, eight had windows of opportunity under a week, so even a modest cooldown of seven days would have blocked most of them from reaching end users.

The concept goes by different names depending on the tool (cooldown, minimumReleaseAge, stabilityDays, exclude-newer), and implementations vary in whether they use rolling durations or absolute timestamps, whether they cover transitive dependencies or just direct ones, and whether security updates are exempt. But the adoption over the past year has been remarkably fast.

JavaScript

The JavaScript ecosystem moved on this faster than anyone else, with pnpm shipping minimumReleaseAge (expressed in minutes) in version 10.16 in September 2025, covering both direct and transitive dependencies with a minimumReleaseAgeExclude list for packages you trust enough to skip. Yarn shipped npmMinimalAgeGate in version 4.10.0 the same month (also in minutes, with npmPreapprovedPackages for exemptions), then Bun added minimumReleaseAge in version 1.3 in October 2025 via bunfig.toml. npm took longer but shipped min-release-age in version 11.10.0 in February 2026.
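As a sketch of the pnpm variant — assuming the pnpm-workspace.yaml settings location, with a made-up scope in the exclude list:

```yaml
# pnpm-workspace.yaml
minimumReleaseAge: 10080        # expressed in minutes; 10080 = 7 days
minimumReleaseAgeExclude:
  - "@my-org/*"                 # packages trusted enough to skip the cooldown
```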
Deno has --minimum-dependency-age for deno update and deno outdated. That’s five package managers in six months; I can’t think of a precedent for such coordinated feature adoption across competing tools.

Python

uv has had --exclude-newer for absolute timestamps since early on and added relative duration support (e.g. 1 week, 30 days) in version 0.9.17 in December 2025, along with per-package overrides via exclude-newer-package. pip shipped --uploaded-prior-to in version 26.0 in January 2026, though it only accepts absolute timestamps and there’s an open issue about adding relative duration support.

Ruby

Bundler and RubyGems have no native cooldown support, but gem.coop, a community-run gem server, launched a cooldowns beta that enforces a 48-hour delay on newly published gems served from a separate endpoint. Pushing the cooldown to the index level rather than the client is interesting because any Bundler user pointed at the gem.coop endpoint gets cooldowns without changing their tooling or workflow at all.

Rust, Go, PHP, .NET

These ecosystems are still in the discussion phase. Cargo has an open issue, and in the meantime there’s cargo-cooldown, a third-party wrapper that enforces a configurable cooldown window on developer machines as a proof of concept (CI pipelines are expected to keep using plain Cargo against committed lockfiles). Go has an open proposal for go get and go mod tidy, Composer has two open issues, and NuGet has an open issue, though .NET projects using Dependabot already get cooldowns on the update-bot side since Dependabot expanded NuGet support in July 2025.

Dependency update tools

Renovate has had minimumReleaseAge (originally called stabilityDays) for years, long before the rest of the ecosystem caught on, adding a “pending” status check to update branches until the configured time has passed.
Mend Renovate 42 went a step further and made a 3-day minimum release age the default for npm packages in their "best practices" config via the security:minimumReleaseAgeNpm preset, making cooldowns opt-out rather than opt-in for their users. Dependabot shipped cooldowns in July 2025 with a cooldown block in dependabot.yml supporting default-days and per-semver-level overrides (semver-major-days, semver-minor-days, semver-patch-days), with security updates bypassing the cooldown. Snyk takes the most aggressive stance with a built-in, non-configurable 21-day cooldown on automatic upgrade PRs. npm-check-updates added a --cooldown parameter that accepts duration suffixes like 7d or 12h.

Checking your config

zizmor added a dependabot-cooldown audit rule in version 1.15.0 that flags Dependabot configs missing cooldown settings or with insufficient cooldown periods (default threshold: 7 days), with auto-fix support. StepSecurity offers a GitHub PR check that fails PRs introducing npm packages released within a configurable cooldown period. OpenRewrite has an AddDependabotCooldown recipe for automatically adding cooldown sections to Dependabot config files. For GitHub Actions specifically, pinact added a --min-age flag, and prek (a Rust reimplementation of pre-commit) added --cooldown-days.

Gaps

Cargo, Go, Bundler, and Composer don't have native cooldown support yet (and pip's --uploaded-prior-to only takes absolute timestamps), which means you're relying on Dependabot or Renovate to enforce the delay. That covers automated updates, but nothing stops someone from running cargo update or bundle update or go get locally and pulling in a version that's been on the registry for ten minutes. I couldn't find any cooldown discussion at all for Maven, Gradle, Swift Package Manager, Dart's pub, or Elixir's Hex; if you know of one, let me know and I'll update this post.
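For those ecosystems, the Dependabot cooldown block described above is the main lever. A minimal sketch using the documented keys (the day counts and ecosystem are illustrative):

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "cargo"
    directory: "/"
    schedule:
      interval: "daily"
    cooldown:
      default-days: 7        # fallback for anything not matched below
      semver-major-days: 14  # wait longer on major bumps
      semver-minor-days: 7
      semver-patch-days: 3
```

Note that security updates bypass these delays, so a cooldown doesn't slow down patches for known vulnerabilities.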
The feature also goes by at least ten different configuration names across the tools that do support it (cooldown, minimumReleaseAge, min-release-age, npmMinimalAgeGate, exclude-newer, stabilityDays, uploaded-prior-to, min-age, cooldown-days, minimum-dependency-age), which makes writing about it almost as hard as configuring it across a polyglot project.

Maybe there’s a pattern here?

  1. It occurred to me that if I could invent a machine—a gun—which could by its rapidity of fire, enable one man to do as much battle duty as a hundred, that it would, to a large extent supersede the necessity of large armies, and consequently, exposure to battle and disease [would] be greatly diminished. Richard Gatling (1861)
  2. In 1923, Hermann Oberth published The Rocket to Planetary Spaces, later expanded as Ways to Space Travel. This showed that it was possible to build machines that could leave Earth’s atmosphere and reach orbit. He described the general principles of multiple-stage, liquid-fueled rockets, solar sails, and even ion drives. He proposed sending humans into space, building space stations and satellites, and travelling to other planets. The idea of space travel became popular in Germany. Swept up by these ideas, in 1927, Johannes Winkler, Max Valier, and Willy Ley formed the Verein für Raumschiffahrt (VfR) (Society for Space Travel) in Breslau (now Wrocław, Poland). This group rapidly grew to several hundred members. Several participated as advisors on Fritz Lang’s The Woman in the Moon, and the VfR even began publishing their own journal.

In 1930, the VfR was granted permission to use an abandoned ammunition dump outside Berlin as a test site and began experimenting with real rockets. Over the next few years, they developed a series of increasingly powerful rockets, first the Mirak line (which flew to a height of 18.3 m), then the Repulsor (>1 km). These people dreamed of space travel, and were building rockets themselves, funded by membership dues and a few donations. You can just do things. However, with the Great Depression and loss of public interest in rocketry, the VfR faced declining membership and financial problems. In 1932, they approached the army and arranged a demonstration launch. Though it failed, the army nevertheless offered a contract. After a tumultuous internal debate, the VfR rejected the contract. Nevertheless, the army hired away several of the most talented members, starting with a brilliant 19-year-old named Wernher von Braun. Following Hitler’s rise to power in January 1933, the army made an offer to absorb the entire VfR operation. They would work at modern facilities with ample funding, but under full military control, with all work classified and an explicit focus on weapons rather than space travel. The VfR’s leader, Rudolf Nebel, refused the offer and the VfR continued to decline. Launches ceased. In 1934, the Gestapo finally shut the VfR down, and civilian research on rockets was restricted. Many VfR members followed von Braun to work for the military. Of the founding members, Max Valier was killed in an accident in May 1930. Johannes Winkler joined the SS and spent the war working on liquid-fuel engines for military aircraft. Willy Ley was horrified by the Nazi regime and in 1935 forged some documents and fled to the United States, where he was a popular science author, seemingly the only surviving thread of the spirit of Oberth’s 1923 book. By 1944, V-2 rockets were falling on London and Antwerp.

  3. North Americans think the Wright Brothers invented the airplane.
Much of the world believes that credit belongs to Alberto Santos-Dumont, a Brazilian inventor working in Paris.

Though Santos-Dumont is often presented as an idealistic pacifist, this is hagiographic. In his 1904 book on airships, he suggests warfare as the primary practical use of airships, discussing applications for reconnaissance, destroying submarines, attacking ships, troop supply, and siege operations. As World War I began, he enlisted in the French army (as a chauffeur), but seeing planes used for increasing violence disturbed him. His health declined and he returned to Brazil. His views on military uses of planes seemed to shift. Though planes contributed to the carnage in WWI, he hoped that they might advance peace by keeping European violence from reaching the American continents. Speaking at a conference in the US in late 1915 or early 1916, he suggested: Here in the new world we should all be friends. We should be able, in case of trouble, to intimidate any European power contemplating war against any one of us, not by guns, of which we have so few, but by the strength of our union. […] Only a fleet of great aeroplanes, flying 200 kilometers an hour, could patrol these long coasts. Following the war, he appealed to the League of Nations to ban the use of planes as weapons and even offered a prize of 10,000 francs for whoever wrote the best argument to that effect. When the Brazilian revolution broke out in 1932, he was horrified seeing planes used in fighting near his home, and asked a friend: Why did I make this invention which, instead of contributing to the love between men, turns into a cursed weapon of war? He died shortly thereafter, perhaps by suicide. A hundred years later, banning the use of planes in war is inconceivable.

  4. Gunpowder was invented many times in history. By the Chinese, yes, but also by the Greeks, the Hindus, the Arabs, the English, and the Germans. Humanity had no other explosives until 1847, when Ascanio Sobrero created nitroglycerin by combining nitric and sulphuric acid with a fat extract called glycerin.
Sobrero found it too volatile for use as an explosive and turned to medical uses. After a self experiment, he reported that ingesting nitroglycerin led to “a most violent, pulsating headache accompanied by great weakness of the limbs”. (He also killed his dog.) Eventually this led to the use of nitroglycerin for heart disease. Many tried and failed to reliably ignite nitroglycerin. In 1863, Alfred Nobel finally succeeded by placing a tube of gunpowder with a traditional fuse inside of the nitroglycerin. He put on a series of demonstrations blowing up enormous rocks. Certain that these explosives would transform mining and tunneling, he took out patents and started filling orders. The substance remained lethally volatile. There were numerous fatal accidents around the world. In 1867, Nobel discovered that combining nitroglycerin with diatomaceous earth produced a product that was slightly less powerful but vastly safer. His factories of “dynamite” (no relation) were soon producing thousands of tons a year. Nobel sent chemists to California who started manufacturing dynamite in a plant in what is today Golden Gate Park. By 1874 he had founded dynamite companies in more than ten countries and he was enormously rich.

In 1876, Nobel met Bertha Kinsky, who would become Bertha von Suttner, a celebrated peace activist (and winner of the 1905 Nobel Peace Prize). At their first meeting, she expressed concern about dynamite’s military potential. Nobel shocked her. No, he said, the problem was that dynamite was too weak. Instead, he wished to produce “a substance or invent a machine of such frightful efficacy for wholesale destruction that wars should thereby become altogether impossible”. It’s easy to dismiss this as self-serving. But dynamite was used overwhelmingly for construction and mining. Nobel did not grow rich by selling weapons. He was disturbed by dynamite’s use in Chicago’s 1886 Haymarket bombing. After being repeatedly betrayed and swindled, he seemed to regard the world of money with a kind of disgust. At heart, he seemed to be more inventor than businessman. Still, the common story that Nobel was a closet pacifist is also hagiography. He showed little concern when both sides used dynamite in the 1870–1871 Franco-Prussian War. In his later years, he worked on developing munitions and co-invented cordite, remarking that they were “rather fiendish” but “so interesting as purely theoretical problems”. Simultaneously, he grew interested in peace. He repeatedly suggested that Europe try a sort of one-year cooling-off period. He even hired a retired Turkish diplomat as a kind of peace advisor. Eventually, he concluded that peace required an international agreement to act against any aggressor. When Bertha’s 1889 book Lay Down Your Arms became a rallying cry, Nobel called it a masterpiece. But Nobel was skeptical. He made only small donations to her organization and refused to be listed as a sponsor of a pacifist congress. Instead, he continued to believe that peace would come through technological means, namely more powerful weapons.
If explosives failed to achieve this, he told a friend, a solution could be found elsewhere: A mere increase in the deadliness of armaments would not bring peace. The difficulty is that the action of explosives is too limited; to overcome this deficiency war must be made as deadly for all the civilians back home as for the troops on the front lines. […] War will instantly stop if the weapon is bacteriology.

  5. I’m a soldier who was tested by fate in 1941, in the very first months of that war that was so frightening and fateful for our people. […] On the battlefield, my comrades in arms and I were unable to defend ourselves. There was only one of the legendary Mosin rifles for three soldiers. […] After the war, I worked long and very hard, day and night, labored at the lathe until I created a model with better characteristics. […] But I cannot bear my spiritual agony and the question that repeats itself over and over: If my automatic deprived people of life, am I, Mikhail Kalashnikov, ninety-three years of age, son of a peasant woman, a Christian and of Orthodox faith, guilty of the deaths of people, even if of enemies? For twenty years already, we have been living in a different country. […] But evil is not subsiding. Good and evil live side by side, they conflict, and, what is most frightening, they make peace with each other in people’s hearts. Mikhail Kalashnikov (2012)

  6. In 1937 Leo Szilárd fled Nazi Germany, eventually ending up in New York where—with no formal position—he did experiments demonstrating that uranium could likely sustain a chain reaction of neutron emissions. He immediately realized that this meant it might be possible to create nuclear weapons. Horrified by what Hitler might do with such weapons, he enlisted Einstein to write the 1939 Einstein–Szilárd letter, which led to the creation of the Manhattan Project. Szilárd himself worked for the project at the Metallurgical Laboratory at the University of Chicago.
On June 11, 1945, as the bomb approached completion, Szilárd co-signed the Franck report: Nuclear bombs cannot possibly remain a “secret weapon” at the exclusive disposal of this country, for more than a few years. The scientific facts on which their construction is based are well known to scientists of other countries. Unless an effective international control of nuclear explosives is instituted, a race of nuclear armaments is certain to ensue. […] We believe that these considerations make the use of nuclear bombs for an early, unannounced attack against Japan inadvisable. If the United States would be the first to release this new means of indiscriminate destruction upon mankind, she would sacrifice public support throughout the world, precipitate the race of armaments, and prejudice the possibility of reaching an international agreement on the future control of such weapons. On July 16, 1945, the Trinity test achieved the first successful detonation of a nuclear weapon. The next day, he circulated the Szilárd petition: We, the undersigned scientists, have been working in the field of atomic power. Until recently we have had to fear that the United States might be attacked by atomic bombs during this war and that her only defense might lie in a counterattack by the same means. Today, with the defeat of Germany, this danger is averted and we feel impelled to say what follows: The war has to be brought speedily to a successful conclusion and attacks by atomic bombs may very well be an effective method of warfare. We feel, however, that such attacks on Japan could not be justified, at least not unless the terms which will be imposed after the war on Japan were made public in detail and Japan were given an opportunity to surrender. […] The development of atomic power will provide the nations with new means of destruction. 
The atomic bombs at our disposal represent only the first step in this direction, and there is almost no limit to the destructive power which will become available in the course of their future development. Thus a nation which sets the precedent of using these newly liberated forces of nature for purposes of destruction may have to bear the responsibility of opening the door to an era of devastation on an unimaginable scale. […] In view of the foregoing, we, the undersigned, respectfully petition: first, that you exercise your power as Commander-in-Chief, to rule that the United States shall not resort to the use of atomic bombs in this war unless the terms which will be imposed upon Japan have been made public in detail and Japan knowing these terms has refused to surrender; second, that in such an event the question whether or not to use atomic bombs be decided by you in the light of the consideration presented in this petition as well as all the other moral responsibilities which are involved. President Truman declined to adopt this recommendation.

Apple Announces Updated Studio Display and All-New Studio Display XDR

Apple Newsroom: Apple today announced a new family of displays engineered to pair beautifully with Mac and meet the needs of everyone, from everyday users to the world’s top pros. The new Studio Display features a 12MP Center Stage camera, now with improved image quality and support for Desk View; a studio-quality three-microphone array; and an immersive six-speaker sound system with Spatial Audio. It also now includes powerful Thunderbolt 5 connectivity, providing more downstream connectivity for high-speed accessories or daisy-chaining displays. The all-new Studio Display XDR takes the pro display experience to the next level. Its 27-inch 5K Retina XDR display features an advanced mini-LED backlight with over 2,000 local dimming zones, up to 1000 nits of SDR brightness, and 2000 nits of peak HDR brightness, in addition to a wider color gamut, so content jumps off the screen with breathtaking contrast, vibrancy, and accuracy. With its 120Hz refresh rate, Studio Display XDR is even more responsive to content in motion, and Adaptive Sync dynamically adjusts frame rates for content like video playback or graphically intense games. Studio Display XDR offers the same advanced camera and audio system as Studio Display, as well as Thunderbolt 5 connectivity to simplify pro workflow setups. The new Studio Display with a tilt-adjustable stand starts at $1,599, and Studio Display XDR with a tilt- and height-adjustable stand starts at $3,299. Both are available in standard or nano-texture glass options, and can be pre-ordered starting tomorrow, March 4, with availability beginning Wednesday, March 11. Compared to the first-generation Studio Display (March 2022), the updated model really just has a better camera. (Wouldn’t take much to improve upon the old camera.) The Studio Display XDR is the interesting new one. Apple doesn’t seem to have a “Compare” page for its displays, so the Studio Display Tech Specs and Studio Display XDR Tech Specs pages will have to suffice. 
The regular Studio Display maxes out at 600 nits, and only supports a refresh rate of 60 Hz. The Studio Display XDR maxes out at 1,000 nits for SDR content and 2,000 nits for HDR, with up to 120 Hz refresh rate. Nice, but not enough to tempt me to upgrade from my current Studio Display with nano-texture, which I never seem to run at maximum brightness. I guess it would be nice to see HDR content, but not nice enough to spend $3,600 to get one with nano-texture. And I don’t think I care about 120 Hz on my Mac? Unresolved is what this means for the Pro Display XDR, which remains unchanged since its debut in 2019. Update: Whoops, apparently this has been resolved. A small-print note on the Newsroom announcement states: Studio Display XDR replaces Pro Display XDR and starts at $3,299 (U.S.) and $3,199 (U.S.) for education.  ★