> I have a Mac, not an XDG desktop. I would only expect and want X applications I run through Xquartz (all zero of them) to follow that.
XDG has nothing to do with X11. XDG stands for "Cross-Desktop Group," and is designed specifically for any Unix or Unix-like operating system, which includes macOS.
XDG stands for X Desktop Group. It absolutely does not stand for Cross Desktop Group, and it has nothing to do with macOS or Windows, outside of the aforementioned X apps running on Quartz via XQuartz, which as far as I know is completely dead.
the successor to xdg, freedesktop.org, however, is acknowledging the need for cross-platform openness. that's exactly why you can indeed configure where the three main "stores" of compliant applications live: their config, their data and their caches.
you can point them to %APPDATA%..., ~/Library or the Linux defaults.
my point in this is: there are free and open conventions, and we wouldn't need this "my HOME is cluttered" fuss if technical teams would embrace them.
so why don't they respect XDG_ env vars for their config and data?
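to make that concrete, here's a rough sketch in Python of the lookup i mean. the app name and the macOS/Windows fallbacks are just my illustrative assumptions (the XDG spec only defines the env var and the ~/.config default), not anything it mandates:

```python
import os
import sys
from pathlib import Path

def config_home(app_name: str) -> Path:
    """Resolve a per-user config directory, preferring the XDG convention.

    `app_name` is a hypothetical application name used only for illustration.
    """
    # Honor XDG_CONFIG_HOME if the user has set it...
    xdg = os.environ.get("XDG_CONFIG_HOME")
    if xdg:
        return Path(xdg) / app_name

    # ...otherwise fall back to platform conventions (my assumption about
    # what a cross-platform-friendly app might do, not part of the spec).
    if sys.platform == "darwin":
        return Path.home() / "Library" / "Application Support" / app_name
    if os.name == "nt":
        appdata = os.environ.get("APPDATA", str(Path.home()))
        return Path(appdata) / app_name

    # Linux/BSD default from the spec: ~/.config
    return Path.home() / ".config" / app_name
```

a handful of lines, and nobody's HOME gets cluttered.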
I'm a lifelong musician, went to music school to study jazz and orchestration, was a professional film composer for 15 years prior to pivoting to programming. I've read quite a few books on the intersection of math and music.
And not once have I ever felt that these so-called intersections were anything other than contrived.
Of course we can interface with music from a mathematical perspective, but that doesn't mean that we should or that there's anything particularly illuminating to glean from doing so.
Beyond the very basic math (honestly even that's perhaps too strong a word -- just because something is expressed in numbers doesn't make it _math_) of time signatures and some harmonic concepts up to maybe some of Slonimsky's work, doing so is IMO a fool's errand that exists only to fill space on a TEDx stage.
But here's the thing: learning Android dev is nothing like "learning" to use an LLM.
Obviously there are tons of tools and systems building up around LLMs, and I don't intend to minimize that, but at the end of the day, an LLM is more analogous to a tool such as an IDE than a programming language. And I've never seen a job posting that dictated one must have X number of years in Y IDE; if they exist, they're rare, and it's hardly a massive hill to climb.
Sure, there's a continuum with regards to the difficulty of picking up a tool, e.g. learning a new editor is probably easier than learning, say, git. But learning git still has nothing on learning a whole tech stack.
I was very against LLM-assisted programming, but over time my position has softened, and Claude Code has become a regular part of my workflow. I've begun expanding out into the ancillary tools that interact with LLMs, and it's...not at all difficult to pick up. It's nothing like, say, learning iOS development. It's more like learning how to configure Neovim.
In fact, isn't this precisely one of the primary value propositions of LLMs -- that non-technical people can pick up these tools with ease and start doing technical work that they don't understand? If non-technical folks can pick up Claude Code, why would it be even _kind_ of difficult for a developer to?
So, I'm with the post author here: what is there to get left behind _from_?
Not quite on topic, but as an engineering manager responsible for IDE development, I had to explain to recruiters and candidates that I wanted engineers who developed IDEs, not just used them. Unfortunately, that message never got through, so I saw many resumes claiming, say, 5 years of Eclipse experience, only to later determine the candidate knew nothing of the internals of an IDE.
Presumably, people now claim 3 years of machine learning experience but via ChatGPT prompting.
My theory is that they're going to release a new Mac Pro that's about half the size of the current one. Enough space for some PCIe slots, but otherwise smaller given the enormous amount of wasted space in that thing since moving from Intel to Apple Silicon. Guessing the rack-mount model, should they continue selling it, will be 3U or 4U instead of 5U.
I know everyone thinks they're going to just kill it, but I don't see it. Apple's move under Tim Cook has been to exhaust supplies (see: filling the Intel Mac Pro chassis with air and not updating the CPU), letting people predict its death (see: 2013 -> 2019 Mac Pro silence), and then redesigning it into something people want while utilizing it as an opportunity to segment specs across their SKUs.
The Studio will remain the high-powered creator machine, whereas the Mac Pro will be retooled into an AI beast.
The reason people buy the Studio with the high-RAM config is actually the unified memory. This is unique to Apple. I'm not sure what a Mac Pro would do with PCIe cards. They would be useless for AI, because what you want is unified memory that the GPU can use for AI work, not just plain RAM.
It's not entirely unique to Apple: the Ryzen AI Max platform (in, e.g., the Framework Desktop) is a unified-memory platform. The PlayStation 5 also has a unified memory architecture (which, given the chiplet was made by AMD, is not too surprising). (People sleep on PlayStation hardware engineering; they're far better at skating to where the puck is headed than most hardware tech companies. Remember Cell?)
> I'm not sure what a Mac Pro would do with PCIe cards.
Video and audio engineers [1] would like to have a word. Not to mention PCIe network cards. And they do use all the slots in the Cheese Grater, although I believe a modern version could have cut those in half.
PCIe cards would indeed be useless for AI unless Apple supports third-party GPUs, but there are certainly some pro creators that would still prefer to have them. I myself work in large-template film/game scoring and while we all love our Mac Studios, they're usually housed in a Sonnet chassis so that we can continue to use PCIe cards. Had Apple kept them in parity with the Studio w/r/t CPU and RAM, the rack-mount version of the Pro would've been a no-brainer.
It is already a walking zombie, Apple clearly no longer cares about the workstation market, regardless of how many "I still believe" t-shirts get sold to wear at WWDC.
The article specifically talks about B2B and MDM-like features. The "average consumer" isn't the point here -- rather, governments, defense, high-security corporations, etc.
I always try to keep in mind that we typically think of software as having three versions -- alpha, beta, and release -- before it's considered even kind of "finished."
In my own work, this often looks like writing the quick and dirty version (alpha), then polishing it (beta), then rewriting it from scratch with all the knowledge gained along the way (release).
The trick is to not get caught up on the beta. It's all too tempting to chase perfection too early.