Hacker News | instig007's comments

> clojure.spec

Tape-patching self-inflicted language design issues isn't innovation, lol


> lol

Joke's on you. You seem to be so invested in moving in a single direction that you developed "an expert blind spot". Have you ever thought that it's possible that the knowledge you've so far "accumulated" has become an obstacle to seeing simpler or orthogonal ideas clearly?

Every type system, schema library, and validation tool in every language is in some sense "patching" the lack of built-in guarantees. Haskell's typeclasses patch the lack of ad-hoc polymorphism. Rust's borrow checker patches the lack of memory safety. Python's type hints patch the lack of static types. You can retroactively frame any additive language feature as patching a prior omission - it's not an argument, it's a framing choice.

Spec isn't even so much about patching - it's about runtime generative testing, instrumentation, and data specification in a dynamic context where static types would be the wrong tool anyway. That's a genuine design space with genuine ideas in it, regardless of whether you like dynamic typing. You just can't see it, because you already have decided "isn't innovation, lol", etc.

One more reason to love the language is its community. I appreciate that Clojurians engage with diverse ideas from different tools and languages, freely borrowing the best ones without prejudice, owing to their deep and widespread understanding of language design. And they do it with a focus on pragmatism. That's something we could perhaps learn from them, even if we don't like the language and tools they make.


> Every type system, schema library, and validation tool in every language is in some sense "patching" the lack of built-in guarantees.

> Spec isn't even so much about patching - it's about runtime generative testing, instrumentation, and data specification in a dynamic context where static types would be the wrong tool anyway.

It's amazing what people can claim when they don't have to prove it. But I wonder: how exactly are your runtime generative tests different from the statically derived strategies I get via QuickCheck or Validity?

> And they do it with the focus on pragmatism

"pragmatism" is defined in terms of the values one desires to practice. I am in no position to argue that your and their desires don't exist, but please don't claim that their preference for transducers and schemas is somehow more pragmatic just because they ignored types and the effectful/pure evaluation distinction in their language philosophy.


I never claimed that Clojure (or transducers, etc) is "more pragmatic than Haskell", I said "Clojurians engage with diverse ideas pragmatically".

> how exactly does your runtime generative tests are different from statically derived strategies

Spec generators are derived from predicates, not types - which inverts the usual QuickCheck problem where Int generates any Int and you have to write newtypes or custom Gen instances to narrow to "ages 1-120." Spec also has :fn specs that assert relationships between args and return values, which base QuickCheck/Validity don't give you natively (you'd reach for Liquid Haskell). And `instrument` validates real calls in dev, not just sampled properties.
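For illustration, here's a hedged Python sketch of the "generator derived from a predicate" idea (the names `gen_such_that` and `gen_age` are made up for this sketch; they are not spec's or any real library's API):

```python
import random

def gen_such_that(pred, base_gen, max_tries=1000):
    # Derive a generator from a predicate by filtering a broad base
    # generator - roughly the spec s/gen + such-that approach.
    for _ in range(max_tries):
        v = base_gen()
        if pred(v):
            return v
    raise RuntimeError("predicate too narrow for the base generator")

def gen_age():
    # The "spec" is just the predicate; the generator falls out of it.
    return gen_such_that(lambda n: 1 <= n <= 120,
                         lambda: random.randint(-500, 500))
```

The point of the sketch: the predicate is the source of truth, and generation is derived from it, rather than a generator being attached to a nominal type.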

You seem to be operating on a single axiom: types + purity + laziness are the correct solution to the problems worth solving. Given that axiom, every Clojure design choice is, in your eyes, either (a) a patch for not having them, or (b) an unnecessary abstraction that falls out of having them. There is no version of reality in which Clojure can be credited with solving something for you, because the axiom forecloses it.

This is an unfalsifiable position, any additional technical arguments would be wasted. You don't even try to evaluate my counterexamples, because the axiom tells you the counterexamples must be wrong in some way you haven't yet articulated to yourself.

Okay, please, give me Haskell code that takes one composed transformation and applies it, unchanged, to a vector, a lazy seq, a channel, and a pure fold. Not 'here is pipes, here is conduit, here is streaming, here is foldl library' - one piece of code, four consumers. That's the thing you have dodged four times in the other thread.

Clojure didn't ignore static types or the pure/effectful distinction - it made a deliberate decision to optimize for different values. Framing deliberate trade-offs as ignorance is often itself a screaming display of ignorance.


> I said "Clojurians engage with diverse ideas pragmatically".

then you said nothing and contributed nothing to your points, as everybody else also "engages with diverse ideas pragmatically". It also happens that engaging allows for rejecting inferior ideas, which is what transducers are. I can compose around any Python iterable the same way you claim is important for transducers, but do you know what I lose if I engage with pragmatic Python and Clojure? I lose precision and further optimizations.

> You seem to be operating on a single axiom

How about you abstain from drawing wrong conclusions and actually focus on being precise?

> Spec generators are derived from predicates, not types

What do predicates operate on? QuickCheck builds bounded Ints within the type's `minBound` and `maxBound` as the basis for deriving an Int spec. There's no difference and no inversion of intent if the strategy for your newtype actually derives from the 1-120 range. If you say there's a thing called Age, a subset of the Int or Nat range, you define Age and its bounds as part of your spec, and there's zero inversion relative to what Clojure spec does. I'm beginning to suspect that I'm conversing with a prompt output.

> Okay, please, give me Haskell code that takes one composed transformation and applies it, unchanged, to a vector, a lazy seq, a channel, and a pure fold. Not 'here is pipes, here is conduit, here is streaming, here is foldl library' - one piece of code, four consumers. That's the thing you have dodged four times in the other thread.

Certainly, I'll do that as soon as you provide me with the example of a transducer tracking effects separately from pure evaluations. We want to be on the same page, don't we? I want to compose my effects without ambiguity, so hurry up.

> Clojure didn't ignore static types or the pure/effectful distinction - it made a deliberate decision to optimize for different values.

lol, it actually ignored it, but you're too perky to simply admit that, as if your future depends on it.


> QuickCheck builds bounded ints within their `minBound` and `maxBound`

Yeah, your narrow technical note isn't wrong here (I should've used a less trivial example), but the broader differences still hold - spec operating on map shapes without lifting data into types, arg/return relationships without reaching for Liquid Haskell, etc. This is a much longer discussion that requires its own thread, unrelated to transducers.

> I can compose around any Python iterable the same way

No, you can't. Python iterables are not uniform across strict collections, lazy sequences, async channels, and arbitrary reducing step functions. itertools composes over iterables. It does not compose over asyncio.Queue, a trio memory channel, or a user-supplied step function. Transducers are specifically about the reducing function as the point of composition, which decouples the transformation from whatever produces or consumes values. Python's iterator protocol is a narrower abstraction. Show me some Python code that applies one composed transformation, unchanged, to an iterable, an asyncio.Queue, and a user-defined reduce function. You can't, because the protocol doesn't support it. Congratulations, now you're complaining about a third language without properly understanding the topic at hand.
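To make the contrast concrete, here is a hedged Python sketch (illustrative names, not any library's API) of the difference in kind: itertools composes iterable-to-iterable, while a transducer-style transform composes step-function-to-step-function, so the consumer can be any reducing function at all:

```python
from functools import reduce

def doubling_evens(rf):
    # A step-function transformer: takes a reducing function `rf`
    # and returns a new reducing function that keeps evens, doubled.
    def step(acc, x):
        return rf(acc, x * 2) if x % 2 == 0 else acc
    return step

# Fed to a list-building step function...
as_list = reduce(doubling_evens(lambda a, x: a + [x]), range(10), [])
# ...or, unchanged, to a user-supplied `max` step function.
as_max = reduce(doubling_evens(max), range(10), 0)
```

Here `as_list` is `[0, 4, 8, 12, 16]` and `as_max` is `16` - the same transform drove two unrelated consumers, which is the property itertools' iterable-level composition doesn't give you.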

> provide me with the example of a transducer tracking effects separately from pure evaluations

Transducers aren't an effect-tracking system because Clojure doesn't track effects in types. Asking for a Clojure abstraction that tracks effects is like asking for a Haskell library that works without the type system. Clojure handles effects through different mechanisms, and transducers are orthogonal to effect tracking by design. Effect separation is not a goal of transducers, just as uniform multi-consumer application is not a goal of lazy evaluation.

> lol, it actually ignored it, but you're too perky

This is factually wrong. Hickey's talks (Effective Programs, Maybe Not, Clojure core.typed, the explicit refusal to adopt static types) are public, specific, and reasoned. You can absolutely disagree with the reasoning. Calling deliberate, argued design decisions "ignored" is either ignorance or dishonesty.

You're defending a hierarchy: typed+pure+lazy on top, everything else is a degraded attempt at the top. Within that hierarchy, Clojure can't contribute anything original, by definition - anything Clojure does either (a) duplicates what types+purity already give you, or (b) is a workaround for not having them.

I'm defending something subtler and harder to argue: that different design axes exist, that Clojure's choices are coherent given its axes, and that "this language's solution to X is a workaround for not having Y" is a framing choice, not a technical claim. I am also rhetorically disadvantaged, because "X is just a workaround" is a punchy dismissal and "X is a coherent choice within a different design space" is a paragraph.

I'm not perky about any of it, this isn't a Haskell vs. Clojure debate for specific use cases, you're arguing just for the sake of it. You're not learning, not probing ideas, not stress-testing your own position, nor are you giving me an opportunity for any of that on my side. Language-tribal arguments on HN are a genre, and you're writing in that genre. I hope you had fun, and please don't you dare call me "a prompt output" - I spent time and energy arguing in vain, about nothing; at least have the human decency to acknowledge that.


You get this for free in Haskell, and you also save on not having to remember useless terminology for something that has no application of its own outside Foldables anyway.

It goes beyond foldables - it can be applied to streams. Clojure had foldables, called reducers; this was generalized further when core.async came along - transducers can be attached to core.async channels and also used in places where reducers were used. The terminology documents the thing that various contexts accept (chan, into, sequence, eduction, etc.). They exist to make the language simpler and more general. They could actually allow a bunch of old constructs to be dispensed with, but came along too late to build the whole language around.

> It goes beyond a foldable, can be applied to streams.

> Clojure had foldables, called reducers, this was generalized further when core.async came along - transducers can be attached to core async channels and also used in places where reducers were used.

Ok, you mean there's a distinction between foldables and effectful and/or infinite streams, so there's a natural divide between them in terms of interfaces such as (for instance) `Foldable f` and `Stream f e`, where `e` is the effect context. It's a fair distinction; however, my overall point is that they all have applicability within the same kind of folding algorithms, which don't need a separate notion of "a composing object called a transducer" if you hop your Clojure practice onto a Haskell runtime where transformations are lazy by default.


>...you also save on not having to remember useless terminology...

It may be true in this particular case, but in my admittedly brief experience using Haskell you absolutely end up having to remember a hell of a lot of useless terminology for incredibly trivial things.


Terminology doesn't bother me nearly as much as people defining custom operators.

I used to think it was cute that you could make custom operators in Haskell, but as I've worked more with the language, I wish the community would just accept that "words" are actually a pretty useful tool.


> You get this for free in Haskell,

Oh, my favorite part of the orange site - that's why we come here, that's the 'meat of HN': language tribalism with a technical veneer. Congratulations: not only did you say something as lame as "French doesn't need the subjunctive mood because German has word order rules that already express uncertainty", you're also factually incorrect.

Haskell's laziness gives you fusion-like memory behavior on lists for free. But transducers solve a broader problem - portable, composable, context-independent transformations over arbitrary reducing processes - and that you don't get for free in Haskell either.

Transducers exist because Clojure is strict, has a rich collection library, and needed a composable abstraction over reducing processes that works uniformly across collections, channels, streams, and anything else that can be expressed as a step function. They're a solution to a specific problem in a specific context.

Haskell's laziness exists because the language chose non-strict semantics as a foundational design decision, with entirely different consequences - both positive (fusion, elegant expression of infinite structures) and negative (space leaks, reasoning difficulty about resource usage).


> Haskell's laziness gives you fusion-like memory behavior on lists for free.

Haskell's laziness and fusion aren't limited to lists: you can fuse any lawful composition of functions applied over data, given the required lawful instances for said composition. There's no difference from what transducers are designed for.

> But transducers solve a broader problem - portable, composable, context-independent transformations over arbitrary reducing processes - and that you don't get for free in Haskell either.

Transducers don't solve a broader problem; it's the same problem of reducing the complexity of your algorithms by eliminating transient data representations. If you think otherwise, I invite you to provide a practical example of the broader scope, especially the part about "context-independent transformations" that would differ from what Haskell provides you without that separate notion.

> and negative (space leaks, reasoning difficulty about resource usage).

which is mostly FUD spread by an internet crowd who don't know the basics of call-by-need semantics, such as where not to bind your intermediate evaluations, and which language constructs implicitly force evaluation for you.


> you can fuse any lawful composition of functions

each of those requires manually written rewrite rules or specific library support. It's not a universal property that falls out of laziness - it's careful engineering per data type. Transducers work over any reducing function by construction, not by optimization rules that may or may not fire.

> it's the same problem

It is not. Take a transducer like `(comp (filter odd?) (map inc) (take 5))`. You can apply this to a vector, a lazy seq, a core.async channel, or a custom step function you wrote five minutes ago. The transformation is defined once, independent of source and destination. In Haskell, fusing over a list is one thing. Applying that same composed transformation to a conduit, a streaming pipeline, an io-streams source, and a pure fold requires different code or different typeclass machinery for each. You can absolutely build this abstraction in Haskell (the foldl library gets close), but it's not free - it's a library with design choices, just like transducers are.
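A rough Python approximation of that transducer idea, with made-up names (`mapping`, `filtering`, `taking`, `comp` are illustrative, not any library's API), just to show one composed transformation feeding several consumers:

```python
from functools import reduce

def mapping(f):
    # step-function transformer: apply f to each input before the step
    return lambda rf: lambda acc, x: rf(acc, f(x))

def filtering(pred):
    # step-function transformer: skip inputs failing the predicate
    return lambda rf: lambda acc, x: rf(acc, x) if pred(x) else acc

def taking(n):
    # stateful transformer: pass through at most n inputs
    def xf(rf):
        left = [n]  # mutable counter captured by the step function
        def step(acc, x):
            if left[0] <= 0:
                return acc  # (real transducers also signal early termination)
            left[0] -= 1
            return rf(acc, x)
        return step
    return xf

def comp(*xfs):
    # right-to-left wrapping gives left-to-right data flow, as in Clojure
    def xf(rf):
        for t in reversed(xfs):
            rf = t(rf)
        return rf
    return xf

# analogue of (comp (filter odd?) (map inc) (take 5)), defined once
xform = comp(filtering(lambda x: x % 2 == 1),
             mapping(lambda x: x + 1),
             taking(5))

# consumer 1: pure fold into a list
into_list = reduce(xform(lambda a, x: a + [x]), range(100), [])
# consumer 2: pure fold into a sum
into_sum = reduce(xform(lambda a, x: a + x), range(100), 0)
# consumer 3: push-based sink standing in for a channel's put
sent = []
put = xform(lambda acc, x: (sent.append(x), acc)[1])
for v in range(100):
    put(None, v)
```

All three consumers see `2, 4, 6, 8, 10`; the transformation itself never changes, only the step function handed to it does.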

Your third claim is basically the "skill issue" defense. Both Haskell Simons - Marlow and Peyton Jones - as well as Edward Kmett have written and spoken about the difficulty of reasoning about space behavior in lazy Haskell. If the people who build the compiler and its core libraries acknowledge it as a real trade-off, dismissing it as FUD from people who "don't know the basics" is not an argument. It's gatekeeping.

Come on, how can you fail to see the difference between "Haskell can express similar things" and "Haskell gives you this for free"?


Why do you eliminate a library-based solution from the equation if it can actually prove the point that there's no difference in intent as long as my runtime is already lazy by default?

> It is not. Take a transducer like `(comp (filter odd?) (map inc) (take 5))`. You can apply this to a vector, a lazy seq, a core.async channel, or a custom step function you wrote five minutes ago. In Haskell, fusing over a list is one thing. Applying that same composed transformation to a conduit, a streaming pipeline, an io-streams source, and a pure fold requires different code or different typeclass machinery for each.

You can do that only because Clojure doesn't care whether the underlying iterable is processed by a side-effectful evaluation. That doesn't negate the fact that the underlying evaluation has a useless notion of "transducer". I said "fuse" in my previous comment to demonstrate that further compile-time optimisations are possible that eliminate some transient steps altogether. If you don't need that, you can just rely on generic lazy composition of functions, defined once over type class constraints.

`IsList` + `OverloadedLists` already exist. Had Haskell had a single type class for all iterable, implicitly side-effectful data, you would have gotten the same singly-written algorithm without a single notion of a transducer. Let that sink in: it's not the transducer that's useful, it's the differentiation between pure and side-effectful evaluations that allows your compiler to perform even better optimisations with out-of-order evaluation of pure stuff, as well as eliminating parts of the inner steps within the composed step function, as opposed to focusing just on the reducing step function during composition. It's not a useful abstraction to have if you care about better precision and the advanced optimisations that come from the ability to distinguish pure stuff from non-pure stuff.

Haskell aside, if your goal is just to compose reusable algorithms, a call-by-need runtime + currying + pointfree notation have you covered; you don't need a notion of transducers existing on their own (outside of foldable interfaces) to claim exactly the same benefits.

> Two Haskell Simons - Marlow, and Jones, and also Edward Kmett have all written and spoken about the difficulty of reasoning about space behavior in lazy Haskell.

There's a difference between what those people actually said and what the crowd claims they meant about laziness and space leaks. We can go over individual statements and see whether they hold the same "negative" meaning you say is there.


On IsList + OverloadedLists - this is a fantasy counterargument. A unified typeclass for side-effectful iterables doesn't exist in Haskell, so when you say "had it existed, you'd get the same thing", you're describing a different language.

Transducers don't exist despite the lack of a purity distinction; they exist because the reducing-step-function abstraction is useful regardless. You've drifted from "you get this for free" to "a different design with different trade-offs would make this unnecessary" - which, again, is just describing a different language.

Look, there are nice things in Haskell for sure, and there are things that may cause frustration as well. Same for Clojure, but comparing them on a single feature is like judging a bicycle and a boat by which one flies better. They're built on fundamentally different assumptions, and those assumptions cascade into every design decision. Transducers aren't a workaround for the absence of laziness; they're a natural solution within Clojure's actual constraints and goals - and you're complaining without even understanding those constraints (in either language). Haskell's laziness isn't a superior version of transducers; it's a different bet on different trade-offs. Neither language is trying to be the other.

Stick to Haskell if you must, bring some interesting ideas from it to the table - they'd be appreciated - but please stop spreading confusion and misinformation, thinking that if you talk louder people will prefer Haskell. It's not as if folks en masse are trying to abandon Python, TypeScript, and Java and are torn between choosing Clojure or Haskell.


> so you're saying "had it existed, you'd get the same thing", you're describing a different language.

that's not what I'm saying. I'm saying that Haskell doesn't have it because it's a useless and shallow abstraction to have, one that also hampers the ability to apply advanced optimisation laws further down the compilation pipeline.

I will just repost the part that you conveniently ignored in your reply and pretended that it didn't exist:

Let that sink in: it's not the transducer that's useful, it's the differentiation between pure and side-effectful evaluations that allows your compiler to perform even better optimisations with out-of-order evaluation of pure stuff, as well as eliminating parts of the inner steps within the composed step function, as opposed to focusing just on the reducing step function during composition. It's not a useful abstraction to have if you care about better precision and the advanced optimisations that come from the ability to distinguish pure stuff from non-pure stuff.

My argument holds: you get the same composability with lazy functions for free; you don't need to apply rewrite rules to reach the same level of reusability. Haskell grants you that for free, but for some reason you chime in and claim that's not the case, and the only proof you've provided had to do with missing interfaces that can be solved by a library implementation. There's no restriction in the type system or runtime preventing it. But people don't need it, because it's a useless abstraction that doesn't improve the baseline of what Haskell has to offer, both in terms of composability of your foldings and the further optimisations that take iteration purity into account.

> You've drifted from

I didn't drift from anything. I told you that you ignored a library-based solution in a sneaky attempt to move the goalposts from "you need rewrite rules in many places" to "there's no interface generic enough to accommodate effectful and non-effectful steps together without a library implementation".

> Haskell's laziness isn't a superior version of transducers

It absolutely is a superior solution to the same problem of algorithm optimisation and composability. It's more generic: it applies to anamorphisms and hylomorphisms in the same way it does to foldings, and it doesn't introduce special terminology for a single building block that doesn't exist outside foldings anyway.

> but please stop spreading confusion and misinformation

that's a bold statement coming from someone who claims that call-by-need semantics is a negative aspect of Haskell according to other people (who probably didn't mean it in the first place, but you wouldn't dare to verify).


> It's more generic

Generic over what? Lazy evaluation is a semantic property of expression reduction. Transducers are parameterized over the reducing function. These aren't comparable on a generality axis - they live at different levels of abstraction. The fact that recursion schemes (ana/hylo) exist in Haskell is true and cool but doesn't address the actual transducer claim, which is: one value, applied to fundamentally different consumers (a channel, a fold, a stream, a transient collection) without recompilation or re-specialization. In Haskell, the closest analogs are conduit/pipes/streaming - each a library, each with its own type, each requiring adapters between them.

The concrete example - `(comp (filter odd?) (map inc) (take 5))` applied across source types - is the single most load-bearing thing in this thread, and you never actually answered it. You gestured at OverloadedLists + a hypothetical unified typeclass, then pivoted to "it's useless anyway", which is the tell that you don't understand the topic well enough to even start contemplating a direct answer.

Can we please stop responding to concrete technical points by retreating to broader aesthetic claims - "useless", "shallow", "superior"? This honestly isn't helping anyone. I don't see the point of continuing here, and not because I'm from the "internet crowd who don't know the basics".

You're claiming to know how a (better) language should have been designed - okay, let's talk about possibilities instead of "just use Haskell", which really is childish.


> Generic over what?

Generic over whatever you decide to compose out of smaller parts into a full algorithm that doesn't produce transient buffered results. Transducers are a dead end of abstraction: they aren't applicable anywhere but folding, while a lazy runtime has you covered for free regardless of your choice of the exact source - be it a foldable `Stream f e a`, a generator of values on demand, or even a data constructor.

> doesn't address the actual transducer claim, which is: one value, applied to fundamentally different consumers (a channel, a fold, a stream, a transient collection) without recompilation or re-specialization.

the transducer claim is that there's no way to track effects, period. Hey, you've found a new abstraction that doesn't care about such things - my congratulations, you're now on par with Python's itertools!

> In Haskell, the closest analogs are conduit/pipes/streaming - each a library, each with its own type, each requiring adapters between them.

Do you understand why that's the case? It's because transducers are useless and people actually care about further optimisations and experimentation. To be on par with Clojure, it would be enough to have a single `Stream m e a` that everyone silently buys into. But no one opts for that, because people actually care about local optimisations that go beyond what you think transducers give you. If you don't care about those, pick any sufficiently generic interface and glue it to whatever you want in a single place for the entirety of your ecosystem. Had `streamly` been part of `base`, you'd get exactly the property you claim isn't a thing. Then maybe add `streamly` to your dependency list and start using it pervasively everywhere iteration happens. You'd be on par with Clojure, but without the silly notion of transducers as a thing of their own (which they're not - they're only for foldings that don't care about side effects).


> Transducers aren't applicable anywhere but folding

Wrong, factually wrong! Transducers apply to anything expressible as a step function: reductions, yes, but also channels (core.async), observable streams (manifold), eduction pipelines, into-transformations, transient-collection builds, and stateful transformations like partition-by and dedupe that don't fit a pure fold at all. (dedupe) is a transducer. Try expressing it as a pure lazy-list fusion. You can, but you need explicit state threading, and then you've rebuilt a step function by hand.
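Here's a hedged Python sketch of such a stateful step-function transformer, with the state held in the step function's closure (illustrative code, not Clojure's actual implementation):

```python
from functools import reduce

def dedupe(rf):
    # Stateful transducer-style transform: drop consecutive duplicates.
    # State (the previously seen value) lives in the closure, so each
    # application of dedupe to a reducing function gets fresh state.
    prev = [object()]  # sentinel that equals nothing in the input
    def step(acc, x):
        if x == prev[0]:
            return acc
        prev[0] = x
        return rf(acc, x)
    return step

deduped = reduce(dedupe(lambda a, x: a + [x]), [1, 1, 2, 2, 2, 3, 1], [])
```

Here `deduped` is `[1, 2, 3, 1]` - consecutive runs collapse, but the later `1` survives because it isn't adjacent to the first run, which is exactly the behavior that a stateless pure fold over elements can't express without threading state by hand.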

The definition of "folding" you're using here is so broad it's doing no work. If "folding" means "any left-to-right consumption of values", then yes, transducers are for folding - and so is ~all of streaming, ~all of iteration, ~all of channel consumption. You're using the word to make the scope sound small while the scope is actually most of what programs do with sequences of values.

> the transducer claim is that there's no way to track effects, period.

The transducer claim - the actual one, as stated in Hickey's talk and the docs - is that a reducing function is a fundamental substrate that composes, and you can build transformations over reducing functions that are source and sink-agnostic. Effect tracking is orthogonal.

You keep trying to move the goalposts to "transducers must track effects or they're useless". That's like saying "typeclasses must handle concurrency or they're useless". It's a category demand imported from your preferred language's feature set.

The "on par with Python itertools" jab is wrong. itertools composes over iterables only. Transducers compose over reducing functions. Python itertools does not work against asyncio.Queue or a user-defined reduce.

> To be on par with Clojure it would be enough to have a single `Stream m e a` that everyone would silently buy into.

Okay, let me read that again slowly:

1. The property you're describing (one transformation, many consumers) is real and distinct.

2. Haskell does not currently give it to you (at the language level).

3. To give it to you, Haskell would need a single blessed streaming abstraction in base.

4. Haskell doesn't have one because the community prefers local optimization over a universal substrate.

The rationalization is fine - yes, there's a real trade-off between "single blessed abstraction for everyone" and "multiple specialized libs, each optimized for its niche" - but it is a trade-off. You started with "you get this for free in Haskell" and arrived at "Haskell correctly chose not to give you this, and here's why the thing you want is actually bad..."

Streamly is a great Haskell library, does streaming well, has effect tracking, is performant. And it is absolutely not a drop-in transducer analog - Streamly composes over Streamly streams. If you have a conduit source, a pipes producer, and a streaming Stream, Streamly doesn't make one composed transformation apply to all three. It just adds a fourth ecosystem. So your "had Streamly been in base" hypothetical is exactly the Clojure move - pick one substrate, bless it, get uniformity - and now you're simultaneously using Streamly to argue that Haskell doesn't need transducers while pointing at a hypothetical world where Haskell would have done what Clojure actually did. "pick any generic enough interface and glue it with whatever you want in a single place for the entirety of your ecosystem" - this is basically what Clojure did.

Can we just find a middle ground in this debate, something that might actually work, like:

"Sure, Clojure blessed a universal reducing substrate at the language level. Haskell didn't, and instead has multiple streaming libraries, each with stronger local guarantees about effects, memory, and back-pressure. Clojure trades uniformity across effect context - a transducer, works everywhere, at the cost of the compiler not telling you whether a given pipeline touches the world; Haskell chose effect-visibility in types"

Neither side is free. Clojure pays in runtime-only knowledge of effects. Haskell pays in fragmentation of streaming abstractions and the attendant ceremony of moving between them. That's the trade. It's not flaws being papered over; it's the shape of the bet. You can argue the bet is wrong, but you can't argue it wasn't made on purpose.


As everybody knows, key strokes and mouse movements are the things that solve problems, definitely the data worth capturing for AI training.

Maybe they're building a simulation of the rich lives and behaviors of white-collar office people in the early 21st century, with breathtaking detail?

I couldn't imagine life without my unique keystrokes and mouse movements.


Like a museum exhibit?

But you can put on 3D goggles, maybe there's even TTS narration.

Some call it museumverse.


> As everybody knows, key strokes and mouse movements are the things that solve problems, definitely the data worth capturing for AI training.

See: https://si.inc/posts/fdm1/

If they captured display output as well, it could be a very useful dataset for generalized computer use.


They used to say the same thing about text, and it turned out that after all that training the best thing they could achieve was the `ccc` compiler.

> Token density

If you really cared about that, you wouldn't have picked Rust - Nim and Haskell are terser languages.

> Every language we’ve built defaults to sequential execution with parallelism bolted on.

That's a false and misleading statement. Array languages have been around for a long time; you just didn't care to learn them.
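The array-language point is that operations are defined over whole arrays, so data parallelism is the default rather than bolted on. A toy Python sketch of that style, using a hypothetical `Vec` wrapper (real array languages and libraries like APL, J, or NumPy compile or dispatch such expressions to vectorized, potentially parallel kernels):

```python
class Vec:
    """Toy whole-array value, in the spirit of APL/J/NumPy:
    arithmetic applies elementwise, so user code contains no loops
    and an implementation is free to run them in parallel."""
    def __init__(self, xs):
        self.xs = list(xs)

    def __add__(self, other):
        return Vec(a + b for a, b in zip(self.xs, other.xs))

    def __mul__(self, other):
        if isinstance(other, Vec):
            return Vec(a * b for a, b in zip(self.xs, other.xs))
        return Vec(a * other for a in self.xs)  # scalar broadcast

# Whole-array expressions: no explicit iteration at the call site.
prices = Vec([10, 20, 30])
doubled = prices * 2
totals = Vec([1, 2]) + Vec([3, 4])
```

The sequential loop lives inside the implementation, not in user code - which is exactly where an array-language runtime gets the freedom to parallelize it.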

> Formal verifiability. Move beyond type checking to compile-time proofs.

Good luck with your token budget for producing proofs. LLMs won't solve that for you. If you believe that proofs are as simple as matching API calling interfaces, you're wrong.

> Declared effects. Every function explicitly states what it reads, writes, and depends on, machine-enforced.

Good functions are pure functions that have no effects. Good design minimizes the number of effects needed and maximizes the footprint of pure functions mapped over inputs and outputs. If you insist that every function needs an explicit effect annotation, you don't know the topic and haven't worked with effect systems much.
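The "minimize effects, maximize pure functions" design can be sketched as a functional-core/imperative-shell split. A minimal Python illustration (function names are hypothetical):

```python
# Pure core: no effects, trivially testable - needs no annotation.
def summarize(lines):
    words = [w for line in lines for w in line.split()]
    return {"lines": len(lines), "words": len(words)}

# Thin effectful shell: the only code that touches the world.
def report(path):
    with open(path) as f:          # effect: file read
        summary = summarize(f.read().splitlines())
    print(summary)                 # effect: console write
```

Only the shell would carry effect annotations in an effect-typed language; annotating `summarize` would add ceremony without information, which is the point being made above.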


Just juggling balls in the air gets boring very quickly, and adding more balls doesn't change it much. Learning statics and flows from contact juggling but performing them with standard juggling balls is so much more fun. And then you discover statics with hoops: https://www.youtube.com/watch?v=PF6UuPsw2i4


Re. "CAD/BIM", technically speaking CAD doesn't imply BIM, and the industry's promotion of BIM is akin to AI promotion among software engineering teams - the benefits aren't clear upon detailed review of the advertised capabilities. The CAD part, on the other hand, is generally recognized as the essential tooling for the profession and I'm surprised to hear that it just is a "wonderful aspiration".


"The profession" actually is a wide variety of trades, not just architects and contractors. Electricians, plumbers etc. where CAD is not yet widely spread. Which hopefully will change in the near future, with open source BIM tool chains, boosted by generative/agentic AI.. Finally, a huge source of confusion and execution hiccups will be overcome.


Until then, PDF rules!


Generalizing with "everything", "all", and other exclusive markers is exactly the kind of black/white divide you're arguing against. What happened to your nuanced reality within a single sentence? Not everything is black and white, but some situations are.


The person he's replying to argued against putting things on a spectrum. Does that not imply painting everything in black and white? His response seems perfectly sensible to me.


He argued against putting things on a spectrum in the many instances where that would be wrong, including the case in question. What's your argument against that idea? LLM'ed too much lately?


He argued against it, and the response presented a counterargument. Both were based around social costs and used the same wording (i.e. "everything").

You made a specious dismissal. Now you're making personal attacks. Perhaps it's actually you who is having difficulty reasoning properly here?


I find the other article the author refers to in his text to be more thorough and revealing: https://www.wheresyoured.at/the-ai-industry-is-lying-to-you/


Speaking as the author of the original article, I wholeheartedly agree! :D


The industry will adapt quickly, especially the part that's using multiplatform mainstream engines like UE/Unity.

Lots of new/recent native macOS releases nowadays: https://store.steampowered.com/macos


The same ones support Linux, and yet Valve had to come up with Proton.


Developers chase the user base. If and when the users choose Linux developers will target Linux.

Proton as a project lets Valve hedge on the heir-apparent OS without upfront developer cost. If the Linux player base grows, developers will follow, and Valve is poised to remain dominant.

