Praised by whom? The btwin folder is a low-rent, low-quality version of a Dahon-style Chinese fold-in-half folder. That folding design accounts for about 95% of the folding market and has no clever design features whatsoever. It is simple to manufacture, has no patents, and is pretty bad in general riding use. It is neither fast to fold nor compact, and it offers very poor customization, particularly with regard to the rider's reach. And it is really, really boring. Dahon and Tern have some okay bikes of this design, but the rest of the category is dominated by bikes of quite poor quality, including the btwin.
The Decathlon folder folds faster than a Brompton. A friend of mine has both and he's trying to convince me to buy the cheap Decathlon bike over a Brompton. The demonstration was convincing.
I'm a big fan of the fact that you can push a Brompton while folded, a feature that most other folders lack.
You can push folders that lack the Brompton's rack-rollers design with a little practice. If you watch the Bike Friday video you'll see the guy do it, though it looks like that bike's folding seat post makes it a bit more difficult than on a cheaper folder. I push my folded Zizzo around no problemo.
Praised by most customers, probably. As an engineer I appreciate Bike Friday's attention to detail and I own a good few "artisan" devices myself, but the reality is that most people want a mass-produced bike that is "good enough" within their budget.
There's no doubt that your bike is higher quality than the Decathlon one, but the average customer doesn't appreciate how well engineered it is or how many patents (??) are involved.
Having lived in Italy and used the btwin folder quite a lot, I can assure you there are lots of basic folders in its category and price range which are much better. I'd look into Dahon and Tern for a basic folding bike.
Folding bikes are complex and hard to make safely, and the folding mechanism is costly to engineer right. This means that the manufacturer of a cheap bike is either providing you with a dangerous folding mechanism, or is putting a lot of the cost of the bike into the folding mechanism, so there's not much money left for the rest of the parts. Either way, it means that cheap folding bikes are a bad choice, and the btwin folder is a good example of that.
Some data points: I bought a Decathlon folding bike (Fold 500; new, ~450€) a few months ago. I've been using it many times a week (probably more than it's intended for).
N=2, but my Decathlon bikes have well over 50,000 km between them with no issues beyond the usual wear and tear. Value-wise, they are fantastic. They are road bikes, however, not the folding model specifically.
This is what a lot of people want to be true, but in fact it is not. I believe it can mostly be explained by the "IKEA effect": any given model is bought by so many people that the inevitable design defects are quickly found and remedied.
> Side projects are typically time constrained - if AI saves you time, why wouldn't you use it?
There could be many reasons not to use AI in a case like this, e.g.: retaining more control, breaking some new ground, because it's fun, because it's personal, etc.
It's a blessing and a curse that zero innovation has occurred in the Clojure space since 2016. Pretty sure the only big things have been clojure.spec becoming more mainstream and the introduction of deps.edn to supplant lein, although I am still partial to lein.
Yes, although anyone who cares about Jank could also use a traditional Common Lisp or Scheme compiler, provided compatibility with existing Clojure code isn't a requirement.
> would you consider any of this to be innovative?
JS got optional chaining, nullish coalescing, async/await, decorators, pattern matching proposals - all borrowed from other languages. Python got type hints (borrowed), structural pattern matching (borrowed from ML/Haskell), walrus operator. Rust got async/await (borrowed). Go got generics (very late, borrowed from everywhere).
Almost every "feature addition" in any mainstream language since roughly 2010 is a synthesis or import from prior art - usually from ML, Haskell, Lisp, or Smalltalk lineages. Comparatively, there's been a good amount of innovation in the Clojure sphere. Anyone who has ever tried Hyperfiddle/electric, generated tests from Specs or Malli, or even used nbb for scripting knows that.
So let's either apply pressure everywhere equally, or nowhere. What's the point of singling out Clojure? Are you asking for a higher standard because of Clojure's stated philosophy (simplicity and careful design, etc.), or is this a proxy complaint about something else?
Hmm. I'm not sure what you are looking for — myself, I write software that supports my living, and I'm not looking for thrills. What I get with Clojure is new concepts every couple of years or so, thought through and carefully implemented by people much smarter than me, in a way that doesn't break anything. This lets me concentrate on my work and deliver said software that supports my living. And pay the bills.
Agreed, that is huge for the ecosystem. I actually have a side project with a unified codebase: the central library and API server are in clj, and the CLI client is babashka.
I know others already pointed out a ton of things, but having worked with Clojure in 2016 and doing active Clojure development for my startup now I feel like I have to chime in too.
In 2016, Clojure was not great for serious data science. That has changed substantially and not just via Java Interop.
- It now has cross-ecosystem GPU support via blueberry libraries like Neanderthal, which in benchmarks outperform some serious Java libraries in this space.
- It has columnar, indexed, JIT-optimized data science libraries via cnuernber and techascent, part of the Clojure ecosystem. In benchmarks they've outperformed libraries like NumPy.
- The ecosystem around data science is also better. The projects aren't siloed like they used to be. The ecosystem is making things interoperate.
- You can now use Python from Clojure via the libpython-clj bindings. In general, CFFI is a lot better, not just for Python.
- The linters are way better than they used to be. The REPL support too.
Clojure already had one of the best efficiency scores in terms of code written relative to what is accomplished, but now you also get REPL integration, and LLMs have been increasingly capable of leveraging that. There are things like yogthos's mycelium experiments to take advantage of that with RLLM calls. So it's innovating in interesting new ways too, like cutting bugs in LLM-generated code.
It just doesn't feel true to me that innovation isn't occurring. Clojure really has this "import antigravity" feel to it; things other languages would have to cut a new release for are just libraries that you can grab and try out (or maybe that's the Python).
> Can you talk more about why you chose CLJ for datascience / ML.
I use Python for a lot of machine learning. My vision transformers, for example, are in Python. There is a lot to like about the Python ecosystem. Throwing away libraries like albumentations and PyTorch because you move to a different ecosystem is a real loss. You probably ought to be using Python if you're doing machine learning of the sort that one immediately thinks of on seeing "ML".
That said, data science and machine learning are words that cover a lot of ground.
Python often works because it serves as glue code to more optimized libraries. Sometimes, it is annoying to use it as glue code. For example, when you're working on computational game theory problems, the underlying data model tends to be a tree structure and the exploration algorithm explores that tree structure. There is a lot of branching. Vanilla Python in such a case is horrifically slow.
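A toy sketch of why (mine, not the commenter's actual code): a game-tree walk touches an exponential number of nodes, and pure Python pays full interpreter overhead at every one of them, with no flat array to hand off to an optimized library:

```python
def count_leaves(depth, branching):
    # naive recursive walk of a uniform game tree; visits
    # branching**depth leaves, each one a Python-level call
    if depth == 0:
        return 1
    return sum(count_leaves(depth - 1, branching) for _ in range(branching))

print(count_leaves(10, 3))  # 3**10 = 59049 leaves for even this tiny tree
```

At realistic depths and branching factors, the per-node interpreter cost dominates everything.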
I was looking at progress bars in tqdm reporting 10,000 years until the computation was done. I had already reached for numba and done some optimizations. Computational game theory is quite brutal: you're very often reminded that there are fewer atoms in the universe than objects of interest for correctly calculating what you want to calculate.
Most people use C, C++, and CUDA kernels for the sort of program I was writing. Some people have tried to do things in Python.
> Are there any benefits of using it over Python?
There is an open source implementation of a thing I built. It solves the same problem I solved, but in Python, worse, and with a lot of missing features. It has a comment in it discussing how the universe will end before the code would finish, were it to be used at the non-trivial size. The code I wrote worked at the non-trivial size. Clojure, for me, finished. The universe hasn't ended yet, so I can't yet tell you how much faster my code was than the Python code I'm talking about.
> And how is the interop with Python libs?
It worked for me without issue, but I eventually got annoyed that I had to wait for two rounds of dependency resolution in some builds. Conda builds can sometimes have dependency resolution take an unreasonable amount of time, and I was hitting that despite using very few libraries.
Note that enough people have tried to do things in Python that writing CUDA kernels in Python is now also a supported path - still WIP, but NVIDIA is quite serious about it.
Basically their GPU JIT builds on top of MLIR, so in the end it's no different from anything else built on top of LLVM.
I like Clojure and want to get more into it, but I've wondered what folks are doing when it comes to building AI-powered apps. So thanks for sharing your experience.
Joke's on you. You seem to be so invested in moving in a single direction that you developed "an expert blind spot". Have you ever thought that it's possible that the knowledge you've so far "accumulated" has become an obstacle to seeing simpler or orthogonal ideas clearly?
Every type system, schema library, and validation tool in every language is in some sense "patching" the lack of built-in guarantees. Haskell's typeclasses patch the lack of ad-hoc polymorphism. Rust's borrow checker patches the lack of memory safety. Python's type hints patch the lack of static types. You can retroactively frame any additive language feature as patching a prior omission - it's not an argument, it's a framing choice.
Spec isn't even so much about patching - it's about runtime generative testing, instrumentation, and data specification in a dynamic context where static types would be the wrong tool anyway. That's a genuine design space with genuine ideas in it, regardless of whether you like dynamic typing. You just can't see it, because you've already decided "isn't innovation, lol", etc.
One more reason to love the language is its community. I appreciate that Clojurians engage with diverse ideas from different tools and languages, freely borrowing the best ones without prejudice, owing to their deep and widespread understanding of language design. And they do it with a focus on pragmatism. That's something we could maybe learn from them, even if we don't like the language and tools they make.
> Every type system, schema library, and validation tool in every language is in some sense "patching" the lack of built-in guarantees.
> Spec isn't even so much about patching - it's about runtime generative testing, instrumentation, and data specification in a dynamic context where static types would be the wrong tool anyway.
It's amazing what people can claim when they don't have to prove it. But I wonder, how exactly are your runtime generative tests different from the statically derived strategies I get via QuickCheck or Validity?
> And they do it with the focus on pragmatism
"pragmatism" is defined in terms of values that one desires to practice. I am in no position to argue that your and their desires don't exist, but please don't claim that their preferences of transducers and schemas are somehow more pragmatic just because they ignored types and effectful/pure evaluation distinction in their language philosophy.
I never claimed that Clojure (or transducers, etc) is "more pragmatic than Haskell", I said "Clojurians engage with diverse ideas pragmatically".
> how exactly are your runtime generative tests different from statically derived strategies
Spec generators are derived from predicates, not types - which inverts the usual QuickCheck problem where Int generates any Int and you have to write newtypes or custom Gen instances to narrow to "ages 1-120." Spec also has :fn specs that assert relationships between args and return values, which base QuickCheck/Validity don't give you natively (you'd reach for Liquid Haskell). And `instrument` validates real calls in dev, not just sampled properties.
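A rough Python sketch of that predicate-first direction (names like `valid_age` and `gen_from_pred` are made up for illustration; this is not spec's, QuickCheck's, or any real library's API):

```python
import random

def valid_age(x):
    # the "spec": an ordinary predicate, not a type
    return isinstance(x, int) and 1 <= x <= 120

def gen_from_pred(pred, n=5, seed=0):
    # crude predicate-driven generation: sample candidates and keep only
    # the values the predicate accepts, rather than deriving a generator
    # from a static type's full range
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.randint(-1000, 1000)
        if pred(x):
            out.append(x)
    return out

ages = gen_from_pred(valid_age)  # every sample satisfies the predicate
```

The generation starts from the predicate that defines validity, so the "ages 1-120" narrowing is built in rather than bolted on via a newtype.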
You seem to be operating on a single axiom: types + purity + laziness are the correct solution to the problems worth solving. Given that axiom, every Clojure design choice is, in your eyes, either (a) a patch for not having them, or (b) an unnecessary abstraction that falls out of having them. There is no version of reality in which Clojure can be credited with solving something for you, because the axiom forecloses it.
This is an unfalsifiable position, any additional technical arguments would be wasted. You don't even try to evaluate my counterexamples, because the axiom tells you the counterexamples must be wrong in some way you haven't yet articulated to yourself.
Okay, please, give me Haskell code that takes one composed transformation and applies it, unchanged, to a vector, a lazy seq, a channel, and a pure fold. Not 'here is pipes, here is conduit, here is streaming, here is foldl library' - one piece of code, four consumers. That's the thing you have dodged four times in the other thread.
Clojure didn't ignore static types or the pure/effectful distinction - it made a deliberate decision to optimize for different values. Framing deliberate trade-offs as ignorance is often itself a screaming display of ignorance.
> I said "Clojurians engage with diverse ideas pragmatically".
then you said nothing and contributed nothing to your points, as everybody else also "engages with diverse ideas pragmatically". It also so happens that engaging allows for the rejection of inferior ideas, which is what transducers are. I can compose around any Python iterable in the same way you claim is important for transducers, but do you know what I lose if I engage with pragmatic Python and Clojure? I lose precision and further optimizations.
> You seem to be operating on a single axiom
How about you abstain from drawing wrong conclusions and actually focus on being precise?
> Spec generators are derived from predicates, not types
What do predicates operate on? QuickCheck builds bounded Ints within the `minBound` and `maxBound` of the type as the basis for deriving an Int spec. There's no difference and no inversion of intent if the strategy for your newtype actually produces a spec derived from the 1-120 range. If you say there's a thing called Age, and it's a subset of the Int or Nat ranges, you define Age and its bounds as part of your spec, and there's zero inversion relative to what clojure.spec does. I'm beginning to suspect that I'm conversing with a prompt output.
> Okay, please, give me Haskell code that takes one composed transformation and applies it, unchanged, to a vector, a lazy seq, a channel, and a pure fold. Not 'here is pipes, here is conduit, here is streaming, here is foldl library' - one piece of code, four consumers. That's the thing you have dodged four times in the other thread.
Certainly, I'll do that as soon as you provide me with the example of a transducer tracking effects separately from pure evaluations. We want to be on the same page, don't we? I want to compose my effects without ambiguity, so hurry up.
> Clojure didn't ignore static types or the pure/effectful distinction - it made a deliberate decision to optimize for different values.
lol, it actually ignored it, but you're too perky to simply admit that as if your future depends on it.
> QuickCheck builds bounded ints within their `minBound` and `maxBound`
Yeah, your narrow technical note isn't wrong here (I should've used a less trivial example), but the broader differences still hold - spec operating on map shapes without lifting data into types, arg/return relationships without reaching for Liquid Haskell, etc. This is a much longer discussion that requires its own thread, unrelated to transducers.
> I can compose around any Python iterable the same way
No, you can't. Python iterables are not uniform across strict collections, lazy sequences, async channels, and arbitrary reducing step functions. itertools composes over iterables; it does not compose over an asyncio.Queue, a trio memory channel, or a user-supplied step function. Transducers are specifically about the reducing function as the point of composition, which decouples the transformation from whatever produces or consumes values. Python's iterator protocol is a narrower abstraction. Show me some Python code that applies one composed transformation, unchanged, to an iterable, an asyncio.Queue, and a user-defined reduce function. You can't, because the protocol doesn't support it. Congratulations, now you're complaining about a third language without properly understanding the topic at hand.
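To make "the reducing function as the point of composition" concrete, here's a toy transducer sketch in Python (my own illustration; the names `tmap`, `tfilter`, and `transduce` are invented, not from any real library - one can of course implement a transducer protocol in any language, the claim above is that Python's built-in iterator protocol isn't that protocol):

```python
from functools import reduce

def tmap(f):
    # transducer: wraps a reducing step so every input is mapped through f
    def xform(step):
        return lambda acc, x: step(acc, f(x))
    return xform

def tfilter(pred):
    # transducer: wraps a reducing step so inputs failing pred are skipped
    def xform(step):
        return lambda acc, x: step(acc, x) if pred(x) else acc
    return xform

def compose(*xforms):
    # compose transducers so the leftmost runs outermost (Clojure comp order)
    def composed(step):
        for xf in reversed(xforms):
            step = xf(step)
        return step
    return composed

def transduce(xform, step, init, coll):
    return reduce(xform(step), coll, init)

# one composed transformation...
xf = compose(tfilter(lambda x: x % 2 == 0), tmap(lambda x: x * 10))

# ...applied unchanged to two different reducing steps
total = transduce(xf, lambda acc, x: acc + x, 0, range(10))       # -> 200
as_list = transduce(xf, lambda acc, x: acc + [x], [], range(10))  # -> [0, 20, 40, 60, 80]
```

The transformation `xf` never mentions lists, ranges, or sums; only the final step function decides what "consuming a value" means, which is the decoupling being described.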
> provide me with the example of a transducer tracking effects separately from pure evaluations
Transducers aren't an effect-tracking system because Clojure doesn't track effects in types. Asking for a Clojure abstraction that tracks effects is like asking for a Haskell library that works without the type system. Clojure handles effects through different mechanisms, and transducers are orthogonal to effect tracking by design. Effect separation is not a goal of transducers, just as uniform multi-consumer application is not a goal of lazy evaluation.
> lol, it actually ignored it, but you're too perky
This is factually wrong. Hickey's talks (Effective Programs, Maybe Not, Clojure core.typed, the explicit refusal to adopt static types) are public, specific, and reasoned. You can absolutely disagree with the reasoning. Calling deliberate, argued design decisions "ignored" is either ignorance or dishonesty.
You're defending a hierarchy: typed+pure+lazy on top, everything else is a degraded attempt at the top. Within that hierarchy, Clojure can't contribute anything original, by definition - anything Clojure does either (a) duplicates what types+purity already give you, or (b) is a workaround for not having them.
I'm defending something subtler and harder to argue: that different design axes exist, that Clojure's choices are coherent given its axes, and that "this language's solution to X is a workaround for not having Y" is a framing choice, not a technical claim. I am also rhetorically disadvantaged, because "X is just a workaround" is a punchy dismissal and "X is a coherent choice within a different design space" is a paragraph.
I'm not perky about any of it; this isn't a Haskell vs. Clojure debate for specific use cases - you're arguing just for the sake of it. You're not learning, not probing ideas, not stress-testing your own position, nor are you giving me an opportunity for any of that on my side. Language-tribal arguments on HN are a genre, and you're writing in that genre. I hope you had fun, and please don't you dare call me "a prompt output" - I spent time and energy arguing in vain, about nothing; at least have the human decency to acknowledge that.
A programming language is not just syntax, keywords and standard libraries, but also processes, best practices and design principles. The latter group, I guess, is more difficult to learn and harder to forget.
I respectfully but completely disagree. Not only will you just as easily lose the processes, best practices and design principles, but they also change over time (what was best practice when I got my first gig in 1997 is not a best practice today - even things from just 4-5 years ago, not to go all the way back to the 90s). All of that is super easy to both forget and lose unless you live it daily.
https://m.youtube.com/watch?v=XX2VSaXmAoo