
> Contrast this to the arguably wasteful CI/CD setups we have now.

To clarify, are you arguing that modern build and release processes should patch binaries in-place, or something else?



As a kind of middle-ground: CI/CD burns our abundant cycles out of band. That’s a good place to spend extravagantly.

I’m all for burning more cycles on tests and static analysis and formal verification etc. before the software goes on the user’s machine.

But we all live with “good enough” on our machines every day. I think there’s a general consensus that too much software spends our CPU cycles like a drunken sailor.


There's also the question of what the burned CPU cycles actually buy. Cycles burned on testing buy us less buggy programs. Cycles burned on your CPU doing things like out-of-bounds checks or garbage collection buy that too.
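To make that concrete, here's a minimal JavaScript sketch of the kind of per-access check a safe runtime pays for; `checkedGet` is a hypothetical helper for illustration, not a standard API:

    function checkedGet(arr, i) {
      // The "burned" cycles: a branch and a few comparisons on every access...
      if (!Number.isInteger(i) || i < 0 || i >= arr.length) {
        // ...which buy a loud failure instead of a silent out-of-bounds read.
        throw new RangeError(`index ${i} out of bounds for length ${arr.length}`);
      }
      return arr[i];
    }

That per-access tax is the kind of burned cycle that's worth paying.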

But most of them are burned on layers upon layers of abstraction that don't actually do anything useful for safety or correctness; they're there solely because we've turned code reuse into a religion. That wouldn't be bad by itself, if the religion had a firm commandment not to reuse bad solutions - yet that is exactly what we keep doing, again and again, patching it all over with the software engineering equivalents of duct tape and glue to keep it from falling apart. Why is C still at the bottom of virtually every software stack we run? Why do we keep writing desktop apps in HTML/JS? Does a simple app really need 20 dependencies of uncertain quality?

JavaScript is a good example. It's not a bad language because it's high-level - to the contrary, that's the best part of it! It's a bad language because, despite being high-level, it still gives any user ample opportunities to mess things up by accident. We need something just as (if not more) high level but better for general-purpose software development.
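To pick a few well-known examples of what "mess things up by accident" looks like - all of this is legal, exception-free ECMAScript:

    [10, 9, 1].sort();  // [1, 10, 9] - the default comparator sorts numbers as strings
    "2" + 1;            // "21" - + concatenates when either operand is a string
    "2" - 1;            // 1 - but - coerces both operands to numbers
    [1, 2, 3][5];       // undefined - out-of-bounds reads fail silently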


I like bitching about JS as much as the next guy, and as someone who has implemented ECMA-262 I guess I’m more entitled than most.

But let’s not get carried away with it. Eich had 10 days from unreasonable manager to shipping physical media. To do kind of a cool Scheme/Self thing from scratch in 10 days? I’ve been on some high-G burns, but that’s fucking nuts.

But since there’s no scientific way to quantify any of this, I’ll throw my anecdote on the scale in favor of my opinion and note that Brendan Eich was a hard-ass IRIX hacker at SGI before he followed Jim Clark to Netscape.


I'm not blaming Eich. And I very much doubt that anyone originally involved with that project thought that their tech would be the foundation for large server-side and desktop apps.

But, regardless of how we got here and whose fault it is, we're here - and it's not a good place.


I’m not arguing. Just observing the difference. Different times, different needs and practices.

For example, back then it was common to understand a binary’s entire machine code. We’re talking no abstractions, no runtimes. Portability and virtual memory were luxuries.

I definitely think CI/CD could be less wasteful, but I don’t necessarily think we should manually patch binaries in place.
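(For anyone who hasn’t watched it done: “patching a binary in place” literally means editing bytes of the shipped executable. A minimal Node.js sketch, with the filename, offset, and byte value made up purely for illustration:

    const fs = require("fs");
    const buf = fs.readFileSync("app.bin"); // hypothetical shipped binary
    buf[0x1a4] = 0x90;                      // overwrite one instruction byte (made-up offset/value)
    fs.writeFileSync("app.bin", buf);

No build and no pipeline - but also no tests, reproducibility, or audit trail, which is most of what CI/CD buys us.)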



