zephen's comments | Hacker News

I like the idea of being able to merge a PR that is a partial solution, while keeping the issue open to reflect that it is only partially done. It kinda makes sense to do this in a single action.

Also:

> If [a person is not suitable to make the decision of whether the PR should be approved] then the person should remove themselves from the list of reviewers.

This doesn't reflect what sometimes happens in real life. Someone could have enough specialized knowledge to veto a PR without having enough broader knowledge to approve it. That person should definitely be left on the reviewer list, with the ability to veto, the obligation to state whether they are vetoing, and no ability to grant final approval.

It is necessary for this specialist to be able to record "I have finished examining this PR, and nothing within my expertise would cause me to veto it" before the PR advances.

Unfortunately, in a binary system, that often means they have to say "I approve" even though that doesn't truly capture the intent. Then you wind up with hacky workarounds, like requiring a minimum number of approvals.


> I've never seen a legitimate business not give refunds for technical errors of their own fault.

Granted, it was very much weasel words.

Nonetheless, I read it as they were issuing a refund ("Let me look up your account information to help process your refund request."), but couldn't offer compensation for pain, suffering, loss of use, tracking down the bug, etc.

I could be wrong, of course, precisely because it was (probably AI-generated) weasel words.


> averaging their announced results.

Obligatory XKCD: https://xkcd.com/937/


> In any well run organization you have multiple layers of controls.

Everything depends on size.

A business with 8 employees might need 3 of them to be (literal) keyholders, and might be situated such that any of the keyholders has it in their power to destroy the business.

This is not ideal, obviously, but it is how the world has worked for a very long time, and it is difficult to understand how to make it better in some cases. Modern technology, such as cameras, might help, or might simply help to allocate blame after destruction has occurred.

In any case, this is the background of how people are used to working. We all deal with people who can absolutely destroy us, starting with the cop on the corner.

And we have mechanisms, both before-the-fact, like social coercion, and after-the-fact, like the legal system, to help ensure that this usually works.

LLMs exist in a world where most people are used to extending trust, but it isn't possible for LLMs to conform to the historical expectations that underpin that trust.


I thought it was a cross between a camera and a bomber.


This stupidity might go a long way towards explaining the relentless push towards apps.

Agreed.

And his point about randomly moving buttons to see if people like it better?

No fucking thanks. The last thing I need is an app made of quicksand.


God damn that drives me up a wall! Mozilla is a terrible offender in this regard, but there are myriad others too!

The user interface is your contract with your users: don't break muscle memory! I would ditch FF-derivatives, but I'm held hostage by them because the good privacy browsers are based on FF.

Stop following fads! Be like craigslist: never change, or if you do then think long and hard about not moving things around! Also if you're a web/mobile developer, learn desktopisms! Things don't need to be spaced out like everything is a touch interface. Be dense like IRC and Briar, don't be sparse like default Discord or SimpleX! Also treat your interfaces like a language for interaction, or a sandbox with tools; don't make interfaces that only corral and guide idiots, because a non-idiot may want to use it someday.

I really wish Stallman could be technology czar, with the power to [massively] tax noncompliance to his computing philosophy.


> it’s quite impressive how much faster PUC Lua is than QuickJS and Python

Python's execution time is mostly spent looking up stuff. I don't think lua is quite as dynamic.


Lua is way more dynamic

To illustrate this, here's the contorted Lua code from https://news.ycombinator.com/item?id=11327201

    -- Reading t[i] for a missing key fires __index (pcall), which
    -- protected-calls t; __call then stores t[i] = 42, routed through
    -- __newindex (rawset), and pcall's `true` becomes the read's result.
    local t = setmetatable({}, {
      __index = pcall, __newindex = rawset,
      __call = function(t, i) t[i] = 42 end,
    })
    for i=1,100 do assert(t[i] == true and rawget(t, i) == 42) end
Arguably this exercises only the slow paths of the VM.

A more nuanced take is that Lua has many happy fast paths, whereas Python has some unfortunate semantic baggage that complicates those. Another key issue is the over-reliance on C modules with bindings that expose way too many internals.


> A more nuanced take is that Lua has many happy fast paths, whereas Python has some unfortunate semantic baggage that complicates those.

This is a good way to describe it. Most of the semantic baggage doesn't make some speed improvements, up to and including JITing, impossible, but it certainly complicates them.

And of course, any semantic baggage will be useful to someone.

https://xkcd.com/1172/


I suppose it depends on where you are looking for dynamicity. In some ways, lua is much more laissez-faire, of course.

But in Python, everything is an object, which is why, as I said, it spends much of its time looking things up. And things like bindings for closures are late, so that's more lookups as well.
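A small sketch of that late binding: a closure looks its free variables up when it is called, not when it is defined.

```python
# Late binding in Python closures: each lambda closes over the variable
# itself, not its value at definition time, so the lookup happens per call.
funcs = [lambda: i for i in range(3)]
results = [f() for f in funcs]
assert results == [2, 2, 2]   # every closure sees the final value of i
```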

In lua, many things aren't objects, and, for example, you can add two numbers without looking anything up. The flip side, of course, is that an unboxed addition could conceivably overflow an integer, which can't happen in Python, since its integers are arbitrary-precision.

The Python interpreter has some fast paths for specific object types, but it is really limited in the optimizations it can do, because there simply aren't any unboxed types.
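As a quick sketch of the "everything is an object" point: even integers carry full object machinery, and operators dispatch through method lookups on the type.

```python
# Even small integers are heap objects in CPython: they have a type,
# attributes, and "+" resolves to a method on that type.
x = 1
assert isinstance(x, object)
assert (5).__add__(7) == 12        # the "+" operator is an attribute lookup away
assert type(2 ** 100) is int       # arbitrary precision: no unboxed overflow
```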


Nope, Python is not fully object. Not even Ruby is fully object: try `if.class`, for example. Self, Smalltalk, Lisp, and Io are fully object in that sense. But none, as far as I know, can handle something like `(.class`.

Aren't you mixing up syntax and the concepts it expresses? Why would `(.class` have to be a thing? Is space-dot-class a thing? I don't think this makes sense, and it doesn't tell us anything about a language "being fully object". Such syntax is merely for producing an AST, and that alone doesn't mean "object" or "not object". It could just as well be all kinds of different things, or functions, or stack pushes and pops or something.

I think the idea is that Smalltalk replaced conditional syntax with methods on booleans. You could call `ifTrue:` on a boolean, passing it a code block; a true boolean would execute the block, and a false boolean would not. (There was also an `ifFalse:` method.)

This feels more like a party trick than anything. But it does represent a deep commitment to founding the whole language on object orientation, even when it seems silly to folks like me.
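For anyone unfamiliar with the trick, it can be mimicked in Python (class and method names hypothetical): each boolean class decides for itself whether to run the block, so no `if` statement is needed at the call site.

```python
# A toy mimic of Smalltalk's ifTrue:/ifFalse: messages. Conditionals
# become method dispatch: STrue runs the block, SFalse ignores it.
class STrue:
    def if_true(self, block):
        return block()
    def if_false(self, block):
        return None

class SFalse:
    def if_true(self, block):
        return None
    def if_false(self, block):
        return block()

assert STrue().if_true(lambda: "ran") == "ran"
assert SFalse().if_true(lambda: "ran") is None
```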


Linguistically, it meant your control structures looked the same as native language control structures, so there was never any visual dividing line between your code and the system.

It also made it really easy to ingest code, and to do metaprogramming.


>Why would (.class have to be a thing?

It doesn't have to, in the absolute. It's just that if some pitch claims a programming language is completely object-oriented, it's fun to check to what point it actually is.

There are many valid reasons why one would not do that, of course. But if the marketing implicitly leads one to expect it, it seems fair to debunk the myth that it's actually a fully object language.

>Is space dot class a thing?

Could be, though generally spaces are not considered terms – but the Whitespace language shows that's just a question of what is conventionally retained.

So, supposing that ` .class` and `.class` express the same value, the most obvious convention that comes to my mind would be to apply it to the implicit, narrower "context object" of the current lexical scope.

Raku evaluates both `.WHAT` and `(.WHAT)` as `(Any)`, to give a concrete example of a related choice of convention.

>Such syntax is merely for producing an AST and that alone doesn't mean "object" or "not object".

Precisely: if the language does not provide complete reflection facilities for every meaningful term, including syncategorematic ones, then it's not fully object. Once again, being almost fully object is fine, but it's not being fully object.

https://en.wikipedia.org/wiki/Syncategorematic_term


You obviously realize that different languages have different syntactic requirements, yet you are willing to cut one language a break when its minimal syntactical elements aren't objects, and refuse to cut other languages a break because they have a few more syntactical elements?

I think you’re describing deficiencies in the Python impl not anything about the language

> I think you’re describing deficiencies in the Python impl not anything about the language

To some extent, sure. And, looking at your implementation of your language, something like the optimizations on passing small numbers of parameters could probably help Python out. It spends an inordinate amount of time packing and unpacking parameter tuples.

But, for example, you can easily create a subclass of an integer and alter a small portion of its behavior, without having to code every single operation, which I don't think you can do in lua.

So, the dynamicity I'm describing is what the language has to do (more work at runtime) to support its own semantics.
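The integer-subclass point above, as a sketch (class name and clamping behavior are made up for illustration): override one operation, inherit the rest.

```python
# Subclass int and change only addition; every other operation is inherited.
class SaturatingInt(int):
    """An int whose addition clamps at 100 (hypothetical behavior)."""
    def __add__(self, other):
        return SaturatingInt(min(int(self) + int(other), 100))

x = SaturatingInt(90)
assert x + 20 == 100    # the one overridden operation
assert x - 1 == 89      # subtraction, comparison, etc. come along for free
assert x * 2 == 180
```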

Don't get me wrong. There are certainly opportunities to make Python go faster, and the core team is working on some of them (for example, one optimization is similar to your creation of additional subtree nodes for attribute lookup for known cases, but in bytecode instead), but I also think that the semantics of Python make large classes of optimization more difficult than for other languages.

For a major example of this kind of dynamicity, lua doesn't chain metatables when looking up metamethods, but Python will look stuff up in as many tables as you have subclasses, and has the complexity of dealing with MRO. That's not something that couldn't be JITed, but the edge cases of what you need to update if someone decides to add or modify a method in a superclass get pretty hairy pretty quickly.

Whereas, in lua, if you want to modify a metamethod and have it affect a particular object, yes, absolutely, you can do that, but it is up to you to modify the direct metatable of the object, rather than some ancestor, because lua is not going to dynamically follow the chain of references on every lookup.
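The chained lookup can be seen directly: Python consults the MRO on each attribute access, so patching a superclass is visible to existing instances immediately.

```python
# Python walks the method resolution order on every lookup, so a change
# to a superclass shows up in subclasses and live instances at once.
class Base:
    def greet(self):
        return "base"

class Child(Base):
    pass

c = Child()
assert Child.__mro__ == (Child, Base, object)
assert c.greet() == "base"

Base.greet = lambda self: "patched"   # mutate the superclass at runtime
assert c.greet() == "patched"         # the existing instance sees it immediately
```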

And, back to the parameter optimization case, I haven't thought that much about it, but there are a lot of Python edge cases in parameter passing that might make that difficult.

And, of course, the use of ref counting instead of mark/sweep has a cost, but people don't like, e.g., PyPy, because their __del__ methods aren't guaranteed to be called immediately when the object goes out of scope. Lua is more like PyPy in this respect.
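The refcounting point in miniature: under CPython, dropping the last reference runs `__del__` deterministically, which is exactly the behavior a tracing collector like PyPy's does not promise.

```python
# CPython's reference counting finalizes an object the moment its last
# reference goes away; a tracing GC (as in PyPy) would do it later.
log = []

class Tracked:
    def __del__(self):
        log.append("finalized")

obj = Tracked()
del obj                      # refcount hits zero: __del__ runs right here
assert log == ["finalized"]
```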

So Python has a lot of legacy decisions that make optimization harder.

Then things that try to be called Python often take shortcuts that make things faster, but don't get any traction, because they aren't 100% compatible.

So cPython is a Schelling point with semantics that are more complicated than some other language Schelling points, with enough momentum that it becomes difficult for other Python implementations to keep up with the standard, while simultaneously having enough inertia to keep people engaged in using it even though the optimizations are coming slowly.

I think the sPy language (discussed here a few weeks ago) has the right idea. "Hey, we're not Python, but if you like Python you might like us." Things that claim to be Python but faster either wither on the vine because of incompatibilities with cPython, or quickly decide they aren't really Python after they've lost their, ahem, Mojo, or both.

(The primary exception to this is microPython, which has a strong following because it literally can go where no other Python can go.)


> This is a really bad idea for reasons already mentioned in other comments.

Agreed, although that doesn't mean it won't be successful.

> Personally, I travel a lot and there's zero chance I would ever take someone's random package.

Me neither, but I'd never cart random strangers around, or let them into my house either when I wasn't there, so I'm not the best judge of these things.

One thing that I will predict is that, if this does, in fact, take off, it will only hasten the enshittification of airline travel. You think you have a hard time trying to find a place to stuff your small carry-on now??!? Just wait. And checked baggage pricing will be through the stratosphere.


And the effect is often multiplicative.

Impersonal corporation which has been improving their capability to make you give up in disgust for decades jumping on the AI bandwagon? Check.

Voice recognition system that doesn't? Check.

Dunning-Kruger level responses once you finally get your voice recognized? Check.

