That was an impressively stupid and/or lazy fuck up, to a point where I think Jones could have a lawsuit against his attorneys there.
IANAL, but it does seem like "sending an entire copy of your client's phone and making no effort to redact it" could be a thing that, you know, is bad counsel.
No, the lawyers can argue about the scope of what they show the jury during trial. There are plenty of rules against biasing or inflaming the jury with unrelated material.
I really enjoyed this deep dive on how far video games have come. Also with ~900 views this video is likely to be buried too quickly for the HN crowd to see it otherwise.
yes! couldn't agree more with your long post.
Especially this part:
"(This is exacerbated when components of the automation require internal-only tooling—the poor data scientist now needs to go read through a bunch of half-written, out-of-date documentation about tools they simply don't care about to do a task that is not a core responsibility for them.)"
"Vertical integration" in my experience has just meant turning a group of simple tools into a complex monolith that no one understands and that is extremely difficult to debug.
It's also very easy to complain about how employees have résumé-focus in their approach to their work: “why should I bother to learn some internal-only tooling that I'll never use anywhere else (for a task that I don't really even care that much about…)?”
But, to borrow a line from Warren VanderBurgh's ‘Children of the Magenta’: “(in the industry) we created you like this.”
Another key flaw of precomposed automations for rigidly-defined work-flows is that they usually exist in precisely the circumstances that give rise to their own subversion. (I might even go so far as to suggest that the circumstances are the cause of both the mistake and the maladaptive behaviours that address the mistake…)
Ultimately, a deep stack of tightly-integrated components forming a precomposed automation that enacts some work-flow—“vertical integration” as the post frames it—is an idea obvious enough that it seems every big company tries it… only to fail in basically the same ways every time.
Sorry to pile on, but the Google sign-on also turned me away. I have a Google account and usually use it to sign up for sites I'm interested in, but when it's used to gate content that could be accessible without a login, I just feel like my data is being harvested.
If you really want to keep the login, then maybe a message like "we won't spam you with email unless you ask, and we only want you to log in to protect against bots" would help.
OP here. Appreciate the honesty. I actually agree - I think the same way and I'm realising now that we don't do a good enough job of communicating the need to the user. Thank you for taking the time to comment.
> If you really want to keep the login, then maybe a message like "we won't spam you with email unless you ask, and we only want you to log in to protect against bots" would help.
Thanks! You actually can't atm. I've been using it personally for the last 8 months or so and have started to need/want a search function, so I'm going to implement it soon after my current priority which is localisation. Cool idea re embedded search, will look into it!
I think for handling truly arbitrary, non-affine transformations we will have to resort to ML. Then we could have matching very similar to how humans do it (where we really don't care whether the transformation is affine or non-affine; we just care whether it's a huge transformation that makes the image unrecognisable). But I really don't know much about ML.
OkCupid talked about this in an article about image hashing [0], and they have a nice quote:
"The end-to-end approach of training a Convolutional Net to embed images such that the positive / negative case distances separate nicely is probably the cool new hypebeast way forward. This would have the added advantage of being able to handle whatever transformations you can think of at training time."
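To make that quote concrete, here is a minimal sketch of the margin-based objective such an embedding net is typically trained with (a triplet loss). This is an illustrative plain-NumPy version of my own; in practice you'd use the batched loss built into a deep-learning framework, and the embeddings would come from the ConvNet itself:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Embeddings of: the original image (anchor), a transformed copy
    # of it (positive), and an unrelated image (negative).
    d_pos = np.linalg.norm(anchor - positive)  # want this small
    d_neg = np.linalg.norm(anchor - negative)  # want this large
    # Hinge: zero loss once d_neg exceeds d_pos by at least `margin`.
    return max(0.0, d_pos - d_neg + margin)

# A well-separated triplet incurs no loss:
a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])   # near the anchor
n = np.array([5.0, 5.0])   # far from the anchor
print(triplet_loss(a, p, n))  # 0.0
```

Training to minimise this over many triplets is what makes the distances "separate nicely": any transformation you include when generating positives (affine or not) is something the embedding learns to ignore.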
It would be great if I could remove overlapping triangles; then I would have near-linear growth of triangles for larger images. But it's very hard to come up with a technique that removes the same overlapping triangles (and leaves the same ones) and is invariant to 2D affine transformations.
For example, if you have 50 overlapping triangles you have to decide which 49 to remove, and you have to remove the same 49 on the query image and the matching image. But because we want to be able to do our searches in sublinear time, we can't compare the two images and decide which triangles should stay or be removed; you have to do all of that in preprocessing, before inserting into the database or querying. Delaunay triangulation looks almost perfect but isn't invariant to 2D affine transformations.
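For scale, a back-of-the-envelope comparison shows why pruning matters: taking every triple of n keypoints grows cubically (C(n, 3)), while a planar Delaunay triangulation of n points has fewer than 2n triangles (a standard linear bound, used here only as a rough stand-in for "near-linear growth"):

```python
from math import comb

# Exhaustive triples vs. a linear Delaunay-style bound (< 2n triangles).
for n in (10, 50, 100):
    print(n, comb(n, 3), 2 * n)
# n=100: 161700 exhaustive triangles vs. fewer than 200
```

That cubic blow-up is exactly why an affine-invariant rule for picking a canonical linear-sized subset of triangles would be so valuable.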