I’d think the lesson here is obvious, but maybe not.
If you thought this project had value, you could’ve contributed to it. You probably still could.
Or, if you think its value is worth $0 (to you), maybe it’s not really that sad (to you).
People are expressing sadness as if there were nothing to be done about it, but, of course, there's a really straightforward thing that could've been done about it (and possibly still could).
I don’t think the previous poster is saying all layoffs are “cowardly”, but pointing out that these ones are.
I think they have a point. Facebook is making money. Tech is in a very dynamic phase, right now. This is a moment of huge opportunity for them, and one that won’t necessarily be as large in the future.
To be contracting right now, rather than making a play, seems like a lack of leadership.
yeah, these big layoffs don't add up to me right now.
if you're making money and you feel that these are good employees, why not take them off the core products and ship them to some other ambitious R&D project?
making the core products leaner is probably a good thing, but surely there's some other big moonshot you'd like to take?
The comment asks what’s the grift? How is he irrelevant?
You ignore the questions and respond with ad hominem attacks.
Obviously, you’ve got a beef with Gruber. That’s fine. But you’re not acquitting yourself well here (and you make us suspect that whatever happened between you and Gruber, you might have had a significant hand in it).
As I said, this is not something I will discuss publicly. But what I can say is that Apple wouldn't be the first tech company to pay for certain stories to be written, as I'm sure you're aware. There was a time when Apple needed John (along with his apparent "unbiased" takes). John's blog was critical for a period of time.
I found this exchange both entertaining and informative. Appreciate you sharing an insider's perspective (while also acknowledging I have no possible way to verify if any of this is even true).
Heh... thanks. I don't expect anyone to just believe this information verbatim; as you said, I'm just some rando on HN. But I did offer to discuss it privately with @simonw.
I like to imagine they’ll mostly capture meta employees using AIs to do work.
Then they’ll deploy models trained on this, and begin capturing employees using AIs that are good at using AIs to do work.
Repeat a few times and they’ll start capturing the keystrokes from people mashing their heads into keyboards in despair and exclaiming, “Why can’t these models do anything anymore!!”
My speculation is that they're going to use this as a way to let people go without doing mass layoffs and having to pay severance. Training AI is just an excuse.
Many, many moons ago I refused to implement a calendar event scraping system at Meta where it would look at all of your meetings on the calendar and do "analysis". IDK what ever happened to that task; I assume it died a death of no one else being willing to do it. This was probably 2011 or so, and I can only imagine it has gotten so much worse.
It's pretty easy to scrape your own calendar events in Meta. I'm not sure about others' as I'm not a manager, but I wouldn't be surprised if it were visible as long as someone is in your report chain.
It's been a while so I don't remember the specifics, but I think the request came from someone working at Accenture on one of the FinTech teams? At the time the proposal had both technical limitations and performance implications on top of the privacy issues. Doing it right would have meant standing up a new set of client exchange endpoints and permissions that were far too permissive, so the task just got ignored for a long time, and I think the manager behind it eventually went away, or they settled for public data inside the company.
The only report-chain-based permissions were around distribution lists, which were just some PowerShell scripts that walked AD every night. These also got used for security groups to gate access to some things. By default, calendars were visible to all authenticated users unless you made them private or made individual events private. The meeting tool leveraged this, for example.
I was working on corp email (among other things) there from 2009 to 2016, so I can't speak for now.
White collar firms with a reputation for paying well don’t cheap out on severance. It’s a cheap way to get employees to sign some stuff reducing the risk of lawsuits, plus their unemployment insurance premiums stay lower.
It’s only once the business is having a cash crunch or will no longer need to hire competitive candidates that they start letting people go without severance.
While it would be a hilarious failure mode to encounter, this is actually a good thing!
These models already have the skills that humans were using them for, so either by training the models to use subagents or by simply inlining the work done by the AI, you have a much easier time training the model to perform tasks from the human distribution. The humans have done the work of making the human distribution look more like an AI distribution.
Not when all of the marketing of LLMs is touting their ability to do exactly that, and that is what is being presented to investors.
If it is as you say, then eventually the house of cards will crumble. Then we can finally go back to work and quit being inundated with needing to use AI for everything.
Not sure Claude Design really competes with Figma.
While it has strong potential to let people iterate on a design without the nuts and bolts of going back and forth with a designer, CD operates at the "leaf-node" level, where the output is generated.
However, a lot of design has a deeper life-cycle than that. There's the collaboration, pitching, review, iteration, asset management, etc.
In fact, the first step for using CD is "onboarding", where it sucks up a design system from your existing assets/resources. It presumes you already have a design.
As it stands now, CD is one-way: existing design -> task-specific resources. This could be very useful, but it only touches part of what a complete design tool does, and for iteration it's not so great. E.g., task-specific concerns don't have a way to feed back to the originating design, and changes to the originating design don't have a direct path to the task-specific output. (When a logo or branding focus changes, or maybe just the spacing guidelines are updated, the ad hoc processes around CD will have to be repeated if the changes are to actually land.)
I'd think AI design integrated with Figma is in a much better position to address these more complicated scenarios.
I doubt Claude Design even cares about these deeper scenarios, BTW -- it's intended as a leaf-node tool. Just pointing out it's not about to replace Figma or other more comprehensive design tools.
>However, a lot of design has a deeper life-cycle than that. There's the collaboration, pitching, review, iteration, asset management, etc.
If corners can be cut, they will be. All those steps would be flattened to something like CD and a couple of side tools.
Companies did "collaboration, pitching, review, iteration" because they had a designer in the loop anyway for the actual final work. Now that they don't have to, how many will just skip those steps? And if it means the end product gets less intentional and "defined", they'd be fine with that.
Agreed. I also think the collaboration, pitching, review bits have been heavily design theater for a while. I'm not saying it was the designer carrying on the charade, but the product team generally. Those steps all really happen only for the final implementation to be a frankensteined fraction of what was discussed. I'm not saying anything remotely like we should be more respectful of the designer's effort; I'm saying there's so much wasted and unused design work. I'm saying you could cut that out of the process and you'd get a very similar end result. That end result might be bad (perhaps it would help to be more respectful of the designer's efforts), but it's the same either way.
The requirements are so unstable—the product team has few strong beliefs—that they change the next day. And then again every few days after. Hopefully, the changes are small enough that design isn't fully resetting each time, but it's not rare to have big changes. The entire project gets swapped not infrequently. What eventually slows the changes is the engineering deadline and the fact that the developers need to start. But the slow drip of product requirements means whatever time budget went to design shrinks. And whatever time went to engineering is eaten into such that now the design needs to be something that can be built in half the original amount of dev time. Each day the designer takes at this point eats into that window, and so the design is dictated by what can get built.
I don't think that has to strictly be viewed like an entirely bad outcome, but for what it is and how it's accomplished, you could just cut the design part out. Besides, you're going to iterate later, right? Right?
>I also think the collaboration, pitching, review bits have been heavily design theater for a while. I'm not saying it was the designer carrying on the charade, but the product team generally. Those steps all really happen only for the final implementation to be a frankensteined fraction of what was discussed.
Absolutely. A chance for middle management and C-levels to bikeshed inconsequential bullshit and feel like they're doing something.
>That end result might be bad (perhaps it would help to be more respectful of the designer's efforts), but it's the same either way.
Can't be much worse than the slow-to-load, 50MB-per-page, flat design full of wasted space, redesigned every year or so to follow the new stupid trends, that we've been getting for the past 15 years.
Exactly the same thing as coding, where “we don’t want to lower the quality bar” platitudes are repeated, while in actual fact that’s exactly what they want from us with AI output, consequences be damned. The stock market will reward us for the short term play.
I agree (at least for now) that Claude Design doesn't directly compete with the core Figma tool. It does directly compete with Figma Make - which is also an LLM-powered tool that generates HTML/CSS/JS output (not a canvas of components, like Figma's core product).
I do think Figma will have a problem in that people will think Claude Design competes with Figma directly.
I expect people in leadership positions aren't comparing "Claude Design vs. Figma", but are comparing "Me and my product manager using Claude Design vs. A designer using Figma."
WRT the west coast, mostly. It's about as long as Japan, but only about half the population. It's certainly populated enough that it's not justifiable that rail travel is so slow.
Less so for the east coast though. From roughly DC to Boston is decently connected with rail, but is not nearly as direct of a corridor as Japan.
Cars were already popular in the US and, in conjunction with the highway system, maybe a good enough solution. If basic transportation is solved, it probably reduces the impetus to build passenger rail for rail's sake.
Really? I take it all the time going to NYC, even though it's not really very convenient for me to get to a northern station. Amtrak is priced to make it a good idea to book tickets in advance. The Shinkansen isn't cheap either, especially if you don't have a pass--not sure of the current details.
It's true to some degree now. But it wasn't very true -- or expected to be true -- back when train lines were being established. That was during westward expansion.
I'm very aware! I live in NYC and have taken many trains up/down the corridor. But it still pales in comparison to the experience I get in Japan (which is cheaper, nicer, faster, more frequent, often more direct, connects up better to local transit within cities, etc.)
> Binary search beats SQLite... For a pure ID lookup, you're paying for machinery you're not using.
You'll likely end up quite a chump if you follow this logic.
SQLite has pretty strong durability and consistency mechanisms that their toy on-disk binary search doesn't have.
(And it is just a toy. It waves away the maintenance of the index, for god's sake, which is almost the entire issue with indexes!)
Typically, people need to change things over time as well, without losing all their data, so backwards compatibility and other aspects of flexibility that sqlite has are likely to matter too.
I think once you move beyond a single file read/written atomically, you might as well go straight to sqlite (or other db) rather than write your own really crappy db.
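To make the trade-off concrete, here's a minimal Python sketch (names and data are illustrative, not from the original article) comparing a pure ID lookup via `bisect` over a sorted in-memory list against an indexed SQLite table. The binary search is indeed simpler per lookup, but SQLite is also giving you index maintenance, atomic updates, and crash-safety that the hand-rolled version simply doesn't have:

```python
import bisect
import sqlite3

# Toy dataset: sorted even IDs mapped to string values.
ids = list(range(0, 100_000, 2))
values = [f"value-{i}" for i in ids]

def lookup_bsearch(key):
    """Pure binary search over the sorted list; no durability, no updates."""
    i = bisect.bisect_left(ids, key)
    if i < len(ids) and ids[i] == key:
        return values[i]
    return None

# Same data in SQLite; the PRIMARY KEY gives an index "for free",
# along with transactional inserts, deletes, and crash recovery.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO kv VALUES (?, ?)", zip(ids, values))

def lookup_sqlite(key):
    row = conn.execute("SELECT val FROM kv WHERE id = ?", (key,)).fetchone()
    return row[0] if row else None

# Both agree on hits and misses.
assert lookup_bsearch(42) == lookup_sqlite(42) == "value-42"
assert lookup_bsearch(43) is None and lookup_sqlite(43) is None
```

The lookups are equivalent, but the moment you need inserts, deletes, or survival across a crash mid-write, the binary-search version has to grow all the machinery SQLite already ships with.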