Hacker News | xer's comments

Out of $60B, what does that make VS Code priced at?

The fundamental assumptions of distributed systems (multiple machines that fail independently, communicate over unreliable networks, and share no clock) force you to solve consensus, Byzantine faults, ordering, consistency vs. availability, and exactly-once delivery.

However, AI agents don't share these problems in the classical sense. Building agents is about context attention, relevance, and information density inside a single ordered buffer. The distributed part is creating an orchestrator that manages these things. At noetive.io we currently work on the context relevance part with our contextual broker Semantik.


Curious idea (found your HN post) but I can't figure out what the use case is.

Endless. We offer SemQL to query in high-dimensional space using distance, direction, and contrast predicates. It lets anyone retrieve events that align with a predicate in global vector space.

Lesson learned: If your recovery plan requires calling any API in the dead region — to detach an IP, describe a route table, launch an instance, read an S3 object, or decrypt a volume — it will fail when you need it most.

Every dependency on the primary region is a dependency on the thing that just broke.
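One way to make that lesson actionable is to lint a failover runbook so no step targets the primary region's endpoints. A minimal sketch (the runbook schema, step names, and endpoints here are illustrative, not any real tool's format):

```python
# Hypothetical sketch: check a recovery runbook so that no step calls an
# API endpoint in the primary (possibly dead) region.
PRIMARY_REGION = "us-east-1"

def offending_steps(runbook: list, primary: str = PRIMARY_REGION) -> list:
    """Return names of steps whose endpoint lives in the primary region."""
    return [step["name"] for step in runbook if primary in step["endpoint"]]

runbook = [
    {"name": "detach-ip",      "endpoint": "ec2.us-east-1.amazonaws.com"},
    {"name": "launch-standby", "endpoint": "ec2.eu-west-1.amazonaws.com"},
    {"name": "read-config",    "endpoint": "s3.us-east-1.amazonaws.com"},
]

print(offending_steps(runbook))  # ['detach-ip', 'read-config']
```

Running a check like this in CI, against whatever format your runbooks actually use, catches the hidden dependency before the outage does.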


This is great! But what if the US invests 1% of GDP in GPU datacenters and then those are not needed because someone created a much more efficient architecture?


More efficiency just means more consumption. Think of when they add lanes to a highway: traffic gets better for a little bit, but very soon the highway is just as congested as before.


More people get where they’re going in the same amount of time though


Look up Jevons Paradox: when something becomes more efficient, consumption can go up, often due to price elasticity.

Think of it like this: imagine car prices go from $200,000 to $20,000. You wouldn't just sell 10x the number of cars. In fact, I just looked up the numbers: worldwide only about 100K cars sell at $200K and higher, whereas roughly 80 million cars are in that affordable category.

So a price drop of 90% allowed sales to go from 0.1M to 80M! I think this means we need more engines, tires, roads, gas, and spare parts.
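The numbers above can be turned into a back-of-the-envelope arc elasticity of demand (a sketch using the comment's rough figures, not real market data):

```python
# Rough elasticity check: price drops from $200K to $20K,
# annual sales go from ~0.1M to ~80M units.
p0, p1 = 200_000, 20_000
q0, q1 = 0.1e6, 80e6

pct_price_change = (p1 - p0) / p0   # -0.9, i.e. a 90% price drop
pct_qty_change = (q1 - q0) / q0     # 799.0, i.e. an ~800x jump in volume

elasticity = pct_qty_change / pct_price_change
print(round(elasticity, 1))  # -887.8
```

Anything below -1 counts as elastic demand; a figure this far below -1 is exactly the regime where Jevons-style effects dominate, since total spending rises even as unit price falls.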


Then they'll be able to use those datacenters much more efficiently.


They will still use capacity. Why would you believe anything different?


I share the view that nobody avoids an exit or migration because of egress fees. In fact, for online migrations, the period of replicating data between providers can go on for months.

But all cloud providers leverage the principle of data locality, or data gravity, which states that compute benefits from being close to the stored data. If a customer moves the data elsewhere, it follows that the compute will soon leave too.


The elephant in the room: will Spotify survive the coming wave of generative AI audio content? There wasn't any mention of that in the letter.

Seems to me like they would need to ditch a lot of silly investments, like original content, platform engineering on Kubernetes, and scaled agile, which together carry costs in the hundreds of millions of dollars, to free up resources to battle new disruptive technologies.


I would think so. People want to listen to Swift and Drake, not some ML song with no relevance.


Tbh I would rather listen to an ML song than Drake. At least the ML song would sound less robotic.


would need an example of this. drake has many (many) styles but i wouldn't describe him as robotic


It is an opinion man. They call it mumble rap for a reason.


Correct. It's also the case that human-generated requests lose their relevance within seconds; a quick retry is all they're worth. For machine-generated requests, a dead-letter queue makes more sense. Poorly engineered backend services would OOM and well-engineered ones would load-shed; if the requests are queued on the application servers, they're doomed to be lost anyway.
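The policy described above can be sketched in a few lines: one quick retry for human-originated requests, then drop; machine-originated requests go to a dead-letter queue for later replay. The handler and queue here are illustrative stand-ins, not a specific framework's API:

```python
from collections import deque

# Stand-in for a durable dead-letter queue (in practice: SQS, Kafka, etc.)
dead_letter_queue = deque()

def handle(request: dict, send) -> bool:
    """Try once; retry once for human requests; dead-letter machine requests."""
    attempts = 2 if request["origin"] == "human" else 1
    for _ in range(attempts):
        if send(request):
            return True
    if request["origin"] == "machine":
        # Park for later replay instead of queueing in-process,
        # where a crash or OOM would lose it anyway.
        dead_letter_queue.append(request)
    return False  # human request: stale within seconds, just drop it
```

The key design point is that the only durable queueing happens off-box; the application server itself never accumulates a backlog it could lose.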


AWS is growing faster than they can staff, luxury problem.


I think there is another side to this story, which is that referral marketing is getting a bad name these days. It's considered fraudulent by many sellers, since these referral sites sometimes rank higher than the sellers themselves. I suspect this is part of an effort by Google to put the sellers above the referrers in the search results.


PDI is just one tool in the toolbox. It's never gonna die.

