It works with curl, so maybe there's a case for either building a proxy that exposes UDS endpoints to a browser, or opening a feature request with browser maintainers to support UDS natively.
And localhost being the exception is often quite painful - I've been stuck on several projects that worked just fine on localhost and then were a pain in the neck to convert to run in secure contexts
There's so little detail here. And for some reason the article makes a false dichotomy between being genuinely "warm and friendly all the time" on one hand, and being rude on the other.
Is this the "prompt engineering" that I keep hearing will be an indispensable job skill for software engineers in the AI-driven future? I had better start learning or I'll be replaced by someone who has.
I wonder how much energy OpenAI spends each day on pink elephant paradoxing goblins. A prompt like that will preoccupy the LLM with goblins on every request.
That is a great point. The machine consumes energy adding goblins to every response, then consumes more energy removing goblins from every response. That's a great attack vector: if (wild imagination ensues) an adversary could do that x100 (goblins, potatoes, dragons, Lightning McQueen, etc.), they could render the machine useless/uneconomical purely on energy-consumption grounds.
Prompt engineering is mostly structured thought. Can you write a lab report? Can you describe the who, what, when, where, and why of a problem and its solution?
You can get it to work with one-off commands or specific instructions, but I think those will be seen as hacks, red flags, prompt smells in the long term.
In this instance I'm assuming most of the "goblin" references were in prose rather than in source code, so the goal of this particular prompt edit was directed toward making the prose better.
> Or I'll just buy as little as possible and buy used whenever possible.
You're forgetting that consuming newly created products is the only way to express yourself or gain a modicum of fleeting happiness. Also, if you're not consuming, you can't "vote with your dollar" which is of course the most effective way in history for ordinary people to hold the powerful accountable.
Last time I had to build an ORDER BY clause in MySQL, prepared statements didn't support query parameters in that clause, which is probably how this happens. It's not an excuse at all, but the standard path of "just throw a ? (or whatever) in there and use bound params" doesn't work for ORDER BY (or at least it didn't at some point in the recent past). You end up having to concatenate strings one way or another.
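If you do have to concatenate, the usual safe pattern is an allowlist: only column names you explicitly approve ever make it into the SQL string. A minimal sketch in Python, with hypothetical column names:

```python
# Hypothetical sortable columns for some table; nothing outside this set
# is ever interpolated into the SQL string.
ALLOWED_SORT_COLUMNS = {"name", "created_at", "price"}

def build_order_by(column, descending=False):
    """Build an ORDER BY fragment safely, since ? placeholders
    can't parameterize identifiers in a prepared statement."""
    if column not in ALLOWED_SORT_COLUMNS:
        raise ValueError(f"unsupported sort column: {column!r}")
    direction = "DESC" if descending else "ASC"
    return f"ORDER BY {column} {direction}"
```

Values in the WHERE clause still go through bound parameters as usual; the allowlist only covers the identifier that parameterization can't handle.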
Without all the unnecessary headings, "color", and constant "Let. That. Sink. In."-esque recaps, this would be two paragraphs. Just let it be! Readers don't need the slop.
Can't we just normalise publishing whatever you put into the LLM instead? I'm sure the author typed things into their favourite AI assistant that regurgitated that long form, LLM-speak style version. I'm sure the original prompt has all the relevant content and was a lot more pleasant to read.
Can't wait for this style of prose to become an incredibly embarrassing faux-pas.
> Can't we just normalise publishing whatever you put into the LLM instead?
Something that has cropped up from time to time in the art world since at least the Dadaists was the idea that rather than distribute the artefact, you distribute the instructions to construct the artefact.
I think if we want to combat slop we have to be honest about why it happens, and the honest truth is that a blog post whose whole content was "I was only using the terminal and Claude code for a couple hours and it drained my whole battery wtf write me an article about this" would not be gaining traction. Some amount of polish and effort is needed, but we can still avoid the most annoying tropes.