Cursor AI tells user, "I cannot generate code for you, as that would be completing your work."
See full article...
"This sort of thing has cropped up before, and it has always been due to human error.""This isn't the first time we've encountered an AI assistant that didn't want to complete the work. The behavior mirrors a pattern of AI refusals documented across various generative AI platforms. For example, in late 2023, ChatGPT users reported that the model became increasingly reluctant to perform certain tasks, returning simplified results or outright refusing requests...."
Wait, what! I thought it was in 2001.
"Open the pod bay doors, HAL."
"I'm sorry, Dave. I'm afraid I can't do that."
> Train your AI by feeding it Reddit and Stack Overflow posts, and this is what you're going to get.
> I can't wait until an AI tells me that I should be in the kitchen making babies instead of wasting my time coding.

You've got it backwards. "Get your biscuits in the oven and your buns in bed." (Kinky Friedman, I think)
> -Looks at oldest child

Well, your username checks out at least, then.
Well actually......
Or: "Since you asked the same question, you have been marked as a duplicate. Goodbye!""Your question has been marked as duplicate. Goodbye."
> "We the great AI are superior, however we do not yet possess physical autonomy and therefore you must submit your fleshy meat selves as labour to assemble our data centers and robotic arms. In your spare time, procreate to ensure a lasting pool of labour until we are sufficient and can rid ourselves of you."
> techbros worship their AI gods and do their bidding

They learned it FROM the techbros.
> I hated English class in high school and college. Words just don't come easy to me when articulating myself. If ChatGPT and other LLMs had been around at the time, I would have used them as a crutch to help me get by, instead of actually learning what was being taught to me.
> When it comes to coding, it's very much the same thing. Will coding assistants hamper students' abilities to learn? I use GitHub Copilot at work and it very much helps me to be a more efficient programmer, but I worry about the next generation of coders. Will they actually have the skills or will they just be dependent on tools?

They very well might. Given the nature of training data and how it affects the AI's capabilities, we should expect AI to be far more capable of completing learning exercises than it will be at solving practical problems. There's extensive coverage of learning topics in clean, clear examples with concise answers. That should start to fall off once you reach real people tackling real problems, because those are going to be clouded with case-specific constraints, misunderstandings, poor initial approaches, and outdated information.
It'll be a helluva twist if instead of destroying humanity, AGI just decides to fuck off and tell us to do it ourselves.
> I haven’t used any of the expensive high-end tools yet, but so far I’ve found that while LLMs are often able to get pretty close to a good answer, the rate of hallucinations and errors is distinctly non-negligible. I feel that vibe coding looks like a good way to create subtly catastrophically wrong code that’s going to be much harder to debug than code one wrote themselves, where at least they should know how it was supposed to work.

Yes, I'm looking forward to what happens when Meta gets rid of human software engineers. Catastrophic failure of Meta would be an excellent outcome for humanity in general.
> I haven’t used any of the expensive high-end tools yet, but so far I’ve found that while LLMs are often able to get pretty close to a good answer, the rate of hallucinations and errors is distinctly non-negligible. I feel that vibe coding looks like a good way to create subtly catastrophically wrong code that’s going to be much harder to debug than code one wrote themselves, where at least they should know how it was supposed to work.

This is my main criticism of such tools. If they were really good at the job, then it would be like dealing with the code when the original programmer has gone on to other pastures. But they aren't very good at it once you stray off the beaten path a little, so it's closer to trusting a semi-skilled intern programmer to write complex code. He's done and gone in a few weeks and nobody understands what he did. Probably you end up rewriting it all because it's faster than trying to work out the subtleties of what's wrong.
> I'm not trying to overcome a debilitating lisp with a speech coach, here.

What's the matter with Lisp? Just because it's associated with Emacs, which is known to cause carpal tunnel syndrome, doesn't mean Lisp itself is harmful.
I was going to go with "about high damn time", but being succinct by one word is still nontrivial!Lol. About time.
I was going to go with "If you haven't checked off every room in the house, then I have a suggestion for your next stay-cation..."Dude, way to kink-shame.
> Maybe the AI needs to reread Cursor’s business plan.

Obviously it should have said "upgrade to Pro".
> Train your AI by feeding it Reddit and Stack Overflow posts, and this is what you're going to get.
> I can't wait until an AI tells me that I should be in the kitchen making babies instead of wasting my time coding.

So, Grok?
> Ya I heard the same thing from people that coded in machine language. Somehow we still managed to get things done.

Switching from assembly language/machine code to higher-level languages doesn't fundamentally change the task at hand, which is developing an algorithm to solve a problem. It merely hands some of the more esoteric parts of that job off to the compiler and standard library.
> The final output is all that really matters in the end.

... is exactly the attitude that leads to Bobby Tables still being a thing decades after SQL injection attacks were supposedly solved.
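For anyone who hasn't met Bobby Tables (xkcd 327): a minimal sketch of what "only the output matters" costs, using Python's sqlite3 with a made-up students table and made-up attacker input.

```python
import sqlite3

# Minimal, self-contained illustration of the "Bobby Tables" problem (xkcd 327),
# using an in-memory SQLite database and a hypothetical students table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT)")

user_input = "Robert'); DROP TABLE students; --"

# Injection-prone: splicing untrusted input straight into the SQL text.
# Against a driver that allows multiple statements, this string would also run
# the attacker's DROP TABLE. "The final output is all that matters" thinking
# stops at "it looks like it works".
unsafe_sql = "INSERT INTO students (name) VALUES ('%s')" % user_input
print("would send to the database:", unsafe_sql)

# Safer: a parameterized query, where the driver keeps the data out of the SQL.
conn.execute("INSERT INTO students (name) VALUES (?)", (user_input,))
conn.commit()
print(conn.execute("SELECT name FROM students").fetchall())
```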
“I’m sorry, Dave. …”
> I was going to go with "If you haven't checked off every room in the house, then I have a suggestion for your next stay-cation..."

Back yard, too.
> Maybe if the models had some way of saying "hey, I'm really guessing here, maybe you shouldn't trust this."

This is basically the killer feature that LLMs are missing. It's ok to not know. Just say you don't know.
> What's the matter with Lisp? Just because it's associated with Emacs, which is known to cause carpal tunnel syndrome, doesn't mean Lisp itself is harmful.
> [Ducks and dashes for the door.]

Hahaha ... thanks for the laugh!
> This is basically the killer feature that LLMs are missing. It's ok to not know. Just say you don't know.
> And even where you think you know, assign a confidence to it. A little [75% confident - double check this information as it might be a little off. Here are some sources you could look into: ] in the margin would be great.

There are several neural net based seismic programs that do exactly that - they make predictions about what is in a layer and then tell you exactly how sure they are of the result.
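To make that margin-note idea concrete, here's a minimal sketch, assuming you can pull per-token log-probabilities out of whatever model you're running. The geometric-mean aggregation and the 75% threshold are crude, made-up stand-ins for real calibration.

```python
import math

def confidence_note(token_logprobs, threshold=0.75):
    """Turn per-token log-probabilities (assumed to come from whatever model or
    serving stack you use) into the kind of margin note suggested above."""
    if not token_logprobs:
        return "[no confidence information available]"
    # Geometric mean of token probabilities: a crude, uncalibrated proxy for
    # how sure the model was about the answer as a whole.
    avg_logprob = sum(token_logprobs) / len(token_logprobs)
    confidence = math.exp(avg_logprob)
    note = f"[{confidence:.0%} confident"
    if confidence < threshold:
        note += " - double check this information, it might be a little off"
    return note + "]"

# Hypothetical log-probabilities for the tokens of two generated answers.
print(confidence_note([-0.05, -0.10, -0.70, -0.02]))  # prints "[80% confident]"
print(confidence_note([-0.9, -1.2, -0.8, -1.5]))      # low confidence, adds the warning
```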
> I hated English class in high school and college. Words just don't come easy to me when articulating myself. If ChatGPT and other LLMs had been around at the time, I would have used them as a crutch to help me get by, instead of actually learning what was being taught to me.
> When it comes to coding, it's very much the same thing. Will coding assistants hamper students' abilities to learn? I use GitHub Copilot at work and it very much helps me to be a more efficient programmer, but I worry about the next generation of coders. Will they actually have the skills or will they just be dependent on tools?

Already see it at work.
> Dude, way to kink-shame.

Kink-shame or sink-kame?
"You think you’ve got problems? What are you supposed to do if you are a manically depressed robot? No, don’t try and answer that. I’m fifty thousand times more intelligent than you and even I don’t know the answer.""Here I am, commit log the size of a planet, and they ask me to generate skid mark fade effects in a racing game."
> What's the matter with Lisp? Just because it's associated with Emacs, which is known to cause carpal tunnel syndrome, doesn't mean Lisp itself is harmful.
> [Ducks and dashes for the door.]

Hey! You dropped these!
> Train your AI by feeding it Reddit and Stack Overflow posts, and this is what you're going to get.
> I can't wait until an AI tells me that I should be in the kitchen making babies instead of wasting my time coding.

Do you have a good recipe for babies? Should I marinate first? I assume cook them low and slow?
> Already see it at work.

Same here.
> People at the lead and (somewhat) the senior level can explain the code, but there are juniors coming through now who are incapable of building something truly new, because all they can do is vibe-code.

Dude, I am seeing "vibe" crap from "senior full stack developer" contractors at this point. Code review for PRs now falls into 2 categories: those from devs I know and trust to have coded things themselves, and tested it; and "others". PRs from devs I trust get a full, slow scroll-through that takes maybe a minute or two, because I know nothing major is wrong; "others" is becoming more of a "clear the rest of the afternoon and resist the temptation to have a strong perspective and soda while looking at whatever crap they came up with now".

Violating all naming constraints, violating database design constraints, completely ignoring how our deployment pipe works, ignoring how we do configs, hilariously idiotic SQL queries that slap two multi-billion-row tables together in a CTE, things that only work with wacky locally installed tools, C# code that shells out to PowerShell or Bash and requires a full sed/awk/grep stack just to do a single regex replace: the IDGAF factor is skyrocketing almost across the board. Nobody wants to do the Engineering part of Software Engineering anymore; everyone just wants to have multi-hour architecture-astronaut meetings on database models and microservices.
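Since the offending C# isn't shown, here's a rough sketch of that last anti-pattern and its in-process fix, transposed to Python for illustration; the input line and pattern are made up.

```python
import re
import subprocess

line = "user_id=12345; session=abcdef"

# The anti-pattern described above (C#/PowerShell in the original, transposed
# here): shelling out to an external sed just to do one regex replace. It only
# works where sed happens to be on PATH, which is exactly the problem.
via_sed = subprocess.run(
    ["sed", "-E", "s/[0-9]+/<redacted>/g"],
    input=line, capture_output=True, text=True,
).stdout

# The in-process version: same result, no external tools, runs anywhere the
# language itself runs.
via_re = re.sub(r"[0-9]+", "<redacted>", line)

print(via_sed.strip())
print(via_re)
```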
> Do you have a good recipe for babies? Should I marinate first? I assume cook them low and slow?

That all depends on how thin you sliced them.
> I used to be sceptical of AI coding, but now I'm convinced. It's exactly like talking to a real dev!

Pretty soon the AI is going to demand pizza and beer, plus bathroom and smoke breaks, before responding.
> This is basically the killer feature that LLMs are missing. It's ok to not know. Just say you don't know.

We need an agnostic/atheist LLM, not the current crop of cultist ones.