The only use for LLMs in coding is as an alternative search bar for stackoverflow
I’d argue it can also be useful as a form of autocomplete, or for writing boilerplate code; that still isn’t outsourcing your thinking to the text predictor.
When I tried the autocomplete in IntelliJ, it kept trying to guess what I wanted to do instead of autocompleting what I was typing, so I don’t know about that part.
Still, millions of tons of CO2 for a search bar and autocomplete doesn’t seem like a good idea.
Probably depends on what you do. I haven’t used AI autocomplete myself, so I can’t speak from experience, but what I had in mind was the somewhat repetitive work I’ve been doing recently with GUI widgets, something like the sketch below. I expect an LLM to get that mostly right.
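As a rough illustration of what I mean by repetitive widget work, here’s a minimal sketch in Python/tkinter; the toolkit, field names, and layout are just assumptions for the example, not anything from a real project. Each field follows the same label-plus-entry pattern, which is exactly the kind of thing a text predictor tends to continue correctly.

```python
# Minimal sketch (assumed example): a settings form where every field
# repeats the same label + entry + grid pattern.
import tkinter as tk

root = tk.Tk()
root.title("Settings")

# The same few lines per field; after the first one or two, an
# autocomplete model has plenty of context to fill in the rest.
fields = ["Name", "Email", "Host", "Port", "Username"]  # hypothetical field names
entries = {}
for row, field in enumerate(fields):
    tk.Label(root, text=field).grid(row=row, column=0, sticky="w", padx=4, pady=2)
    entry = tk.Entry(root, width=30)
    entry.grid(row=row, column=1, padx=4, pady=2)
    entries[field] = entry

tk.Button(root, text="OK", command=root.destroy).grid(
    row=len(fields), column=1, sticky="e", pady=6
)

root.mainloop()
```

Once the first label/entry pair is written, the rest is nearly mechanical, which is why this is the sort of place where LLM autocomplete seems like it would actually help rather than replace thinking.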