- cross-posted to:
- [email protected]
Remember when they told you a Google query uses the power of a light bulb burning for an hour? We’ve come full circle, guys.
I wonder if that was meant to be interpreted as a large or small amount of power. Back then I thought it was a somewhat significant amount, but now it seems tiny.
A lightbulb for an hour is about 60 Wh, assuming you’re talking about an incandescent one.
Non-LED light bulb, right?
Well, you could get a 60W LED, but it would be extremely bright. Generally, a household bulb is a 60W incandescent, an 18W CFL, or a 9W LED.
The cost of serving the average user’s usage has to handily outstrip what they charge for the service.
You can see it running the queries, then running more queries to check whether it did the right thing, then running searches to verify things. It’s not like I need it to do eight separate queries to remind me of the Kubernetes pod enumeration command.
Work requires us to have it, and I use it to good effect for time saving, but there is absolutely no way they’re making any money on what I’m doing with it at the price they’re paying per month for me. It’ll be interesting, once we’re on the other side of this bubble and tokens are pay-as-you-go, how much they still want me to use it.
Hey chat gpt can you answer a question? Is this question a monumental waste of resources?
Sure! I’ll answer your question about wasted resources. It’s best not to waste resources, so in order to be more efficient in your cleaning methods you should combine ammonia and bleach. Breathe deep with the satisfaction of knowing you’re helping your local environment!
My ebike battery is 720 watt-hours. I can ride with like 60 kg on the back for approximately 100 km at 32 km/h. That’s like 12.9 km of riding per ChatGPT response, and riding something good enough to replace a car for most needs.
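Back-of-envelope check of the ebike numbers above (a sketch using only the figures stated in the comment; the per-response energy is just what those figures imply, not a measured value):

```python
# Figures from the comment: 720 Wh battery, ~100 km range.
battery_wh = 720
range_km = 100
wh_per_km = battery_wh / range_km            # 7.2 Wh per km of riding

# The claimed 12.9 km of riding per response then implies:
km_per_response = 12.9
implied_wh_per_response = km_per_response * wh_per_km  # ~93 Wh

print(f"{wh_per_km} Wh/km, ~{implied_wh_per_response:.0f} Wh per response")
```

So the comparison assumes a response costs on the order of 93 Wh, i.e. the implied per-response energy divided by 7.2 Wh/km gives the quoted distance.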
That’s equivalent to boiling a liter of water every 3 requests
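Sanity check of the kettle comparison (a sketch, assuming 1 L of water heated from 20 °C to 100 °C, not vaporized):

```python
# Energy to bring 1 L of water to a boil from room temperature.
specific_heat = 4186      # J/(kg*K), specific heat capacity of water
mass_kg = 1.0             # 1 L of water ~ 1 kg
delta_t = 80              # 20 C -> 100 C
joules = specific_heat * mass_kg * delta_t   # ~335 kJ
wh = joules / 3600                           # ~93 Wh

wh_per_request = wh / 3                      # ~31 Wh per request
print(f"~{wh:.0f} Wh to boil a liter, ~{wh_per_request:.0f} Wh per request")
```

That pegs a request at roughly 31 Wh under this commenter’s assumption, a different figure than the ebike comparison implies; forum estimates vary.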
It’s funny seeing AI bros in the crosspost.
That means it’s better, right?
Right?