I wish I could have some installed, but alas, I live in an apartment building on a high floor, and I don’t get to make the call.
First line of defense: blocking out sunlight from all windows during the day
Second line of defense: aggressive ventilation, creating a cross-breeze whenever the outdoor temperature is lower than the indoor temperature
Third line of defense: a fan, which significantly reduces perceived temperature
Fourth line of defense: acclimatization, e.g. warm showers before bed (supposedly helps)
Fifth line of defense, in case everything else fails (basically, a heatwave): a portable AC
Deadnaming Twitter and the Gulf of Mexico is praxis
A quick journey through the microcosm that is the toxic world of fitness influencers.
I wish it was spelled out more clearly in the video, but please don’t approach fitness the way this guy did.
Sweden: Late Spring/Early Summer/Early Autumn, approximately May, June and September.
Temperatures between 15 and 25 °C, low humidity, and lots of daylight (18 hours in early June). Great conditions for biking and just all-round pleasant to be in.
Early Spring is too wet, Late Summer is too hot and humid, and Late Autumn is too wet and dark. Winter sucks (wet and extremely dark), unless it’s an unusually cold year and we get consistent snow coverage.
A few not yet mentioned:
And a vote for previously mentioned podcasts:
I don’t think DeepSeek has the capability of generating code and executing it inline in the context window to support its answers, the way ChatGPT does. The “used” part of that answer is likely a hallucination, while “or would use” more accurately represents reality.
The concern is that the model doesn’t actually see the world in terms of distinct hexadecimal digits, but instead as tokens of variable size. You can see this in the Tiktokenizer web app: enter some text and it will split it into the series of tokens the model will actually process.
It’s not impossible for the model to work it out anyway, but it is a reason this type of task tends to be harder for LLMs.
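A minimal sketch of the same effect in code, using OpenAI’s tiktoken library (an assumption on my part, since DeepSeek ships its own tokenizer, but the variable-size tokens it produces are the same phenomenon the web app shows):

```python
import tiktoken

# cl100k_base is OpenAI's encoding; DeepSeek's tokenizer differs in detail,
# but both map multiple characters to single tokens.
enc = tiktoken.get_encoding("cl100k_base")

hex_data = "deadbeef0badc0de"
for token_id in enc.encode(hex_data):
    # Print the exact chunk of the input string each token covers.
    print(token_id, enc.decode_single_token_bytes(token_id))
# Tokens will typically span several hex digits at once (e.g. b'dead',
# b'beef'), so the model never "sees" the individual digits.
```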
It’s not out of the question that we get emergent behaviour where the model can connect non-optimally mapped tokens and still translate them correctly, yeah.
It is a concern.
Check out https://tiktokenizer.vercel.app/?model=deepseek-ai%2FDeepSeek-R1 and try entering some freeform hexadecimal data - you’ll notice that it does not cleanly segment the hexadecimal numbers into individual tokens.
Still, this does not quite address the issue of tokenization making it difficult for most models to accurately distinguish between the hexadecimals here.
Having the model write code to solve a problem and then execute that code is an established technique for circumventing this issue, but all of the model interfaces I know of with this capability are very explicit about when they are making use of this tool.
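For what it’s worth, the pattern itself is simple to sketch. This is hypothetical glue code, not any particular vendor’s implementation: the interface extracts code from the model’s reply, runs it in a separate interpreter, and feeds the output back into the context.

```python
import subprocess
import sys

# In a real interface this snippet would be extracted from the model's
# reply; it's hardcoded here purely for illustration.
model_written_code = "print('strawberry'.count('r'))"

# Execute the snippet in a separate interpreter process and capture stdout.
result = subprocess.run(
    [sys.executable, "-c", model_written_code],
    capture_output=True,
    text=True,
    timeout=5,
)
print(result.stdout.strip())  # "3", computed exactly, no tokenization involved
```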
Is this real? On account of how LLMs tokenize their input, this can actually be a pretty tricky task for them to accomplish. This is also the reason it’s hard for them to count the number of ‘R’s in the word ‘Strawberry’.
I doubt this was intentional, but in any case it’s accurate
Without actually knowing how much the physical buttons cost to build, I would guess that the real savings are in process optimization - if all you have for the interface is a screen, then you don’t need the interface design finished before constructing the car - you can parallelize the two tasks.
Insufficient as far as justifications go, but understandably lucrative.
Anything Turing-complete is a powerful tool, but people are reacting negatively because it is so thoroughly the wrong tool for the job.
Basically the only saving grace of Excel-based solutions is that they are built with tools that finance workers comprehend, and that is quite simply not enough. Basing systems at this scale on Excel is criminally negligent.
Implying the vast majority of their voter base can even afford Teslas.
The ones that can would have to forgo owning a huge lifted truck, which I guess would also be a benefit.
It’s still not going to come even close to bailing Elon out of the situation he’s in, on account of so many big export markets now refusing to buy his shit, so he’s fucked regardless. No-win for him, all-win for us
I wish, but that’s not it. It’s basically just the market reacting to the business world realizing Trump isn’t actually going to do good things for them
Dude has a crazy distinct Swedish accent
Brutally ‘solving’ a problem entirely caused by the U.S., all while trying to demand gratitude from the rest of the world.
U.S. foreign intervention in a nutshell, really