That’s what capitalism is.
Anarchists still believe in trade.
Yeah, that brings us up to this tweet pretty much. Afterwards I got a ping from AP saying Israel had agreed to turn around after speaking to Trump, and Trump tweeted some dumb shite like they were just going to do a nice plane wave.
Did I miss something? I think Israel did say they turned around after speaking to him
Salvia if you’re brave. World might fold in on you briefly and it’s legal because nobody has fun on it lol. But it is strong as shit and will certainly fuck up your perception for a few minutes.
Most of the other legal things are pretty naff and will probably just make you feel a bit sick and fuzzy around the edges (morning glory seeds).
Depending on how strict the laws are in your area there might be some loopholes for exotic psychs, but probably not the best entry. Probably best just to go looking for some mushrooms; they won’t show on a standard panel.
Using it like that sounds more American.
The UK loves to binge. Take a couple dozen pills each over a long weekend and people will start talking absolute nonsense. Lots of weed and coke mixed in too, but it seemed to be mostly the mdma and sleep deprivation that triggered it.
Small stuff like them continuing a conversation with you that you weren’t having, and then acting like a dementia patient when you correct them, all the way up to walking in on someone having a full-blown conversation with a laundry detergent bottle.
No set name for the usual level of hallucinations that weren’t delirium. Usually just say something like out my tits/box, full of it, completely fucking spangled, etc.
We seem to use spatial reasoning to compensate for episodic memory.
If I try and remember something, it’s usually my position in the room I remember first. And instead of remembering a picture of an elephant we store the dimensions.
And the reading is because you don’t have to say it aloud in your head; most people only read as fast as they can talk.
Most psychs don’t show up on a panel; just find out what the panel tests for.
I’ve taken every exotic research chemical and psychedelic you can think of. I can confirm hallucinations work the same with aphantasia.
Although I didn’t ‘trip’, which is the delusional state people get into when they take pills/mdma and stay up for a few days. Start talking to plastic bags, on the phone with their hand, etc. Might just be me though.
Welcome to the club. In that case you might have SDAM too (hope not!)
https://sdamstudy.weebly.com/what-is-sdam.html
On the plus side we get a boost to abstract thinking, spatial reasoning and speed reading (if you also don’t have an involuntary monologue when reading).
I don’t really dream much but my watch says my REM is fine.
Cutting out weed after a stint gives me more dreams than usual, but after a while it cuts back to my baseline of only dreaming once in a blue moon.
Take lots of magnesium, have always been like this. Also have aphantasia though so not much to my dreams to remember.
I have aphantasia so don’t really have full fledged scenic dreams with a narrative like some people have.
It’s more like I see my daughter crawling and falling into the plug socket so I need to go in after her, and then I’m suddenly in a field full of wasps.
I don’t ‘see’ much, it’s more like flashes of images and emotions; and I’ll often open my eyes and talk or shout but still be asleep mentally.
Delete your account.
There’s an example right in the article.
Historical events portrayed realistically is one.
She hasn’t prevented fascism, so her efforts have been fruitless? What kind of nonsense argument is that?
She started a worldwide youth climate strike movement involving millions of people across 150+ countries and thousands of strikes. She encouraged global youth and political reforms, even influencing local government decisions in places like the U.S., and is credited with inspiring the “Greta effect”: shifts in public attitudes and behaviors on climate action. She made flight-free travel a trend, boosting the “flygskam” (flight shame) movement in Scandinavia, which reduced domestic flights in Sweden by ~9% in 2019. She put actual pressure on the systems at large and has impacted their decision making. The list goes on. These are deep systemic changes she’s successfully pushing.
With no backing other than having a spine. MAGAts love to pretend her parents pushed her into it but it was the other way about.
She has achieved far more than you ever will.
You’re absolutely right that inference in an LLM is a fixed, deterministic function after training, and that the input space is finite due to the discrete token vocabulary and finite context length. So yes, in theory, you could precompute every possible input-output mapping and store them in a giant table. That much is mathematically valid. But where your argument breaks down is in claiming that this makes an LLM equivalent to a conventional Markov chain in function or behavior.
A Markov chain is not simply defined as “a function from finite context to next-token distribution.” It is defined by a specific type of process where the next state depends on the current state via fixed transition probabilities between discrete states. The model operates over symbolic states with no internal computation. LLMs, even during inference, compute outputs via multi-layered continuous transformations, with attention mixing, learned positional embeddings, and non-linear activations. These mechanisms mean that while the function is fixed, its structure does not resemble a state machine—it resembles a hierarchical pattern recognizer and function approximator.
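For contrast, here is a minimal sketch (a toy weather example, not drawn from anything above) of what a textbook first-order Markov chain actually is in code: a handful of discrete, enumerable states and a fixed transition table, where “inference” is nothing more than a table lookup and a weighted draw. There is no internal computation over the state at all.

```python
# Toy first-order Markov chain: discrete symbolic states, fixed transition
# probabilities, and no computation beyond a dictionary lookup.
import random

transition_table = {
    # current state -> {next state: probability}
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current: str) -> str:
    dist = transition_table[current]
    states, probs = zip(*dist.items())
    return random.choices(states, weights=probs, k=1)[0]

state = "sunny"
for _ in range(5):
    state = next_state(state)
    print(state)
```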
Your claim is essentially that “any deterministic function over a finite input space is equivalent to a table.” This is true in a computational sense but misleading in a representational and behavioral sense. If I gave you a function that maps 4096-bit inputs to 50257-dimensional probability vectors and said, “This is equivalent to a transition table,” you could technically agree, but the structure and generative capacity of that function are not Markovian. That function may simulate reasoning, abstraction, and composition. A Markov chain never does.
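As a rough back-of-envelope (taking a GPT-2-sized 50,257-token vocabulary and a 4,096-token context as illustrative stand-ins for the numbers above), the hypothetical lookup table is a mathematical fiction anyway: the number of distinct contexts it would have to enumerate is far beyond anything storable.

```python
# Back-of-envelope: how many rows would the "giant lookup table" need?
# Number of distinct contexts = vocab_size ** context_len; work in log10
# so the result stays printable.
import math

vocab_size = 50257   # assumed GPT-2-style vocabulary size
context_len = 4096   # assumed context window length

log10_rows = context_len * math.log10(vocab_size)
print(f"roughly 10^{log10_rows:.0f} distinct contexts")  # ~10^19256, vs ~10^80 atoms in the observable universe
```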
You are collapsing implementation equivalence (yes, the function could be stored in a table) with model equivalence (no, it does not behave like a Markov chain). The fact that you could freeze the output behavior into a lookup structure doesn’t change that the lookup structure is derived from a fundamentally different class of computation.
The training process doesn’t “build a Markov chain.” It builds a function that estimates conditional token probabilities via optimization over a non-Markov architecture. The inference process then applies that function. That makes it a stateless function, yes—but not a Markov chain. Determinism plus finiteness does not imply Markovian behavior.
Yes, LLM inference consists of deterministic matrix multiplications applied to the current context. But that simplicity in operations does not make it equivalent to a Markov chain. The definition of a Markov process requires that the next output depends only on the current state. You’re assuming that the LLM’s “state” is its current context window. But in an LLM, this “state” is not discrete. It is a structured, deeply encoded set of vectors shaped by non-linear transformations across layers. The state is not just the visible tokens—it is the full set of learned representations computed from them.
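A toy sketch (made-up sizes, random weights, no resemblance to any real model) of what that “state” looks like in practice: the visible tokens are mapped into a stack of continuous, real-valued vectors, and the output distribution is recomputed from those vectors rather than looked up from a discrete state.

```python
# Toy "forward pass": the effective state is continuous hidden vectors
# computed from the context, not one of a finite set of symbols.
import numpy as np

rng = np.random.default_rng(0)
vocab_size, d_model, context_len = 1000, 64, 8

embedding = rng.normal(size=(vocab_size, d_model))
W1 = rng.normal(size=(d_model, d_model))
W_out = rng.normal(size=(d_model, vocab_size))

tokens = rng.integers(0, vocab_size, size=context_len)  # the visible tokens
hidden = np.tanh(embedding[tokens] @ W1)                # continuous, layered representation
logits = hidden[-1] @ W_out                             # next-token scores
probs = np.exp(logits - logits.max())
probs /= probs.sum()                                    # distribution recomputed from scratch

print(hidden.shape)  # (8, 64): real-valued vectors, not discrete states
print(probs.shape)   # (1000,)
```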
A Markov chain transitions between discrete, enumerable states with fixed transition probabilities. LLMs instead apply a learned function over a high-dimensional, continuous input space, producing outputs by computing context-sensitive interactions. These interactions allow generalization and compositionality, not just selection among known paths.
The fact that inference uses fixed weights does not mean it reduces to a transition table. The output is computed by composing multiple learned projections, attention mechanisms, and feedforward layers that operate in ways no Markov chain ever has. You can’t describe an attention head with a transition matrix. You can’t reduce positional encoding or attention-weighted context mixing into state transitions. These are structured transformations, not symbolic transitions.
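To make the attention point concrete, here is a rough single-head self-attention sketch (toy sizes, random weights, not any particular model’s implementation): the mixing weights are recomputed from the content of the entire context on every call, so there is no fixed per-state transition matrix you could write down in advance.

```python
# Toy scaled dot-product self-attention head: the attention pattern is
# derived from the input itself, not stored as a fixed transition matrix.
import numpy as np

rng = np.random.default_rng(1)
context_len, d_model, d_head = 8, 64, 16

x = rng.normal(size=(context_len, d_model))  # token representations
W_q = rng.normal(size=(d_model, d_head))
W_k = rng.normal(size=(d_model, d_head))
W_v = rng.normal(size=(d_model, d_head))

Q, K, V = x @ W_q, x @ W_k, x @ W_v
scores = Q @ K.T / np.sqrt(d_head)                        # depends on the whole context
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)            # softmax: content-dependent mixing weights
out = weights @ V                                         # attention-weighted combination of the context

print(weights.shape)  # (8, 8): a new mixing matrix per input, computed, not stored
```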
You can describe any deterministic process as a function, but not all deterministic functions are Markovian. What makes a process Markov is not just forgetting prior history. It is having a fixed, memoryless probabilistic structure where transitions depend only on a defined discrete state. LLMs don’t transition between states in this sense. They recompute probability distributions from scratch each step, based on context-rich, continuous-valued encodings. That is not a Markov process. It’s a stateless function approximator conditioned on a window, built to generalize across unseen input patterns.
Be careful, radical centrism is the worst kind of extremism and dulls your surroundings.