Thinking specifically about AI here: if a process does not give a consistent or predictable output (and cannot reliably replace work done by humans) then can it really be considered “automation”?
I quite literally stated that it matters in some cases and not in others.
If the process is the purpose, then it matters. If the end product is the purpose, then it largely doesn’t.
I am saying that we can no longer meaningfully separate the two things.
Cognition and AI? We absolutely can; just because some people fail to doesn’t mean it’s intrinsic.
No, I’m saying we can no longer meaningfully separate the product and the process.
We do by default in capitalism; that’s the basis of commodity fetishism.
That’s not a rejection of what I said, so I assume you agree.
The product exists as a use-value. How it is created does not matter to the user of the use-value unless the process itself is the use, i.e. art. Labor is all that matters from the worker’s perspective; they get none of what they create unless they are paid in kind. What you appear to be arguing is that labor involving AI is an almost supernatural corrupting force, like a calculator.
Ah, to be fair, I did misinterpret your previous statement.
But no, I am arguing that we are not able to ignore knowledge of the production process. Nothing mystical about that.
What does “ignoring knowledge of the production process” even mean? Who is doing the ignoring? What knowledge are we talking about? You’ve said before that using AI is intrinsically damaging, but you have only shown that AI can be misused if we don’t understand its limitations, a sentiment the article you linked echoes exactly, yet you appear to disagree with it.