Maybe some kind of bot detection is done in the background, and humans are able to pass the check easily when they click on the button. Crawlers, on the other hand, would have a harder time getting through.
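If it works the way I'm guessing, it could be something as simple as the sketch below: the "show more" button triggers a bit of JavaScript that answers a challenge, and the server only returns the full content if the answer checks out. Everything here is hypothetical (the names, the heuristics, the whole flow); it's just to illustrate why a crawler that only fetches raw HTML would get stuck while a real browser clicking the button passes without noticing.

```typescript
// Purely illustrative sketch of a click-to-reveal bot gate (Node-style code;
// a real site would run the solving step in the browser via Web Crypto).
import * as crypto from "crypto";

// Issued when the page is rendered; embedded alongside the "show more" button.
function issueChallenge(): { nonce: string; issuedAt: number } {
  return { nonce: crypto.randomBytes(16).toString("hex"), issuedAt: Date.now() };
}

// Runs in the click handler. Executing this at all is the hurdle for simple
// crawlers that only download the HTML and never run any scripts.
function solveChallenge(nonce: string): string {
  return crypto.createHmac("sha256", nonce).update("clicked").digest("hex");
}

// Server-side check before returning the full article body.
function isLikelyHuman(
  challenge: { nonce: string; issuedAt: number },
  answer: string,
  userAgent: string
): boolean {
  const expected = crypto
    .createHmac("sha256", challenge.nonce)
    .update("clicked")
    .digest("hex");
  const fresh = Date.now() - challenge.issuedAt < 60_000; // clicked within a minute
  const browserLike = !/bot|crawler|spider/i.test(userAgent); // crude heuristic
  return expected === answer && fresh && browserLike;
}
```

None of that would stop a headless browser, of course, but it raises the cost enough to filter out the lazier scrapers.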
I guess this is to prevent others from reposting their content easily.
Perhaps to mess with bots that scrape content?