Things can always be boiled down further. But the more you boil something down, the more prior knowledge you assume your readers share, and some of them won’t understand because they lack knowledge you assumed they had.
Conversely, if you include too much information and don’t boil it down enough, you lose people because they get bored of being told things they already know.
In any case, it’s not an easy balance, and I’m just doing my best here to guess which amount of “boiling down” is “just right”. But thanks for your comment, I’ll take it into consideration.
Well, pre-2000 is quite a strong limitation here. In the last 25 years of programming, basically everything has changed. It’s hard to find anything older than 25 years that’s even still relevant.
But I would say Lisp, or rather what it brings: mainly the ability to do meta-programming, using code to change or generate code. It essentially solves what AI is now being used to solve, namely generating boilerplate code. In many languages there is just so much ceremony you have to write before you get to the actual problem-solving part, and meta-programming lets you cleanly circumvent that, greatly reducing the mental load needed to understand programs, if used correctly. But, like many powerful things, it’s hard to use well and easily misused, so it requires you to be very smart about it. Many programming-language features and conventions basically attempt to safeguard against incompetent programmers, or rather to let you work with incompetent programmers without them being more of a detriment than a benefit. Needing to decipher arcane macros is quite challenging indeed.
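To make the “code that generates code” idea concrete without a Lisp at hand, here is a rough Python analogue I sketched up. Real Lisp macros rewrite syntax at compile time; Python can only approximate the spirit at runtime, e.g. by generating a class whose accessor boilerplate you would otherwise write by hand. The names (`make_record`, `Point`) are my own invention for illustration:

```python
def make_record(name, fields):
    """Generate a class with an __init__ and one getter per field,
    instead of hand-writing that boilerplate for every record type."""
    def __init__(self, *values):
        # Store each positional value under a private attribute.
        for field, value in zip(fields, values):
            setattr(self, "_" + field, value)

    namespace = {"__init__": __init__}
    for field in fields:
        # Bind `field` per iteration via a closure-capturing lambda.
        namespace["get_" + field] = (
            lambda f: lambda self: getattr(self, "_" + f)
        )(field)

    # type(name, bases, namespace) builds the class programmatically.
    return type(name, (object,), namespace)

# Usage: one call replaces a dozen lines of accessor boilerplate.
Point = make_record("Point", ["x", "y"])
p = Point(3, 4)
print(p.get_x(), p.get_y())  # -> 3 4
```

A Lisp macro goes further, since it operates on the program’s own syntax tree before compilation, but the payoff is the same: the repetitive part is produced by code, not typed by a human.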
There are a couple of Lisps newer than 2000, like Clojure, which I would have mentioned without your limit, and which I’m now sneaking in by talking about what the limit prevents me from doing.