Basically, at one point the ancient Greeks thought they had all the numbers that existed - whole numbers, negative numbers, fractions (i.e. ratios, or rational numbers) - and that was it. If something wasn't a whole number, they believed it could be written as a ratio of whole numbers.

But then you got Pythagoras, with his famous theorem about the square on the hypotenuse, and someone asked: "if you have a right-angled triangle with both short sides of length one, what is the length of the hypotenuse?" The quick answer is the square root of 1 plus 1, i.e. root 2 - but what IS that? What fraction?
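A quick numeric sketch of that question (plain Python, nothing beyond the standard library; the particular fraction tried at the end is just an arbitrary example of mine):

```python
import math
from fractions import Fraction

# Right-angled triangle with both short sides of length 1:
# by Pythagoras, hypotenuse^2 = 1^2 + 1^2 = 2.
hyp = math.sqrt(1**2 + 1**2)
print(hyp)  # 1.4142135623730951 -- a float approximation, not the exact value

# Any fraction you try only ever approximates it:
approx = Fraction(141421356, 100000000)
print(approx**2 == 2)  # False -- this fraction doesn't square to exactly 2
```

No matter how many digits you pile on, the fraction's square misses 2, which is exactly the frustration the Greeks ran into.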

And they buggered about for ages trying to find the relevant fraction, until some clever clogs proved that no fraction could possibly work - i.e. the square root of two can't be written as a ratio of whole numbers, so it was a number outside all those they believed to exist. Allegedly the guy who discovered this (traditionally said to be Hippasus) was killed for his heresy, but I think that may be just a myth.
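For the curious, the classic proof by contradiction is short enough to sketch:

```latex
Suppose $\sqrt{2} = p/q$ with $p, q$ whole numbers sharing no common factor.
Squaring gives $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2k$.
Then $4k^2 = 2q^2$, i.e. $q^2 = 2k^2$, so $q$ is even as well --
contradicting the assumption that $p/q$ was in lowest terms.
Hence no such fraction exists.
```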

Anyway, as they went on they discovered more irrational numbers - yes, pi etc. - and much later Cantor showed that there can't be a one-to-one mapping between the rational numbers and the irrational ones: the rationals can be listed off one by one (they're countable), but the irrationals can't. In other words there are, in a precise sense, MORE irrational numbers than rational numbers - both are infinite in number, but one infinity is bigger than the other.
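The "rationals are countable" half of that can actually be made concrete. Here's a sketch using the Calkin-Wilf sequence (my choice of enumeration, not something from the discussion above), which walks through every positive rational exactly once:

```python
from fractions import Fraction

def calkin_wilf():
    """Yield every positive rational exactly once (Calkin-Wilf order)."""
    x = Fraction(1, 1)
    while True:
        yield x
        # Standard Calkin-Wilf recurrence: next = 1 / (2*floor(x) - x + 1)
        x = 1 / (2 * (x.numerator // x.denominator) - x + 1)

gen = calkin_wilf()
first = [next(gen) for _ in range(6)]
print(first)
# [Fraction(1, 1), Fraction(1, 2), Fraction(2, 1),
#  Fraction(1, 3), Fraction(3, 2), Fraction(2, 3)]
```

Cantor's diagonal argument is precisely the proof that no such listing can ever exist for the irrationals: given any proposed list, you can build a number that differs from the nth entry in its nth digit, so it's missing from the list.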

So most numbers are irrational. There is an irrational number between any two rational numbers you name; in fact there is an irrational number between any two irrational numbers... infinitely many, even. But WHAT exactly are they? I find them so hard to imagine, because each one can only be expressed as an infinitely long decimal expansion (or given a name), and for me that makes them hard to pin down.
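A sketch of that density claim (the specific construction here is my own illustrative choice): given any two rationals a < b, the number a + (b - a)/√2 is irrational and sits strictly between them.

```python
import math
from fractions import Fraction

def irrational_between(a: Fraction, b: Fraction) -> float:
    """Return (a float approximation of) an irrational strictly between a < b.

    (b - a)/sqrt(2) is irrational, since a nonzero rational divided by an
    irrational is irrational, and adding the rational a keeps it irrational.
    Because 0 < 1/sqrt(2) < 1, the result stays strictly inside (a, b).
    """
    return float(a) + float(b - a) / math.sqrt(2)

x = irrational_between(Fraction(1, 3), Fraction(1, 2))
print(Fraction(1, 3) < x < Fraction(1, 2))  # True
```

Of course the float printed is itself only a rational approximation; the exact value a + (b - a)/√2 is the genuinely irrational one.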

Amazing. And to me this reflects certain limitations, which isn't to imply a ceiling, to our mathematical mode of understanding.

I don't know the maths of Gödel, but I have a basic grasp of the philosophy: that a sufficiently powerful system of knowledge cannot be both consistent and complete, i.e. that any logically consistent set of rules has its blind spots, things it cannot account for. Rules such as the various categories of numbers you mention.

If we are talking about taxonomies of numbers, a nested taxonomy, whatever, then it would follow that every new category can only account for some subset of the numbers hitherto unaccounted for, no?

And clever definitions of new categories can delay the discernment of exceptions to those categories, but arguably not prevent them. Unless, perhaps, the rule is somehow inconsistent, in the interest of being complete, but that seems to run against the established grain of science as being a maximally consistent framework of knowledge.

You could classify all unclassified numbers under a genus of sui generis, in a paradoxical manner, but that seems useless.