Huh. You made me think.
I had an idea today while thinking about this problem. It's simple: zero isn't a number. It's a symbol, a visual aid that represents a concept, not an actual number. We invented it as a useful placeholder so that we didn't have to invent a new symbol for 10, 50, 100, and so on, the way most ancient cultures did. Think about it: it literally represents the absence of a number, not a number itself. Its own properties conflict with the properties numbers have.

Take any number and add zero; nothing changes, because zero isn't a number. It's like not adding anything in the first place. Take any number and multiply it by zero; you get nothing, because you literally take that number zero times, which is the same as not taking it at all. You can't have zero of something, because having zero of it is the same as not having it at all.

Because zero is a symbol and not a number, it gets confusing when you treat it like one. It's like trying to divide 1,000 by the at sign (@). It just doesn't make sense. So it's no wonder what happens when you try to find out how many times nothing fits into something, even if that something is nothing in the first place. It can't be done.
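For what it's worth, here's a quick sketch in Python (the language choice is arbitrary; any language would show the same thing) of the three behaviors above: adding zero, multiplying by zero, and the flat refusal to divide by zero.

    # Plain arithmetic as Python defines it; no libraries involved.
    x = 42

    # Adding zero changes nothing: like not adding in the first place.
    assert x + 0 == x

    # Multiplying by zero wipes the number out: taking it zero times.
    assert x * 0 == 0

    # Dividing by zero is refused outright; Python raises an error
    # rather than invent an answer.
    try:
        1000 / 0
    except ZeroDivisionError as e:
        print("1000 / 0 ->", e)   # prints: 1000 / 0 -> division by zero

    # Even 0 / 0 is refused, matching "even if that something is
    # nothing in the first place."
    try:
        0 / 0
    except ZeroDivisionError as e:
        print("0 / 0 ->", e)      # prints: 0 / 0 -> division by zero

The first two operations sail through, and the third one is the only place the language simply gives up, which lines up with the point above.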