Shrug. We don't really care about measuring it at any given point in time. We're only describing probabilities over time. It's enough to know that it must eventually happen to say entropy must decrease at some point. All we can do is state the probability of each value of q: P(q) = N(q)/(N(1)+N(2)+N(3)+N(4)). To actually know the value of q you have to perform a measurement, and measurements are not free, which goes right back to my previous post.
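Just to make the counting concrete, here's a minimal sketch. It assumes the toy model behind the bit strings elsewhere in this thread (4 particles in 8 cells, 4 cells per side, written like 1000-1101) and that q counts the particles in the left half; those are my assumptions, not anything you have to buy into.

    # Sketch: enumerate every arrangement of 4 particles in 8 cells,
    # tally N(q) = number of arrangements with q particles in the left half,
    # and compute P(q) = N(q) / sum of all N(q).
    from itertools import combinations
    from collections import Counter

    CELLS = 8          # 4 cells on the left, 4 on the right
    PARTICLES = 4      # total particles in the box

    counts = Counter()
    for occupied in combinations(range(CELLS), PARTICLES):
        q = sum(1 for cell in occupied if cell < CELLS // 2)  # particles on the left
        counts[q] += 1

    total = sum(counts.values())   # 70 microstates in all
    for q in sorted(counts):
        print(f"q={q}: N(q)={counts[q]:2d}  P(q)={counts[q]/total:.3f}")

You get N = 1, 16, 36, 16, 1 for q = 0..4, so the even split is 36/70 likely on any given look, while an all-on-one-side split is 1/70.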
If you were obsessed with details, you would ask: how many ways can I have exactly 1000-1101? Just one... With that additional knowledge, the same state suddenly has low entropy.
Sure, and if you identify every atom in each state you can say one arrangement of the atoms in a chamber of mixed gases is as likely as any other, including one where the atoms spell out your name. The math works out the same for how often you will see your initial separated state. At that level of detail, we can say any state you pick is equally unlikely, but there are so many possible states that you will have to wait a very long time to see your name, or any other particular state you pick.
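To illustrate the waiting-time point, here's a hypothetical little simulation in the same 8-cell, 4-particle toy box: sample uniformly random arrangements and count how often you hit one specific target arrangement (I picked the 1000-1101 state from above, encoded as occupied cells (0, 4, 5, 7)) versus any 2-2 "mixed" arrangement.

    import random
    from itertools import combinations

    microstates = list(combinations(range(8), 4))   # all 70 equally likely arrangements
    target = (0, 4, 5, 7)                           # the specific 1000-1101 arrangement
    trials = 100_000
    hit_target = hit_balanced = 0
    for _ in range(trials):
        state = random.choice(microstates)
        hit_target += (state == target)
        hit_balanced += (sum(c < 4 for c in state) == 2)  # 2 particles on the left

    print(f"specific state: ~{hit_target/trials:.4f} (exact 1/70 = {1/70:.4f})")
    print(f"any 2-2 state : ~{hit_balanced/trials:.4f} (exact 36/70 = {36/70:.4f})")

Every individual arrangement shows up about 1 time in 70 here; scale the box up to realistic particle counts and "about 1 time in 70" becomes "effectively never," even though no single arrangement is any rarer than another.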
Of course, there is no entropy if you make every state equally meaningful. All arrangements have equal order, all arrangements have equal probability. But some arrangements have particular consequences, and we can assign meaning on that basis, at an aggregate level, without being subjective.
For instance, if you have a lot of heat in one place and very little in an adjoining space, you can do work. It's very unlikely the heat will do anything but flow toward the cooler region, so you can use the temperature difference to drive a piston and so on. You could, of course, say all the possible distributions of energy in the system are equally meaningful and equally probable, but only a tiny subset of them allow you to do work. We call those states low entropy because there are relatively few of them among the total possible states, and we call distributions that do little or no work high entropy because there are many more of them (even though each individual high-entropy state is no more or less common than any individual low-entropy state). So the distinction between low and high entropy can be more than arbitrary.
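A rough numerical sketch of "relatively few such states," using an even simpler toy model than the one above (my assumption: each of N particles independently sits on the left or right of a partition, so there are 2**N equally likely arrangements, only 2 of which put everything on one side):

    from math import comb

    for n in (4, 20, 100):
        total = 2 ** n
        all_one_side = 2                 # everything left or everything right
        even_split = comb(n, n // 2)     # exact 50/50 split
        print(f"N={n:3d}: all-on-one-side fraction = {all_one_side/total:.2e}, "
              f"exact 50/50 fraction = {even_split/total:.2e}")

Already at N=100 the all-on-one-side fraction is around 10^-30, while the near-even splits account for most of the count, so the "can do work" states are a vanishing sliver of the total even though each one of them is exactly as probable as any balanced arrangement.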
This is why the entropic distinction is between (0/4,4/0) and (2,2) rather than between single states like 0110-1001 vs. 1001-1100. Entropy is only meaningful when you talk about distributions rather than single states.