For Linux-based OSes (or Java virtual machines), I think we always talk about TicksPerSecond = 1000. Isn't that so?
I agree with you that this currently seems true, and that it is unlikely to change (ironically, probably because changing it would break too many programs that have assumed it never will and have therefore hardcoded 1000).
But technically: no. The fact that the .TicksPerSecond conversion factor is provided at all suggests that the way is being left open for it to differ if need be, e.g. if the system clock does not run at a multiple of 1000 Hz. I have seen what manufacturers will do to save a cent on a component, especially on large production runs, and crystals are a prime target for this, because near-enough-is-good-enough, right?
I think I was triggered by the word "always", which makes it sound as if there were some immutable relationship between ticks and milliseconds, and that the two are interchangeable. They're not. It is like saying that a litre of water and a kilogram of water are the same, or that mass and weight are the same thing: in typical day-to-day experience they are, but not always. And sometimes the difference matters, and that is precisely when we don't want people thinking that, e.g., ticks and milliseconds are the same thing.