But your case is wrong anyway, because i <= INT_MAX will always be true by definition: an int can never exceed INT_MAX, so the condition can never become false. By your own argument, < is actually better because it is consistent: i < 0 iterates 0 times, and i < INT_MAX iterates the maximum number of times. INT_MAX + 1 is the problem, not <, which is the standard way to write for loops, and the standard for a reason.
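A minimal C sketch of the point (the bound of 10 is illustrative, not from the thread):

    #include <limits.h>
    #include <stdio.h>

    int main(void) {
        /* With <, the bound reads as the iteration count: i < N runs N times. */
        int count = 0;
        for (int i = 0; i < 10; i++)
            count++;
        printf("i < 10: %d iterations\n", count); /* prints 10 */

        /* The same rule degrades gracefully: i < 0 runs zero times. */
        count = 0;
        for (int i = 0; i < 0; i++)
            count++;
        printf("i < 0: %d iterations\n", count); /* prints 0 */

        /* i <= INT_MAX, by contrast, is always true for an int i, so the
         * loop can never exit through its condition, and incrementing i
         * past INT_MAX is signed overflow, which is undefined behavior:
         *
         *   for (int i = 0; i <= INT_MAX; i++) { ... }  // never terminates
         */
        return 0;
    }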
Just looking at the numbers: they are spending $5G and losing $1G, and their subscriptions are growing, so if they grow another 25% they are making money (ignoring infrastructure costs, which are most likely a tiny fraction of per-user revenue). They also just launched an Android app, so I think their story is looking pretty good. Even before considering that it raises the value of Apple TV hardware and their other devices, and gives them more lock-in for customers in general, that seems like a great investment they made.
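To spell out where the 25% comes from (assuming "spending $5G and losing $1G" means roughly $4G of revenue against $5G of costs): they need another $1G of revenue to break even, and $1G / $4G = 25%, so 25% subscription growth at flat costs would close the gap.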