You have to be more specific here. What exactly does it suggest? .NET has
improved support for threading APIs (like ThreadPool) that older MS OS
versions didn't support, or supported only through some not-well-documented /
not-so-common APIs, like IOCP. Because of that, most programmers usually code
their threads in a fire-and-forget manner, which is not a good approach if
you're writing, for example, server applications.
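To illustrate the fire-and-forget problem (a hedged sketch in Python rather than .NET, since the pattern is the same): when you submit work and drop the handle, the caller never observes the worker's failure, whereas keeping the handle lets server code report errors properly.

```python
import concurrent.futures

def work(n):
    # A worker that fails for some inputs -- in fire-and-forget
    # style this failure would go completely unnoticed.
    if n < 0:
        raise ValueError("negative input")
    return n * n

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    # Fire-and-forget: submit and drop the future; the error vanishes silently.
    pool.submit(work, -1)

    # Better for server code: keep the futures and check their outcomes.
    futures = [pool.submit(work, n) for n in (2, 3, -1)]
    results = []
    for f in futures:
        try:
            results.append(f.result())
        except ValueError as e:
            results.append(f"error: {e}")

print(results)  # [4, 9, 'error: negative input']
```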
However, as Jon stated before, once the thread is created, it is a
thread, that's all. From an OS, or processor, perspective any thread will
behave the same way.
Yes, this could happen, but not because of .NET: Windows 2000/XP/2003
and others do that. This is an OS task, not a .NET task, and there is no other
way for it to be done. Once threads are started, the OS should, at some point,
allow each thread to be scheduled on the processor :).
If you have thread dependencies that impose a situation where one thread's
work should be performed after another thread's job, you should somehow
synchronize them, suspending the dependent thread's execution until the other
thread signals that it has completed its job.
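For instance (a minimal Python sketch of that signaling idea): the dependent thread blocks on an event until the other thread signals that its job is done.

```python
import threading

done = threading.Event()
shared = {}

def producer():
    # Do the prerequisite job, then signal completion.
    shared["value"] = 42
    done.set()

def consumer(out):
    # Suspend until the producer signals; only then use its result.
    done.wait()
    out.append(shared["value"])

results = []
t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer, args=(results,))
t2.start()   # consumer starts first but blocks on the event
t1.start()
t1.join()
t2.join()
print(results)  # [42]
```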
Being "CPU centric" means that you buy a $$$ hardware with 4 processors,
but will keep 3 of them unused, right?
Out of curiosity, in which book did you see that? Is this a "good practice"
recommendation from it?