In Standard ML the problem is decidable; it's just that in the worst case the inferred type is exponential in the length of the program when represented as a directed acyclic graph, and doubly exponential when written out linearly.
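To give a concrete feel for where that blowup comes from, here is the usual textbook example, sketched as it would look in a plain SML session (the names f1 through f5 are just illustrative): each function composes the previous one with itself, so the depth of the inferred pair type doubles at every step -- exponential as a shared DAG, doubly exponential once printed out.

    fun f1 x = (x, x)       (* 'a -> 'a * 'a                              *)
    fun f2 x = f1 (f1 x)    (* 'a -> ('a * 'a) * ('a * 'a)                *)
    fun f3 x = f2 (f2 x)    (* 16 occurrences of 'a in the printed type   *)
    fun f4 x = f3 (f3 x)    (* 256 occurrences                            *)
    fun f5 x = f4 (f4 x)    (* 65536 occurrences: the REPL's printout of
                               this type is already enormous, while the
                               shared DAG only roughly doubles per step   *)

The program is linear in the number of declarations, but the printed type of the last function has about 2^(2^(n-1)) leaves, which is exactly the gap between the DAG and the linear representations mentioned above.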
So one needs a bit less than a Turing oracle to fix the compiler, at least for Standard ML ;-) Some other type systems are actually undecidable.
But at the same time, I am not sure that a Turing oracle by itself can be used to reduce the complexity of the computation enough to make it fit a real computer -- I am just not familiar enough with the theory of computation with such oracles. For example, if one applies such an oracle to this problem in the straightforward way, the oracle always says "yes" (a large enough computer will always complete type inference of a Standard ML program, given enough time), and that does not help us reduce the resources needed to figure out the outcome of the type inference process.