Java has wrapper classes that (for the most part) do the same thing. E.g.
Integer num = 17;
String str = num.toString();
In .NET, int "inherits" from Object in the sense that casting a struct like int to Object boxes it in a reference type, which can then be assigned null. The main difference in Java (in this example) is that the primitive type int is not itself a subtype of Object; however, autoboxing still lets you assign an int value to an Object reference.
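A minimal Java sketch of that autoboxing behavior (class name BoxingDemo is made up for the example):

```java
// Demonstrates that an int value can be assigned to an Object
// reference via autoboxing, even though int is not a subtype of Object.
class BoxingDemo {
    public static void main(String[] args) {
        int primitive = 17;

        // Autoboxing wraps the primitive in an Integer on assignment.
        Object boxed = primitive;
        System.out.println(boxed.getClass().getName()); // java.lang.Integer

        // The boxed reference can be null; the primitive int cannot.
        Integer maybe = null;
        System.out.println(maybe); // null
    }
}
```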
That's not to say Java isn't missing other features (features that give C# more ways to do things, but also make it more complex). E.g. there is no pass by reference in Java, only pass by value (a copy of the reference, or of the primitive). And this is not to say that C# has everything Java has. E.g. enums in Java are reference types, while C# enums are based on a primitive type (Java lets you give each enum value its own implementation of a method). In Java you will only ever see instances of the enum values that are explicitly listed, while in C# you can see unlisted values, or values that have been "or"ed together.
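The constant-specific method bodies mentioned above look like this in Java (the enum name Op and its constants are made up for illustration):

```java
// Each enum constant supplies its own implementation of apply();
// there is no C# equivalent, since C# enums are just named primitives.
enum Op {
    PLUS {
        @Override public int apply(int a, int b) { return a + b; }
    },
    TIMES {
        @Override public int apply(int a, int b) { return a * b; }
    };

    public abstract int apply(int a, int b);
}
```

Calling Op.PLUS.apply(2, 3) dispatches to the PLUS-specific body and returns 5; every value you can ever get from Op is one of the listed constants.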
Ultimately, Java ends up being more verbose but easier to follow step by step, while C# is more compact, easier to write, and better at expressing higher-level concepts. If given a choice between C# and Java:
For "nuclear missile launch control" - Java; it's easier to verify mathematically that it is doing what is expected (in reality, FORTRAN is the way to go).
For "quick one-time script" - C#; it's easier to write and you can avoid some of the boilerplate code that Java would require (though Python or Ruby is probably better still).