Decimals have greater precision but a smaller range.
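You can see the trade-off by printing each type's `MaxValue` constant (both are part of the standard .NET API):

```csharp
using System;

class RangeDemo
{
    static void Main()
    {
        // decimal tops out around 7.9 x 10^28 (but with 28-29 significant digits)...
        Console.WriteLine(decimal.MaxValue); // 79228162514264337593543950335

        // ...while double reaches about 1.8 x 10^308, at far lower precision.
        Console.WriteLine(double.MaxValue);
    }
}
```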

decimal j = (decimal) 10 / 3;  // the cast makes this decimal division, not integer division
Console.WriteLine("{0:0.0000000000000000000}", j);

The result would normally show 28-29 significant digits, but I've reduced that in the output by using a format string in Console.WriteLine.

Note that all numeric data types have some kind of limit, either in range or in the number of significant digits they can hold. You just have to choose the appropriate type (e.g. double, decimal, float) for your needs; if nothing fits your requirements, you may have to use a more advanced maths library, such as an arbitrary-precision one.
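To illustrate the choice, here is a sketch of the same 10/3 calculation done in both double and decimal (the exact digits printed for the double can vary slightly between .NET versions):

```csharp
using System;

class PrecisionDemo
{
    static void Main()
    {
        double d = 10.0 / 3.0;   // ~15-17 significant digits
        decimal m = 10m / 3;     // 28-29 significant digits

        Console.WriteLine(d);
        Console.WriteLine(m);    // 3.3333333333333333333333333333
    }
}
```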