I can't speak to the plans of the BaseX team (I just hope they keep it up!), but I can share a few anecdotal observations from our own experience. We use BaseX as part of a large data analysis system for the US DoD and have been very pleased.
"How much load can BaseX take? The XML file content will be, say, 1 to 2 MB max, but there will be millions of records."
We're successfully storing and querying hundreds of documents averaging about 5 to 10 MB each (some run into the tens of MB). Performance has been great; a few queries take a little time depending on their complexity, but that's a lot of data to sift through. We're not using the client-server architecture, though, so I can't speak to the load handling or performance of the distributed components.
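For context, the queries in question are ordinary XQuery run across the whole collection. A minimal sketch of the kind of thing we do (the database name "analysis" and the element/attribute names are hypothetical, not from our actual system):

```xquery
(: count the records matching a condition across every document
   in a hypothetical BaseX database named "analysis" :)
count(
  for $rec in collection("analysis")//record
  where $rec/@status = "active"
  return $rec
)
```

BaseX resolves `collection("dbname")` to all documents stored in that database, so one query can sift through the entire document set.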
"Is there any .NET API too? (Not a constraint, just curious.)"
We've had very good success cross-compiling BaseX to .NET with IKVM. The cross-compiled libraries run on Windows, Linux, and Solaris under Mono (and work just fine on the MS CLR as well). Our approach was to cross-compile the stock BaseX Java code, customize the resulting assembly with a post-processing tool we wrote using Mono.Cecil (to make some internal members public, etc.), and then wrap the whole thing in an abstraction layer that makes it more .NET-friendly and exposes more "direct" ways of working with the database, as if it were a collection of XmlDocument objects. One nice consequence of this approach is that we can feed the XmlDocument-derived classes to the .NET version of Saxon and run XSLT transformations directly against the database.
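As a rough sketch, the cross-compile step itself is just the IKVM compiler pointed at the BaseX jar; the file and application names below are illustrative, not our actual build:

```
# Cross-compile the stock BaseX jar into a .NET assembly
# (ikvmc is the static compiler that ships with IKVM)
ikvmc -target:library -out:BaseX.dll basex.jar

# The resulting assembly (plus the IKVM runtime DLLs it references)
# runs on the MS CLR, or under Mono on Linux and Solaris:
mono YourApp.exe
```

The Mono.Cecil pass and the XmlDocument-style wrapper sit on top of this: Cecil lets you load BaseX.dll, flip the visibility on selected internal types and members, and write the modified assembly back out, after which the wrapper layer can reach the internals it needs.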
Unfortunately I can't release the code (my company is considering commercializing it, and possibly open-sourcing it under some kind of GPL license, in the future), but I figured it might help to know that something like this is possible.
Dave