We have a different domain for each language, and a different sitemap.xml for each one.
Based on Hans2103's answer, I wrote this one that should be safe to be included in just about every web project:
# URL rewrite solution for robots.txt for multiple domains on a single docroot
# (Apache does not allow trailing comments on RewriteCond lines,
# so each comment goes on its own line.)
RewriteEngine On
# Not an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# Not an existing file
RewriteCond %{REQUEST_FILENAME} !-f
# The host-specific robots file exists
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
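For context, the rewrite assumes a layout like the following under the document root (the domain names here are placeholders, not part of the original answer):

```
docroot/
├── .htaccess            # contains the rewrite rules above
└── robots/
    ├── www.example.com.txt
    └── www.example.de.txt
```

With this layout, a request for http://www.example.com/robots.txt should be answered with the contents of robots/www.example.com.txt, while any domain without a matching file falls through untouched.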
These conditions serve a physical robots.txt untouched if one exists in the docroot, and otherwise look in the robots/ directory for a file named after the requesting host, i.e. robots/%{HTTP_HOST}.txt.
N.B.: The above rewrite has not yet been tested. Feel free to correct me if you find any flaws; I will update this answer for future reference based on any helpful corrections.