How do I extract the domain name from a URL using bash? For example: http://example.com/ to example.com. It must work for any TLD, not just .com.
Instead of using a regex for this, you can use Python's urlparse:
URL=http://www.example.com
python -c "from urlparse import urlparse
url = urlparse('$URL')
print url.netloc"
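This prints www.example.com. If you are on Python 3, the module was renamed to urllib.parse, so an equivalent one-liner (assuming a python3 binary is on your PATH) would look like this:

URL=http://www.example.com
python3 -c "from urllib.parse import urlparse
url = urlparse('$URL')
print(url.netloc)"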
You could either use it like this or put it in a small script. However, this still expects a valid scheme identifier; judging by your comment, your input doesn't necessarily provide one. You can specify a default scheme, but urlparse then expects the netloc to start with '//':
url = urlparse('//www.example.com/index.html','http')
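To see why the leading '//' matters, here is roughly what an interactive session shows with and without it (exact ParseResult fields may vary slightly by Python version):

>>> from urlparse import urlparse
>>> urlparse('www.example.com/index.html', 'http')
ParseResult(scheme='http', netloc='', path='www.example.com/index.html', params='', query='', fragment='')
>>> urlparse('//www.example.com/index.html', 'http')
ParseResult(scheme='http', netloc='www.example.com', path='/index.html', params='', query='', fragment='')

Without the slashes the host ends up in path and netloc stays empty.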
So you will have to prepend the slashes manually when they are missing, e.g.:
python -c "from urlparse import urlparse
if '$URL'.find('://') == -1:
    url = urlparse('//$URL', 'http')
else:
    url = urlparse('$URL')
print url.netloc"
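If you prefer the small-script route mentioned above, one way to wrap this up (as a hypothetical get_domain.sh that takes the URL as its first argument) would be:

#!/bin/bash
# get_domain.sh - print the host part of a URL (sketch, Python 2 urlparse)
URL="$1"
python -c "from urlparse import urlparse
if '$URL'.find('://') == -1:
    url = urlparse('//$URL', 'http')
else:
    url = urlparse('$URL')
print url.netloc"

Then ./get_domain.sh http://example.com/ should print example.com.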