Why do user-defined string literals and integer literals have different behavior?
I'm learning about user-defined literals, and I'm confused by the following test code:

```cpp
#include <chrono>
#include <iostream>
#include <string>

std::chrono::seconds operator"" _s(unsigned long long s)
{
    return std::chrono::seconds(s);
}

std::string operator"" _str(const char *s, std::size_t len)
{
    return std::string(s, len);
}

int main()
{
    auto str = "xxxxx"_str;
    std::cout << str.size() << std::endl;   // works

    auto sec = 4_s;
    std::cout << sec.count() << std::endl;  // works

    std::cout << "xxxxx"_str.size() << std::endl; // works
    std::cout << 4_s.count() << std::endl;        // does NOT work!

    return 0;
}
```

The compiler gives the following error message:

```
error:
```
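For what it's worth, both of the variants below do seem to compile for me, which makes me suspect the problem is lexical rather than semantic. This is only a minimal sketch of what I tried; the parentheses and the extra space are my guesses at keeping the literal separate from the member access:

```cpp
#include <chrono>
#include <iostream>

std::chrono::seconds operator"" _s(unsigned long long s)
{
    return std::chrono::seconds(s);
}

int main()
{
    // Parenthesizing the literal compiles fine.
    std::cout << (4_s).count() << std::endl;

    // So does putting a space before the dot.
    std::cout << 4_s .count() << std::endl;

    return 0;
}
```

So why does `"xxxxx"_str.size()` work as-is, while `4_s.count()` needs the extra parentheses or space?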