Is there any performance difference between the decimal(10,0) unsigned type and the int(10) unsigned type?
decimal(10,0) unsigned
int(10) unsigned
I doubt such a difference matters for performance at all. Most performance issues come down to proper database design and an indexing plan, with server/hardware tuning as the next level.
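If you still want to measure it yourself, here is a minimal sketch you could benchmark (the table names t_int and t_dec and the engine choice are just illustrative assumptions, not part of the question):

CREATE TABLE t_int (
  id INT(10) UNSIGNED NOT NULL,        -- stored as a 4-byte native integer
  PRIMARY KEY (id)
) ENGINE=InnoDB;

CREATE TABLE t_dec (
  id DECIMAL(10,0) UNSIGNED NOT NULL,  -- stored as packed decimal, 5 bytes for 10 digits
  PRIMARY KEY (id)
) ENGINE=InnoDB;

-- Load both tables with the same data, then compare something like:
-- SELECT SUM(id) FROM t_int;
-- SELECT SUM(id) FROM t_dec;

In practice, any gap you measure between the two will usually be dwarfed by the effect of schema design and indexing decisions.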