I am trying to stick to the practice of keeping the database normalized, but that leads to the need to run multiple join queries. Is there a performance degradation if many queries use joins instead of querying a single table?
There is a cost to decomposing tables for the sake of normalization, and part of that cost is performance. You can keep the performance cost of joining decomposed tables low by using a good DBMS, designing your tables well, designing your indexes well, letting the optimizer do its job, and tuning the DBMS-specific features of physical design.
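As a sketch of what "designing indexes well" can mean here, consider a hypothetical pair of normalized tables (the names and columns are illustrative, not from the question): an index on the foreign key gives the optimizer a cheap access path for the join.

```sql
-- Hypothetical normalized schema: customers and their orders.
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);

CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers (customer_id),
    order_date  DATE NOT NULL,
    total       NUMERIC(10, 2) NOT NULL
);

-- An index on the foreign key lets the optimizer resolve the join
-- with an index lookup instead of scanning all of orders.
CREATE INDEX idx_orders_customer_id ON orders (customer_id);

-- A typical join query that the index supports:
SELECT c.name, o.order_date, o.total
FROM customers AS c
JOIN orders    AS o ON o.customer_id = c.customer_id
WHERE c.city = 'Lisbon';
```

With the foreign-key index in place, a decent optimizer turns the join into an indexed lookup per matching customer, which is usually cheap.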
There is also a cost to composing large tables that materialize joins. The cost in terms of update anomalies and programming difficulties is outlined in any good tutorial on normalization. But there is a performance cost to composing tables as well: in many DBMS products, loading a very wide row into memory costs more than loading a narrower one. When you compose very wide tables, you force the DBMS to read big rows, only to discard most of the data it just loaded. This can slow you down even more than normalization does.
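To make both costs concrete, here is a hypothetical composed table that materializes the customer/order join from the sketch above. The customer's city is repeated on every order row, so one change must touch many rows, and a query that needs only two columns still drags each wide row into memory (true of typical row-store DBMS products).

```sql
-- Hypothetical denormalized table materializing the customer/order join.
CREATE TABLE orders_wide (
    order_id      INTEGER PRIMARY KEY,
    order_date    DATE NOT NULL,
    total         NUMERIC(10, 2) NOT NULL,
    customer_id   INTEGER NOT NULL,
    customer_name TEXT NOT NULL,   -- repeated on every order row
    customer_city TEXT NOT NULL    -- repeated on every order row
);

-- Update anomaly: when customer 42 moves, every one of their rows
-- must be rewritten; a missed row leaves the data inconsistent.
UPDATE orders_wide
SET customer_city = 'Porto'
WHERE customer_id = 42;

-- Even a query that needs only two columns makes a row-store DBMS
-- read the full wide row, discarding most of what it loaded.
SELECT order_id, total
FROM orders_wide
WHERE order_date >= DATE '2024-01-01';
```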
In general, don't denormalize at random. When you do need to denormalize, use a design discipline that has been tested by the people who went before you, even if that discipline results in some denormalization. I recommend star schema as such a discipline; it has a lot going for it. That said, there are still plenty of situations where a normalized design works better than a star schema.
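As an illustrative sketch (the table names are hypothetical), a star schema keeps a narrow central fact table of measures, surrounded by deliberately denormalized dimension tables, and queries follow one predictable star-join shape:

```sql
-- Hypothetical star schema: a central fact table plus dimensions.
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date DATE NOT NULL,
    month     INTEGER NOT NULL,
    year      INTEGER NOT NULL
);

CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    category    TEXT NOT NULL   -- denormalized into the dimension on purpose
);

CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
    quantity    INTEGER NOT NULL,
    revenue     NUMERIC(12, 2) NOT NULL
);

-- Typical star-join query: filter on dimensions, aggregate the facts.
SELECT d.year, p.category, SUM(f.revenue) AS revenue
FROM fact_sales  AS f
JOIN dim_date    AS d ON d.date_key = f.date_key
JOIN dim_product AS p ON p.product_key = f.product_key
GROUP BY d.year, p.category;
```

The discipline is in the shape: denormalization is confined to the dimensions, while the fact table stays narrow, so the big table's rows remain cheap to read.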
Learning more than one set of design principles, and learning when to use which, is the second stage of becoming an expert.