I have some questions regarding the usage and significance of the synchronized keyword.
What the other answers are missing is one important aspect: memory barriers. Thread synchronization basically consists of two parts: serialization (mutual exclusion) and visibility. I advise everyone to google for "jvm memory barrier", as it is a non-trivial and extremely important topic if you modify shared data accessed by multiple threads. Having done that, I advise looking at the java.util.concurrent package's classes, which help you avoid explicit synchronization, which in turn keeps programs simpler and more efficient and may even prevent deadlocks.
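To see why visibility matters on its own, here is a minimal sketch of the classic stop-flag bug (the class name StopFlag is just illustrative). Without the volatile modifier, or a synchronized block around both accesses, the worker thread is allowed to never observe the write made by the main thread:

```java
public class StopFlag {
    // Try removing 'volatile': the worker may then spin forever,
    // because nothing forces the write to become visible to it.
    private static volatile boolean stopRequested = false;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (!stopRequested) {
                // busy work
            }
            System.out.println("Worker saw the stop request and exited.");
        });
        worker.start();

        Thread.sleep(1000);
        stopRequested = true;   // the volatile write is what publishes this change
        worker.join();
    }
}
```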
One such example is ConcurrentLinkedDeque. Combined with the command pattern, it lets you build highly efficient worker threads by stuffing commands into the concurrent queue: no explicit synchronization is needed and no deadlocks are possible. Note that ConcurrentLinkedDeque itself is non-blocking (poll() simply returns null when the queue is empty); if you want the worker to block until a command arrives instead of sleeping or busy-polling, use the blocking counterpart LinkedBlockingDeque and call take(), as sketched below.
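Here is a rough sketch of that worker-thread pattern, under the assumption of a blocking LinkedBlockingDeque and Runnable as the command type (the class name CommandWorker and the submit method are made up for illustration):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingDeque;

// A worker thread driven by a queue of commands. take() blocks until a
// command is available, so the worker needs no explicit synchronization,
// no sleep() loop, and cannot deadlock on the queue itself.
public class CommandWorker implements Runnable {
    private final BlockingQueue<Runnable> commands = new LinkedBlockingDeque<>();

    public void submit(Runnable command) {
        commands.add(command);                   // safe to call from any thread
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                Runnable command = commands.take();  // blocks until work arrives
                command.run();
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();      // restore the flag and exit
        }
    }
}
```

Usage would look roughly like: create a CommandWorker, start it on a Thread, and hand it work via submit(() -> doSomething()) from any other thread.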
In short: "memory synchronization" happens implicitly when you start a thread, when a thread ends, when you read a volatile variable, when you unlock a monitor (leave a synchronized block/method), and so on. This "synchronization" affects (in a sense, "flushes") all writes done before that particular action. In the case of the aforementioned ConcurrentLinkedDeque, the documentation says:
Memory consistency effects: As with other concurrent collections, actions in a thread prior to placing an object into a ConcurrentLinkedDeque happen-before actions subsequent to the access or removal of that element from the ConcurrentLinkedDeque in another thread.
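A small sketch of what that guarantee buys you, with a deliberately unsynchronized, hypothetical Message class: the field written before the offer() is guaranteed to be visible to the thread that later poll()s the object, with no locking anywhere in Message itself.

```java
import java.util.concurrent.ConcurrentLinkedDeque;

public class HandOff {
    static class Message {
        String text;   // deliberately not volatile, not synchronized
    }

    public static void main(String[] args) {
        ConcurrentLinkedDeque<Message> deque = new ConcurrentLinkedDeque<>();

        Thread producer = new Thread(() -> {
            Message m = new Message();
            m.text = "hello";      // this write happens-before the offer below
            deque.offer(m);
        });

        Thread consumer = new Thread(() -> {
            Message m;
            while ((m = deque.poll()) == null) {
                Thread.onSpinWait(); // ConcurrentLinkedDeque does not block, so spin
            }
            System.out.println(m.text); // guaranteed to print "hello"
        });

        producer.start();
        consumer.start();
    }
}
```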
This implicit behavior is somewhat pernicious: inexperienced Java programmers tend to take a lot of it for granted, and then one day stumble over this thread when Java isn't doing what it is "supposed" to do in production under a different workload -- and concurrency issues are notoriously hard to test for.