In the book I am using, Introduction to the Design & Analysis of Algorithms, dynamic programming is said to rest on the Principle of Optimality.
The key distinction is that greedy algorithms build solutions "statically": each local choice can be finalized without knowing anything about the other local choices being made. Dynamic programming, by contrast, computes solutions to sub-problems and assembles a solution to the global problem only once all the relevant sub-problems have been considered. The Wikipedia page on greedy algorithms puts it well:
The choice made by a greedy algorithm may depend on choices made so far but not on future choices or all the solutions to the subproblem. It iteratively makes one greedy choice after another, reducing each given problem into a smaller one. In other words, a greedy algorithm never reconsiders its choices. This is the main difference from dynamic programming, which is exhaustive and is guaranteed to find the solution. After every stage, dynamic programming makes decisions based on all the decisions made in the previous stage, and may reconsider the previous stage's algorithmic path to solution.
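To make the contrast concrete, here is a small sketch (my own illustration, not from the book or the Wikipedia article) using coin change with the hypothetical denominations {1, 3, 4}. The greedy version commits to the largest coin that fits and never reconsiders; the dynamic-programming version solves every sub-amount first and only then reads off a globally optimal answer.

```python
def greedy_coins(amount, coins):
    """Greedy: always take the largest coin that fits; never reconsiders."""
    used = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            used.append(c)
    return used  # may be suboptimal

def dp_coins(amount, coins):
    """DP: solve every sub-amount 0..amount before answering the whole problem."""
    INF = float("inf")
    best = [0] + [INF] * amount        # best[a] = fewest coins summing to a
    choice = [0] * (amount + 1)        # choice[a] = a coin used in an optimal solution for a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
                choice[a] = c
    used = []
    while amount > 0:                  # walk back through the recorded choices
        used.append(choice[amount])
        amount -= choice[amount]
    return used

print(greedy_coins(6, [1, 3, 4]))  # [4, 1, 1] -- three coins, suboptimal
print(dp_coins(6, [1, 3, 4]))      # [3, 3]    -- two coins, optimal
```

For amount 6, the greedy choice of 4 looks best locally but forces two 1-coins afterwards, while the exhaustive sub-problem table lets DP discover that 3 + 3 is better.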