I'm looking for an algorithm to fit a bounding box inside a viewport (in my case a DirectX scene). I know about algorithms for centering a bounding sphere in an orthographic view, but I need the same for a perspective camera.
Since you have a bounding box, you should have a basis describing its orientation. It seems that you want to position the camera on the line coincident with the basis vector of the box's smallest dimension, then roll the camera so that the largest dimension is horizontal (assuming you have an OBB and not an AABB). This assumes the aspect ratio is greater than 1.0; if not, you'll want to fit against the vertical dimension instead.
What I would attempt:
1. Scale the basis vector of the smallest dimension by boxWidth / (2 * tan(horizontalFov / 2)) to get scaledBasis. Note that boxWidth is the width of the largest dimension of the box.
2. Position the camera at boxCenter + scaledBasis, looking at the boxCenter.
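To make those two steps concrete, here is a minimal C++ sketch using plain structs rather than any particular DirectX math library, so it stays self-contained. The Vec3, Obb, and fitCameraToObb names are my own for illustration; the only real assumptions are that the box basis is orthonormal and that horizontalFov is in radians.

```cpp
#include <cmath>

// Minimal vector type for illustration; swap in your math library's.
struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
};

// Hypothetical OBB representation: center, orthonormal basis, half-extents.
struct Obb {
    Vec3  center;
    Vec3  axis[3];      // orthonormal basis describing the box orientation
    float halfSize[3];  // half-extent along each axis
};

// Place the camera on the line through the box center along the axis of the
// smallest dimension, far enough back that the largest dimension fills the FOV.
void fitCameraToObb(const Obb& box, float horizontalFov, float aspect,
                    Vec3& cameraPos, Vec3& lookAt)
{
    // Find the smallest and largest dimensions of the box.
    int smallest = 0, largest = 0;
    for (int i = 1; i < 3; ++i) {
        if (box.halfSize[i] < box.halfSize[smallest]) smallest = i;
        if (box.halfSize[i] > box.halfSize[largest])  largest  = i;
    }
    float boxWidth = 2.0f * box.halfSize[largest];

    // If the viewport is taller than it is wide, fit against the vertical
    // FOV instead: verticalFov = 2 * atan(tan(horizontalFov / 2) / aspect).
    float fov = horizontalFov;
    if (aspect < 1.0f)
        fov = 2.0f * std::atan(std::tan(horizontalFov * 0.5f) / aspect);

    // From similar triangles: at this distance the view frustum is exactly
    // boxWidth wide, so the largest dimension just fits.
    float distance = boxWidth / (2.0f * std::tan(fov * 0.5f));

    Vec3 scaledBasis = box.axis[smallest] * distance;
    cameraPos = box.center + scaledBasis;   // boxCenter + scaledBasis
    lookAt    = box.center;                 // looking at the boxCenter
}
```

Note that this measures the distance from the box center, so for a deep box you may want to add the box's extent along the view axis so the near face doesn't cross the near plane; the roll that makes the largest dimension horizontal is likewise left to your camera's up-vector handling.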
Edit:

So I think what you're getting at is that you have the camera at an arbitrary position looking in some direction, and you have an AABB at another position. Without moving the camera to face a side of the box, you want to fit the box in the viewport from where the camera already is. If this is the case, you'll have a bit more work; here's what I suggest:
Unproject two opposing corners of the screen-space bounding box into world space. For the Z value, use the depth of the closest world-space point of your AABB to the camera.
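Unprojection itself is just a multiplication by the inverse of the combined view-projection matrix followed by a perspective divide. A sketch under that assumption; the unproject helper and the row-major Mat4 type are illustrative, not a fixed API, and the inverse matrix is assumed to come from your math library (e.g. XMMatrixInverse in DirectXMath):

```cpp
struct Vec3 { float x, y, z; };
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[4][4]; };  // row-major, applied as matrix * column-vector

static Vec4 transform(const Mat4& mat, const Vec4& v) {
    Vec4 r;
    r.x = mat.m[0][0]*v.x + mat.m[0][1]*v.y + mat.m[0][2]*v.z + mat.m[0][3]*v.w;
    r.y = mat.m[1][0]*v.x + mat.m[1][1]*v.y + mat.m[1][2]*v.z + mat.m[1][3]*v.w;
    r.z = mat.m[2][0]*v.x + mat.m[2][1]*v.y + mat.m[2][2]*v.z + mat.m[2][3]*v.w;
    r.w = mat.m[3][0]*v.x + mat.m[3][1]*v.y + mat.m[3][2]*v.z + mat.m[3][3]*v.w;
    return r;
}

// Map a screen-space point back into world space. (px, py) are pixel
// coordinates, ndcDepth is the normalized device depth (0..1 in Direct3D)
// of the AABB point closest to the camera, and invViewProj is the inverse
// of view * projection.
Vec3 unproject(float px, float py, float ndcDepth,
               float viewportW, float viewportH, const Mat4& invViewProj)
{
    // Pixel -> normalized device coordinates; screen y points down, NDC y up.
    float ndcX = 2.0f * px / viewportW - 1.0f;
    float ndcY = 1.0f - 2.0f * py / viewportH;

    Vec4 world = transform(invViewProj, { ndcX, ndcY, ndcDepth, 1.0f });

    // Perspective divide brings the homogeneous result back to 3D.
    return { world.x / world.w, world.y / world.w, world.z / world.w };
}
```

Calling this on the two opposing corners of the screen-space bounding box gives you a world-space extent at the box's depth, which you can compare against the AABB's size to decide how far to move the camera along its view direction.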