Question
I have a game, initialized to run at 1920x1080. All sprites, Vectors etc. are placed specifically to match that 1920x1080 layout.
I have an enum that states which resolution the game is told to use; 1920x1080 would be the standard.
Is there a way to, let's say, use a resolution of 1280x960 in this way:
- The game window is 1280x960
- The game resolution (backbuffer) is still 1920x1080, but it is scaled down to fit the 1280x960 window.
Something like capturing the screen into a Texture2D just before the draw event and displaying it properly scaled to fit the game window.
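For reference, the render-target approach described above might look roughly like this in XNA/MonoGame: render the scene at the virtual 1920x1080 resolution into a RenderTarget2D, then draw that texture scaled to the window. The field names (graphics, spriteBatch, virtualTarget) are placeholders, not from the question.

// Sketch only: these members live in the Game subclass. "graphics" is the usual
// GraphicsDeviceManager field and "spriteBatch" the usual SpriteBatch field.
RenderTarget2D virtualTarget; // holds the scene rendered at the virtual 1920x1080 resolution

protected override void LoadContent()
{
    spriteBatch = new SpriteBatch(GraphicsDevice);
    virtualTarget = new RenderTarget2D(GraphicsDevice, 1920, 1080);
}

protected override void Draw(GameTime gameTime)
{
    // 1. Render the whole scene into the 1920x1080 render target,
    //    using the original, unscaled sprite positions.
    GraphicsDevice.SetRenderTarget(virtualTarget);
    GraphicsDevice.Clear(Color.CornflowerBlue);
    spriteBatch.Begin();
    // ... draw sprites with their 1920x1080 coordinates ...
    spriteBatch.End();

    // 2. Draw that texture scaled down to the actual window size (e.g. 1280x960).
    GraphicsDevice.SetRenderTarget(null);
    GraphicsDevice.Clear(Color.Black);
    spriteBatch.Begin();
    spriteBatch.Draw(virtualTarget,
        new Rectangle(0, 0, graphics.PreferredBackBufferWidth, graphics.PreferredBackBufferHeight),
        Color.White);
    spriteBatch.End();

    base.Draw(gameTime);
}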
Answer 1:
Actually there is a very simple solution to this. As Pinckerman pointed out, you need to know the screen ratio, but instead of applying it to every position yourself you use it in SpriteBatch.Begin: wrap the scale in Matrix.CreateScale (a Vector3 whose Z component is 1) and pass that matrix as the final argument of SpriteBatch.Begin.
Most of the other arguments can be left null in this case.
Basically it tells the SpriteBatch to draw everything with the scale applied to both position and size. This means that when, in my case, I place something at 640x400 on the 1280x800 virtual screen, it ends up dead center of the screen no matter what resolution I actually run at. Quite useful if you ask me.
Example from my current project:
using System;
using Microsoft.Xna.Framework;

/// <summary>
/// Resolution
/// </summary>
public static class Resolution
{
    private static Vector3 ScalingFactor;
    private static int _preferredBackBufferWidth;
    private static int _preferredBackBufferHeight;

    /// <summary>
    /// The virtual screen size. Default is 1280x800. See the non-existent documentation on how this works.
    /// </summary>
    public static Vector2 VirtualScreen = new Vector2(1280, 800);

    /// <summary>
    /// The screen scale
    /// </summary>
    public static Vector2 ScreenAspectRatio = new Vector2(1, 1);

    /// <summary>
    /// The scale used for beginning the SpriteBatch.
    /// </summary>
    public static Matrix Scale;

    /// <summary>
    /// The scale result of merging VirtualScreen with WindowScreen.
    /// </summary>
    public static Vector2 ScreenScale;

    /// <summary>
    /// Updates the specified graphics device to use the configured resolution.
    /// </summary>
    /// <param name="device">The device.</param>
    /// <exception cref="System.ArgumentNullException">device</exception>
    public static void Update(GraphicsDeviceManager device)
    {
        if (device == null) throw new ArgumentNullException("device");

        // Calculate the scaling factor from the actual backbuffer vs. the virtual screen.
        _preferredBackBufferWidth = device.PreferredBackBufferWidth;
        float widthScale = _preferredBackBufferWidth / VirtualScreen.X;
        _preferredBackBufferHeight = device.PreferredBackBufferHeight;
        float heightScale = _preferredBackBufferHeight / VirtualScreen.Y;

        ScreenScale = new Vector2(widthScale, heightScale);
        ScreenAspectRatio = new Vector2(widthScale / heightScale); // same value for X and Y
        ScalingFactor = new Vector3(widthScale, heightScale, 1);
        Scale = Matrix.CreateScale(ScalingFactor);
        device.ApplyChanges();
    }

    /// <summary>
    /// <para>Determines the draw scaling.</para>
    /// <para>Used to make the mouse scale correctly according to the virtual resolution,
    /// no matter the actual resolution.</para>
    /// <para>Example: 1920x1080 applied to 1280x800: new Vector2(1.5f, 1.35f)</para>
    /// </summary>
    /// <returns></returns>
    public static Vector2 DetermineDrawScaling()
    {
        var x = _preferredBackBufferWidth / VirtualScreen.X;
        var y = _preferredBackBufferHeight / VirtualScreen.Y;
        return new Vector2(x, y);
    }
}
And usage:
/// <summary>
/// Draws the game objects to the screen. Calls Root.Draw.
/// </summary>
/// <param name="gameTime">The game time.</param>
protected override void Draw(GameTime gameTime)
{
    // TODO: Camera
    SpriteBatch.Begin(SpriteSortMode.Immediate, null, null, null, null, null, Resolution.Scale);
    Root.Draw(SpriteBatch, gameTime);
    SpriteBatch.End();
    base.Draw(gameTime);
}
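For completeness, here is one way Resolution.Update might be wired up at startup; the graphics field name and the chosen window size are assumptions on my part, not part of the original answer:

protected override void Initialize()
{
    // Actual window / backbuffer size (whatever the player picked).
    graphics.PreferredBackBufferWidth = 1280;
    graphics.PreferredBackBufferHeight = 800;

    // Computes Resolution.Scale and Resolution.ScreenScale from the
    // backbuffer size vs. the 1280x800 virtual screen, and applies the changes.
    Resolution.Update(graphics);

    base.Initialize();
}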
Hope this example code is helpful.
NOTE: Doing it the way Pinckerman suggested isn't wrong, but I'm led to believe that apart from the obvious difference (a bit more code up front, far less in the long run), his approach may also cost more performance. This is only a hunch, based on the GPU probably being more efficient at these placement calculations than the CPU.
But don't hold me to that.
@Edit:
To get the mouse coordinates relative to objects on the screen:
public Vector2 GetMouseCoords()
{
    // Mouse.GetState() returns the position in actual window (screen) coordinates.
    var state = Mouse.GetState();
    var screenCoords = new Vector2(state.X, state.Y);

    // Divide by the screen scale to get back to virtual-screen coordinates.
    var sceneCoords = screenCoords / Resolution.ScreenScale;
    return sceneCoords;
}
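The converted coordinates live in the same virtual-screen space as your game objects, so they can be compared directly; for example (the Bounds rectangle is hypothetical, not from the original answer):

// Hypothetical hit test: Bounds is a Rectangle defined in virtual-screen coordinates.
public bool IsHovered()
{
    var mouse = GetMouseCoords();
    return Bounds.Contains((int)mouse.X, (int)mouse.Y);
}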
Answer 2:
I think you have to manually modify every Vector2 or Rectangle you need, making them relative to your resolution, or (if you can) use variables based on Window.ClientBounds.Width or Window.ClientBounds.Height, as davidsbro wrote.
Something like this:
Vector2 largeResolution = new Vector2(1920, 1080);
Vector2 smallResolution = new Vector2(1280, 960);
// you should have already set your currentResolution previously
Vector2 screenRatio = currentResolution / largeResolution;
And now your initializations become:
Vector2 position = new Vector2(200, 400) * screenRatio;
Rectangle imageRect = new Rectangle((int)(100 * screenRatio.X), (int)(200 * screenRatio.Y), ... );
This way you only have to add that product: if your current resolution is the large one, your variables stay the same; if it's the small one, every variable is scaled down.
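To tie this back to the resolution enum mentioned in the question, currentResolution could be picked once at startup; the enum and helper below are only illustrative, not from the original post:

// Hypothetical enum and helper; names are placeholders.
public enum GameResolution { R1920x1080, R1280x960 }

public static Vector2 GetResolution(GameResolution res)
{
    switch (res)
    {
        case GameResolution.R1280x960:
            return new Vector2(1280, 960);
        default:
            return new Vector2(1920, 1080);
    }
}

// Usage: compute the ratio once, then scale every position/rectangle with it.
// Vector2 currentResolution = GetResolution(GameResolution.R1280x960);
// Vector2 screenRatio = currentResolution / largeResolution;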
Source: https://stackoverflow.com/questions/18645032/change-and-scale-resolution-of-xna-game