texture2d

Unity: Texture2D ReadPixels for specific Display

£可爱£侵袭症+ submitted on 2021-02-10 06:21:15

Question: Unity has supported multiple display outputs for a while now (up to 8). With the ReadPixels function you can specify an area to read from and an origin coordinate, but you cannot specify a display number to perform the read on. I need to be able to read pixels from a specific display (1-8) with a specific area and origin point. How can I do this, please?

Answer 1: You can achieve ReadPixels for a specific screen/display. You have to do the following: Before I start, I assume you have a…
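The truncated answer above points at the usual workaround: since ReadPixels only reads from the currently active render target, render the camera that feeds the desired display into a RenderTexture, activate it, and read from that. A minimal sketch, assuming a camera per display; the class and method names (`DisplayReader`, `ReadPixelsFromCamera`) are hypothetical:

```csharp
using UnityEngine;

public static class DisplayReader
{
    // Renders the given display's camera into a temporary RenderTexture,
    // then reads the requested region from it with ReadPixels.
    public static Texture2D ReadPixelsFromCamera(Camera displayCamera, Rect area)
    {
        int w = displayCamera.pixelWidth;
        int h = displayCamera.pixelHeight;

        RenderTexture rt = RenderTexture.GetTemporary(w, h, 24);
        RenderTexture previousTarget = displayCamera.targetTexture;
        RenderTexture previousActive = RenderTexture.active;

        displayCamera.targetTexture = rt;
        displayCamera.Render();              // draw this display's view into rt
        RenderTexture.active = rt;           // ReadPixels reads from the active target

        var tex = new Texture2D((int)area.width, (int)area.height,
                                TextureFormat.RGBA32, false);
        tex.ReadPixels(area, 0, 0);          // copy the requested region
        tex.Apply();

        // Restore previous state and release the temporary texture.
        displayCamera.targetTexture = previousTarget;
        RenderTexture.active = previousActive;
        RenderTexture.ReleaseTemporary(rt);
        return tex;
    }
}
```

With one camera assigned per display, calling this with the camera for display N effectively performs ReadPixels on that display.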

How to change texture format from Alpha8 to RGBA in Unity3d?

☆樱花仙子☆ submitted on 2021-01-29 17:47:06

Question: I have been trying to change the format of a texture that a camera gives me in Alpha8 to RGBA, and have been unsuccessful so far. This is the code I've tried:

public static class TextureHelperClass
{
    public static Texture2D ChangeFormat(this Texture2D oldTexture, TextureFormat newFormat)
    {
        // Create new empty texture
        Texture2D newTex = new Texture2D(2, 2, newFormat, false);
        // Copy old texture pixels into the new one
        newTex.SetPixels(oldTexture.GetPixels());
        // Apply
        newTex.Apply();
        return newTex;
    }
}
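The snippet above has two likely problems: the new texture is created at a fixed 2x2 size rather than the source dimensions, and in an Alpha8 texture only the alpha channel carries data, so the copied colors come back with r = g = b = 0. One hedged fix, assuming the alpha value should be treated as a grayscale intensity (whether that is the desired mapping depends on the camera's output):

```csharp
using UnityEngine;

public static class TextureHelperClass
{
    // Converts an Alpha8 texture to a new texture of the requested format.
    // GetPixels on an Alpha8 texture yields colors where only .a is set,
    // so the alpha value is copied into the RGB channels as well.
    public static Texture2D ChangeFormat(this Texture2D oldTexture, TextureFormat newFormat)
    {
        var newTex = new Texture2D(oldTexture.width, oldTexture.height, newFormat, false);

        Color[] pixels = oldTexture.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
        {
            float a = pixels[i].a;
            pixels[i] = new Color(a, a, a, a);  // spread alpha into RGB
        }

        newTex.SetPixels(pixels);
        newTex.Apply();
        return newTex;
    }
}
```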

How to convert type 'UnityEngine.Texture2D' to 'UnityEngine.Sprite'?

北战南征 submitted on 2020-06-29 03:48:16

Question: Hi, I am trying to convert my Texture2D to an Image (and I can't use a RawImage because the resolution doesn't match on phones), but the problem is that Image does not have a Texture element. How do I convert a UnityEngine.Texture2D to an Image sprite?

// Image profile
protected Texture2D pickedImage;
public Texture2D myTexture2D;
public RawImage getRawImageProfile;
public RawImage getRawImageArrayProfile;
public Image getRawImageProfile2;
public Image getRawImageArrayProfile2;
public void PickImageFromGallery(int…
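A UI Image takes a Sprite, not a Texture2D, so the texture has to be wrapped with Sprite.Create first. A minimal sketch; the component name (`ProfileImageBinder`) and the assumption that `targetImage` is assigned in the Inspector are mine:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ProfileImageBinder : MonoBehaviour
{
    public Image targetImage;   // assumed to be assigned in the Inspector

    // Wraps the texture in a Sprite covering its full pixel rect,
    // with the pivot at the center, and assigns it to the Image.
    public void ApplyTexture(Texture2D tex)
    {
        Sprite sprite = Sprite.Create(
            tex,
            new Rect(0f, 0f, tex.width, tex.height),
            new Vector2(0.5f, 0.5f));
        targetImage.sprite = sprite;
    }
}
```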

How to use OpenGL Array Texture?

喜你入骨 submitted on 2020-05-15 11:55:28

Question: I am trying to use a sprite sheet in OpenGL, implementing it through an Array Texture. This is how I load my texture:

QImage image;
image.load("C:\\QtProjects\\project\\images\\spritesheet.png", "png");
const unsigned char* data = image.bits();
int twidth = image.width(), theight = image.height();
glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_RGBA, twidth / 3, theight / 4, 12);
glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 0, twidth / 3, theight / 4, 12, GL_BGRA, GL_UNSIGNED_BYTE, data);
glUseProgram…

How to create 2D texture using DXGI format DXGI_FORMAT_R1_UNORM?

前提是你 submitted on 2020-02-16 10:42:31

Question: I want to create a 1-bit-per-pixel monochrome 2D texture in DirectX 11 using the DXGI format DXGI_FORMAT_R1_UNORM. I have tried the following, but it shows these errors:

D3D11 ERROR: ID3D11Device::CreateTexture2D: Device does not support the format R1_UNORM. [ STATE_CREATION ERROR #92: CREATETEXTURE2D_UNSUPPORTEDFORMAT ]
D3D11: BREAK enabled for the previous message, which was: [ ERROR STATE_CREATION #92: CREATETEXTURE2D_UNSUPPORTEDFORMAT ]

I have tried to create a texture for…
