
c# – Converting a cubemap to an equirectangular panorama

I want to convert from a cubemap [figure 1] to an equirectangular panorama [figure 2].

Figure 1 (cubemap)

Figure 2 (equirectangular panorama)

Going from spherical to cubic is possible (via: Convert 2:1 equirectangular panorama to cube map), but I am stuck on how to reverse it.

Figure 2 is to be rendered onto a sphere using Unity.

Solution:

Assuming the input image is in the following cubemap format:

Cubemap image

The goal is to project the image to the equirectangular format, as shown below:

Equirectangular image

The conversion algorithm is fairly straightforward.
Given a cubemap with 6 faces, to compute the best estimate of the color of each pixel in the equirectangular image:

> First, calculate the polar coordinates that correspond to each pixel of the spherical image (the exact mapping is sketched right after this list).
> Second, using the polar coordinates, form a vector and determine on which face of the cubemap, and on which pixel of that face, the vector lies; just like a raycast from the center of a cube hitting one of its sides at a specific point on that side.
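For reference, the first step comes down to a few lines of arithmetic. With the normalized coordinates u and v computed in the loops of the implementation below, the angles and the unit vector are:

phi   = u * 2 * PI      // azimuth, sweeps the full 360 degrees
theta = v * PI          // polar angle, sweeps 180 degrees from pole to pole

x = -sin(phi) * sin(theta)
y =  cos(theta)
z = -cos(phi) * sin(theta)

The component of (x, y, z) with the largest absolute value tells which cube face the ray hits; dividing the whole vector by that absolute value projects it onto that face.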

Keep in mind that there are several methods to estimate the color of a pixel in the equirectangular image given normalized coordinates (u, v) on a particular face of the cubemap. The most basic method, a very raw approximation that will be used in this answer for simplicity, is to round the coordinates to a specific pixel and use that pixel. Other, more advanced methods could compute an average of several neighboring pixels.
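As an illustration of such a "more advanced" method, a minimal bilinear-sampling sketch could look like the helper below. SampleBilinear is a hypothetical name, not part of the implementation that follows; it assumes the caller passes fractional pixel coordinates instead of rounding them.

// Hypothetical helper: average the four texels surrounding a fractional
// position (x, y) instead of rounding to a single pixel.
static Color SampleBilinear(Texture2D tex, float x, float y)
{
    int x0 = Mathf.Clamp(Mathf.FloorToInt(x), 0, tex.width - 1);
    int y0 = Mathf.Clamp(Mathf.FloorToInt(y), 0, tex.height - 1);
    int x1 = Mathf.Min(x0 + 1, tex.width - 1);
    int y1 = Mathf.Min(y0 + 1, tex.height - 1);
    float tx = x - Mathf.Floor(x);
    float ty = y - Mathf.Floor(y);

    Color bottom = Color.Lerp(tex.GetPixel(x0, y0), tex.GetPixel(x1, y0), tx);
    Color top    = Color.Lerp(tex.GetPixel(x0, y1), tex.GetPixel(x1, y1), tx);
    return Color.Lerp(bottom, top, ty);
}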

The implementation of the algorithm varies depending on the context. I did a quick implementation in Unity3D C# that demonstrates how to use the algorithm in a real-world scenario. It runs on the CPU and there is a lot of room for improvement, but it is easy to understand.

using UnityEngine;

public static class CubemapConverter
{
    public static byte[] ConvertToEquirectangular(Texture2D sourceTexture, int outputWidth, int outputHeight)
    {
        Texture2D equiTexture = new Texture2D(outputWidth, outputHeight, TextureFormat.ARGB32, false);
        float u, v; //normalised texture coordinates, from 0 to 1, starting at lower left corner
        float phi, theta; //Polar coordinates
        int cubeFaceWidth, cubeFaceHeight;

        cubeFaceWidth = sourceTexture.width / 4; //4 horizontal faces
        cubeFaceHeight = sourceTexture.height / 3; //3 vertical faces


        for (int j = 0; j < equiTexture.height; j++)
        {
            //Rows start from the bottom
            v = 1 - ((float)j / equiTexture.height);
            theta = v * Mathf.PI;

            for (int i = 0; i < equiTexture.width; i++)
            {
                //Columns start from the left
                u = ((float)i / equiTexture.width);
                phi = u * 2 * Mathf.PI;

                float x, y, z; //Unit vector
                x = Mathf.Sin(phi) * Mathf.Sin(theta) * -1;
                y = Mathf.Cos(theta);
                z = Mathf.Cos(phi) * Mathf.Sin(theta) * -1;

                float xa, ya, za;
                float a;

                a = Mathf.Max(new float[3] { Mathf.Abs(x), Mathf.Abs(y), Mathf.Abs(z) });

                //Vector Parallel to the unit vector that lies on one of the cube faces
                xa = x / a;
                ya = y / a;
                za = z / a;

                Color color;
                int xPixel, yPixel;
                int xOffset, yOffset;

                if (xa == 1)
                {
                    //Right
                    xPixel = (int)((((za + 1f) / 2f) - 1f) * cubeFaceWidth);
                    xOffset = 2 * cubeFaceWidth; //Offset
                    yPixel = (int)((((ya + 1f) / 2f)) * cubeFaceHeight);
                    yOffset = cubeFaceHeight; //Offset
                }
                else if (xa == -1)
                {
                    //Left
                    xPixel = (int)((((za + 1f) / 2f)) * cubeFaceWidth);
                    xOffset = 0;
                    yPixel = (int)((((ya + 1f) / 2f)) * cubeFaceHeight);
                    yOffset = cubeFaceHeight;
                }
                else if (ya == 1)
                {
                    //Up
                    xPixel = (int)((((xa + 1f) / 2f)) * cubeFaceWidth);
                    xOffset = cubeFaceWidth;
                    yPixel = (int)((((za + 1f) / 2f) - 1f) * cubeFaceHeight);
                    yOffset = 2 * cubeFaceHeight;
                }
                else if (ya == -1)
                {
                    //Down
                    xPixel = (int)((((xa + 1f) / 2f)) * cubeFaceWidth);
                    xOffset = cubeFaceWidth;
                    yPixel = (int)((((za + 1f) / 2f)) * cubeFaceHeight);
                    yOffset = 0;
                }
                else if (za == 1)
                {
                    //Front
                    xPixel = (int)((((xa + 1f) / 2f)) * cubeFaceWidth);
                    xOffset = cubeFaceWidth;
                    yPixel = (int)((((ya + 1f) / 2f)) * cubeFaceHeight);
                    yOffset = cubeFaceHeight;
                }
                else if (za == -1)
                {
                    //Back
                    xPixel = (int)((((xa + 1f) / 2f) - 1f) * cubeFaceWidth);
                    xOffset = 3 * cubeFaceWidth;
                    yPixel = (int)((((ya + 1f) / 2f)) * cubeFaceHeight);
                    yOffset = cubeFaceHeight;
                }
                else
                {
                    Debug.LogWarning("Unknown face, something went wrong");
                    xPixel = 0;
                    yPixel = 0;
                    xOffset = 0;
                    yOffset = 0;
                }

                xPixel = Mathf.Abs(xPixel);
                yPixel = Mathf.Abs(yPixel);

                xPixel += xOffset;
                yPixel += yOffset;

                color = sourceTexture.GetPixel(xPixel, yPixel);
                equiTexture.SetPixel(i, j, color);
            }
        }

        equiTexture.Apply();
        var bytes = equiTexture.EncodeToPNG();
        Object.DestroyImmediate(equiTexture);

        return bytes;
    }
}
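A minimal usage sketch follows. The wrapper below is not part of the original answer; it assumes the source texture uses the 4x3 layout shown above and was imported with the settings listed at the end of this answer.

using UnityEngine;
using System.IO;

public static class CubemapConverterExample
{
    // Converts a readable cubemap-layout Texture2D and writes the result as a PNG.
    public static void SaveEquirectangular(Texture2D source, string outputPath)
    {
        // A 2:1 aspect ratio is the usual choice for equirectangular output;
        // here the width matches the 4-faces-wide source layout.
        int width = source.width;
        int height = width / 2;

        byte[] png = CubemapConverter.ConvertToEquirectangular(source, width, height);
        File.WriteAllBytes(outputPath, png);
    }
}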

To take advantage of the GPU, I created a shader that performs the same conversion. It is much faster than running the conversion pixel by pixel on the CPU, but unfortunately Unity imposes resolution limitations on cubemaps, so its usefulness is limited in scenarios with high-resolution input images.

Shader "Conversion/CubemapToEquirectangular" {
  Properties {
        _MainTex ("Cubemap (RGB)", CUBE) = "" {}
    }

    Subshader {
        Pass {
            ZTest Always Cull Off ZWrite Off
            Fog { Mode off }      

            CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma fragmentoption ARB_precision_hint_fastest
                //#pragma fragmentoption ARB_precision_hint_nicest
                #include "UnityCG.cginc"

                #define PI    3.141592653589793
                #define TWOPI 6.283185307179587

                struct v2f {
                    float4 pos : POSITION;
                    float2 uv : TEXCOORD0;
                };

                samplerCUBE _MainTex;

                v2f vert( appdata_img v )
                {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.uv = v.texcoord.xy * float2(TWOPI, PI);
                    return o;
                }

                fixed4 frag(v2f i) : COLOR 
                {
                    float theta = i.uv.y;
                    float phi = i.uv.x;
                    float3 unit = float3(0,0,0);

                    unit.x = sin(phi) * sin(theta) * -1;
                    unit.y = cos(theta) * -1;
                    unit.z = cos(phi) * sin(theta) * -1;

                    return texCUBE(_MainTex, unit);
                }
            ENDCG
        }
    }
    Fallback Off
}
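One possible way to drive this shader from script (a sketch, not the editor wizard from the project linked below) is to blit into a temporary RenderTexture with a material based on the shader and read the pixels back:

// Hypothetical driver: renders the conversion shader into a RenderTexture
// and reads the result back into a Texture2D.
using UnityEngine;

public static class CubemapShaderConverter
{
    public static Texture2D Convert(Cubemap cubemap, int width, int height)
    {
        var material = new Material(Shader.Find("Conversion/CubemapToEquirectangular"));
        material.SetTexture("_MainTex", cubemap);

        var rt = RenderTexture.GetTemporary(width, height, 0);
        Graphics.Blit(null, rt, material);

        var result = new Texture2D(width, height, TextureFormat.ARGB32, false);
        RenderTexture.active = rt;
        result.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        result.Apply();

        RenderTexture.active = null;
        RenderTexture.ReleaseTemporary(rt);
        return result;
    }
}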

The quality of the resulting image can be improved considerably, either by employing a more sophisticated method to estimate the color of a pixel during the conversion, or by post-processing the resulting image (or both, actually). For example, an image of a larger size could be generated so that a blur filter can be applied, and the result then downsampled to the desired size.
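For instance, a simple 2x supersampling pass could be layered on top of the CPU converter. The sketch below uses a plain 2x2 box filter as a stand-in for the blur-and-downsample idea; ConvertSupersampled and its box filter are illustrative, not part of the linked project.

using UnityEngine;

public static class SupersamplingExample
{
    // Hypothetical post-processing sketch: convert at twice the target size,
    // then average each 2x2 block down to one output pixel.
    public static Texture2D ConvertSupersampled(Texture2D source, int outWidth, int outHeight)
    {
        byte[] bigPng = CubemapConverter.ConvertToEquirectangular(source, outWidth * 2, outHeight * 2);
        var big = new Texture2D(2, 2);
        big.LoadImage(bigPng); // LoadImage resizes the texture to the PNG's dimensions

        var result = new Texture2D(outWidth, outHeight, TextureFormat.ARGB32, false);
        for (int y = 0; y < outHeight; y++)
        {
            for (int x = 0; x < outWidth; x++)
            {
                // Box filter: average the four source pixels covering this output pixel.
                Color c = (big.GetPixel(2 * x,     2 * y) +
                           big.GetPixel(2 * x + 1, 2 * y) +
                           big.GetPixel(2 * x,     2 * y + 1) +
                           big.GetPixel(2 * x + 1, 2 * y + 1)) / 4f;
                result.SetPixel(x, y, c);
            }
        }
        result.Apply();
        return result;
    }
}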

I created a simple Unity project with two editor wizards that show how to properly use either the C# code or the shader shown above. Get it here:
https://github.com/Mapiarz/CubemapToEquirectangular

Remember to set the correct import settings in Unity for your input images (a scripted version is sketched after this list):

> Point filtering
> Truecolor format
> Disable mipmaps
> Non Power of 2: None (only for 2D textures)
> Enable Read/Write (only for 2D textures)
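These settings can also be applied from an editor script. The helper below is a rough sketch using the TextureImporter API; property names vary between Unity versions, so treat it as an assumption rather than exact code.

using UnityEngine;
using UnityEditor;

public static class ImportSettingsHelper
{
    // Hypothetical editor helper: applies the settings listed above to a 2D texture asset.
    public static void ApplyConversionSettings(string assetPath)
    {
        var importer = (TextureImporter)AssetImporter.GetAtPath(assetPath);
        importer.filterMode = FilterMode.Point;                                // Point filtering
        importer.textureCompression = TextureImporterCompression.Uncompressed; // "Truecolor" (uncompressed)
        importer.mipmapEnabled = false;                                        // Disable mipmaps
        importer.npotScale = TextureImporterNPOTScale.None;                    // Non Power of 2: None
        importer.isReadable = true;                                            // Enable Read/Write
        importer.SaveAndReimport();
    }
}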

