I have this script which should populate a List<> with supported screen resolutions. It only works when "Maximize On Play" is unchecked.. why is that? I've added a Debug.Log for each resolution in the foreach loop, and it outputs nothing when Maximize On Play is checked, but works fine when it's unchecked. Has anyone run into the same problem and knows a "quick fix" for this? I'm using Unity 2017.1.0b2.

Code (CSharp):

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;

    public class ResolutionsOptions : MonoBehaviour
    {
        public UISelectField m_SelectField;

        private List<string> m_Resolutions = new List<string>();

        private void Start()
        {
            Resolution[] resolutions = Screen.resolutions;

            foreach (Resolution res in resolutions)
            {
                m_Resolutions.Add(res.width + "x" + res.height);
                Debug.Log(res.width + "x" + res.height);
            }

            m_SelectField.options = m_Resolutions;
        }
    }

(I have reported a bug, but I'm not sure whether this is an editor issue or something I'm doing wrong.)
If it works without Maximize On Play, and it also works in a quick build, then whether it's a bug or by design, at least you'd know it's just a small quirk to remember when using that mode. Maybe resolution queries are disabled in that mode or something.
Screen.resolutions reports weird things in the editor. It's a quirk to know about and live with. Most resolution-related work needs to be done in builds on the target device.
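As a workaround for the editor, you could fall back to the current window size when Screen.resolutions comes back empty. This is just a minimal sketch, assuming the same UISelectField setup as in your script (the empty-check-plus-fallback is my own suggestion, not an official fix):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ResolutionsOptions : MonoBehaviour
{
    public UISelectField m_SelectField;

    private List<string> m_Resolutions = new List<string>();

    private void Start()
    {
        foreach (Resolution res in Screen.resolutions)
        {
            m_Resolutions.Add(res.width + "x" + res.height);
        }

        // Screen.resolutions can come back empty in the editor
        // (e.g. with Maximize On Play), so fall back to the
        // current game-view size rather than showing nothing.
        if (m_Resolutions.Count == 0)
        {
            m_Resolutions.Add(Screen.width + "x" + Screen.height);
        }

        m_SelectField.options = m_Resolutions;
    }
}
```

In a real build on the target device, Screen.resolutions should report the proper list, so the fallback only matters inside the editor.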