
Quick question

Gandalf20000
Forum God
Joined: March 4th, 2008, 9:19 pm

July 13th, 2013, 12:30 am #1

I'm working in Unity on my Android game with my brother. I can get the program to detect every finger on a multitouch screen, and it processes each point and reports its position correctly. It also draws the GUI correctly, using the same rectangles I'm testing the touches against. I must be doing something wrong, though, because my rectangle test never returns true, so none of the buttons register as pressed. Is there something obvious I'm missing?

Here is my code:

Code:

void OnGUI()
{
#if UNITY_ANDROID
    // Reset all button states at the start of every GUI pass.
    pressedTop = pressedDown = pressedLeft = pressedRight = pressedFire = false;

    // Draw the on-screen d-pad and fire button.
    GUI.DrawTexture(left, dpad);
    GUI.DrawTexture(right, dpad);
    GUI.DrawTexture(down, dpad);
    GUI.DrawTexture(top, dpad);
    GUI.DrawTexture(fire, action);

    // Debug output: list the position of every active touch.
    for (int i = 0; i < Input.touchCount; i++)
    {
        GUI.TextField(new Rect(0, 64f + (i * 18f), 128f, 18f),
            "Touch at " + Input.GetTouch(i).position.ToString());
    }

    horizontal = 0.0f;
    vertical = 0.0f;

    // Test each touch against every button's rectangle.
    for (int i = 0; i < Input.touchCount; i++)
    {
        if (InsideRectangle(Input.GetTouch(i).position, left) && pressedLeft == false)
        {
            horizontal -= 1.0f;
            pressedLeft = true;
        }
        else if (InsideRectangle(Input.GetTouch(i).position, right) && pressedRight == false)
        {
            horizontal += 1.0f;
            pressedRight = true;
        }
        else if (InsideRectangle(Input.GetTouch(i).position, top) && pressedTop == false)
        {
            vertical += 1.0f;
            pressedTop = true;
        }
        else if (InsideRectangle(Input.GetTouch(i).position, down) && pressedDown == false)
        {
            vertical -= 1.0f;
            pressedDown = true;
        }
        else if (InsideRectangle(Input.GetTouch(i).position, fire) && pressedFire == false)
        {
            // Only spawn a bullet on the first frame of the touch.
            if (Input.GetTouch(i).phase == TouchPhase.Began)
            {
                Rigidbody b = Instantiate(bullet, transform.position + transform.rotation * Vector3.forward * 8.5f, transform.rotation) as Rigidbody;
                b.AddForce(rigidbody.velocity);
            }
            pressedFire = true;
        }

        // Debug output: confirm this touch was processed.
        GUI.TextField(new Rect(Screen.width - 128, 18 * i, 128, 18), "Touch " + i + " processed.");
    }

    // Debug output: current state of every button.
    GUI.TextField(new Rect(0f, 244f, 128f, 90f),
        "Left: " + pressedLeft + "\n" +
        "Right: " + pressedRight + "\n" +
        "Up: " + pressedTop + "\n" +
        "Down: " + pressedDown + "\n" +
        "Fire: " + pressedFire + "\n");
#endif
}

// Returns true if the point lies inside the rectangle (GUI-space, top-left origin).
bool InsideRectangle(Vector2 position, Rect rectangle)
{
    if (position.x > rectangle.x + rectangle.width)
    {
        return false;
    }
    if (position.x < rectangle.x)
    {
        return false;
    }
    if (position.y > rectangle.y + rectangle.height)
    {
        return false;
    }
    if (position.y < rectangle.y)
    {
        return false;
    }
    return true;
}
It should be fairly straightforward C#. A quick rundown of what it does: reset the button flags to false, draw the buttons, list the position of every screen touch, reset the horizontal and vertical multipliers, and then test each touch against the button rectangles until every touch has been processed. The test takes the touch position and compares it to the bounds of the rectangle surrounding the button. At the end of each loop iteration, it prints that the touch has been processed, and after the loop it prints the state of each button, which should read "true" if that button is being pressed. All the traces work just fine, so my logic in the rectangle test (either in the for loop or in InsideRectangle itself) must be failing.
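Since the position traces look right, one way to narrow it down is to draw each touch position next to the rect it's being tested against, roughly like this (the screen placement and the choice of the left rect here are arbitrary):

Code:

// Debugging sketch: show each raw touch position alongside the rect it is
// tested against and the result of the test.
for (int i = 0; i < Input.touchCount; i++)
{
    Vector2 p = Input.GetTouch(i).position;
    GUI.TextField(new Rect(0f, 128f + (i * 18f), 300f, 18f),
        "Touch " + p + " vs left " + left + " -> " + InsideRectangle(p, left));
}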

Note: I understand the code above uses Unity, but the problem is logic-related, not Unity-related.
EDIT: Ignore this topic. Apparently the y-coordinates of the touch positions are flipped with respect to how the GUI is drawn.
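For anyone who hits the same thing: Input.GetTouch(i).position is in screen coordinates with the origin at the bottom-left (y goes up), while OnGUI rects put the origin at the top-left (y goes down). A minimal sketch of the fix, flipping the y-coordinate before the rectangle test (guiPosition is just an illustrative name):

Code:

// Touch positions: origin at the bottom-left, y increases upward.
// GUI rects:       origin at the top-left,    y increases downward.
// Flip y before testing the touch against GUI-space rectangles.
Vector2 touchPosition = Input.GetTouch(i).position;
Vector2 guiPosition = new Vector2(touchPosition.x, Screen.height - touchPosition.y);

if (InsideRectangle(guiPosition, left) && pressedLeft == false)
{
    horizontal -= 1.0f;
    pressedLeft = true;
}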