So here is some interesting stuff. Inside "tk2dUIManager.cs" I put two debug logs, inside of:
tk2dUIItem RaycastForUIItem(Vector2 screenPos) {
    int cameraCount = sortedCameras.Count;
    for (int i = 0; i < cameraCount; ++i) {
        tk2dUICamera currCamera = sortedCameras[i];
        if (currCamera.RaycastType == tk2dUICamera.tk2dRaycastType.Physics3D) {
            Debug.Log("Input mouse position: " + Input.mousePosition);
            Debug.Log("Ray touch position: " + screenPos);
            ...
And here are the results when running the "UI Demo" itself on the Wii U GamePad:
Top LEFT Corner (touching as far top left as possible with the stylus):
Input mouse position: (8.7, 472.0, 0.0)
Ray touch position: (8.7, 8.0)
Top RIGHT Corner:
Input mouse position: (846.0, 472.0, 0.0)
Ray touch position: (846.0, 8.0)
Bottom LEFT Corner:
Input mouse position: (8.0, 8.0, 0.0)
Ray touch position: (8.0, 472.0)
Bottom RIGHT Corner:
Input mouse position: (846.0, 8.0, 0.0)
Ray touch position: (846.0, 472.0)
Tapping the "Up Down Button" itself directly in the middle:
Input mouse position: (291.6, 321.3, 0.0)
Ray touch position: (292.2, 159.3)
** Note: The button did NOT respond to this touch
Tapping the area directly opposite the "Up Down Button" on the Y axis (slightly above the "On" button), to show that the button's touch target is offset down there. Doing this DID make the "Up Down Button" respond and register the touch:
Input mouse position: (272.2, 173.3, 0.0)
Ray touch position: (272.2, 306.7)
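For what it's worth, in every sample above the X values match but the two Y values sum to roughly 480, which is the GamePad's vertical resolution (the X maximum of ~846 also lines up with its 854-pixel width). That looks like a top-left-origin vs bottom-left-origin flip. A quick sketch of that check, with the values copied straight from the logs above:

    using System;

    class YFlipCheck
    {
        static void Main()
        {
            // (mouseY, rayY) pairs copied from the logs above.
            var pairs = new (float MouseY, float RayY)[]
            {
                (472.0f, 8.0f),    // top left corner
                (472.0f, 8.0f),    // top right corner
                (8.0f, 472.0f),    // bottom left corner
                (8.0f, 472.0f),    // bottom right corner
                (321.3f, 159.3f),  // "Up Down Button" centre
                (173.3f, 306.7f),  // opposite Y position
            };

            foreach (var p in pairs)
            {
                // Each pair sums to ~480, the GamePad's vertical resolution,
                // which is what an origin flip on the Y axis looks like.
                Console.WriteLine($"{p.MouseY} + {p.RayY} = {p.MouseY + p.RayY}");
            }
        }
    }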
Hopefully this all makes sense. You can see there is a Y-axis difference between what is reported as "mousePosition" and the touch position used for raycasting.
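In case it helps anyone else hitting this, one possible workaround (just a sketch, not a confirmed fix; I haven't traced where tk2dUIManager actually reads the touch position) would be to flip the Y coordinate of the GamePad touch before it reaches RaycastForUIItem:

    // Hypothetical helper: convert a top-left-origin touch position to
    // Unity's bottom-left-origin screen space before raycasting.
    // Assumes the raw touch Y grows downward from the top of the screen.
    Vector2 FlipTouchY(Vector2 rawTouchPos)
    {
        return new Vector2(rawTouchPos.x, Screen.height - rawTouchPos.y);
    }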
I'm going to keep digging but hopefully this rings some bells.
Thank you!