I'm new to NeoLoad, so I apologize in advance. I'm not sure how NeoLoad handles the recognition of gestures in a mobile app. Maybe I'm misunderstanding the concepts... but when I record a native app session (e.g. a calculator app), NeoLoad doesn't seem to generate any action when I just press a button in the calc. Am I confusing it with an automation tool?
The goal of NeoLoad is to check whether the server can handle the load generated by all the devices that connect to it: "can my server handle 1000 concurrent users?". So it records the HTTP traffic between the device and the server, then plays it back for the tested number of users. For this purpose, you don't need to care about user interactions (clicks, taps, swipes...); only the traffic matters.
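To make the idea concrete, here is a minimal sketch (plain Python, not NeoLoad) of protocol-level load generation: the same HTTP request is replayed concurrently for many simulated users, with no notion of taps or swipes. The local server stands in for the system under test; the worker and user counts are arbitrary.

```python
import http.server
import threading
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Tiny local server standing in for the system under test.
server = http.server.ThreadingHTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

def virtual_user(_):
    # Each "virtual user" replays the recorded request;
    # the button press that originally triggered it is irrelevant.
    with urlopen(url) as resp:
        return resp.status

# Simulate 100 requests issued by 20 concurrent users.
with ThreadPoolExecutor(max_workers=20) as pool:
    statuses = list(pool.map(virtual_user, range(100)))

print(sum(1 for s in statuses if s == 200), "of", len(statuses), "requests succeeded")
server.shutdown()
```

A real load test would of course replay the full recorded scenario (headers, parameters, think times) rather than a single GET, but the principle is the same: load is generated at the HTTP level.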
If you want to validate GUI-related metrics, then you need to use a GUI-based automation tool. NeoLoad integrates with functional testing tools to answer the question: "Are my performance goals met on the device (tap + network + server processing + client-side rendering) when my server is loaded?". The functional tool regularly measures the end-user experience and sends the measurements to NeoLoad, while NeoLoad generates the load at the HTTP level.