I am running a performance test in which several users each upload 5 documents at the same time, 25 documents in total.
However, whenever a file larger than 20 MB is uploaded (files are chosen randomly using variables), the request fails with a 502 response code.
Files smaller than 20 MB upload successfully at runtime. And if I upload a large file manually (through the website instead of the script), it works fine.
This 502 error is coming from a proxy server. NeoLoad itself does not limit the size of an uploaded file as long as your load generator has enough memory.
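To confirm that the limit sits in front of the application rather than in NeoLoad, you can try replaying the upload outside the tool. Here is a minimal sketch in plain Python — the upload URL and the form field name ("file") are assumptions, so adjust them to match what your recorded request actually sends:

```python
import os
import urllib.request

def build_multipart(field_name, filename, payload, boundary="neoload-test"):
    """Build a minimal multipart/form-data body for a single file upload."""
    head = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field_name}"; filename="{filename}"\r\n'
        f"Content-Type: application/octet-stream\r\n\r\n"
    ).encode()
    tail = f"\r\n--{boundary}--\r\n".encode()
    content_type = f"multipart/form-data; boundary={boundary}"
    return head + payload + tail, content_type

def upload(url, path):
    """POST a file; a 502 here, outside NeoLoad, points at the proxy, not the tool."""
    with open(path, "rb") as f:
        body, ctype = build_multipart("file", os.path.basename(path), f.read())
    req = urllib.request.Request(url, data=body, headers={"Content-Type": ctype})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

If this script gets the same 502 with a payload over 20 MB, the proxy is enforcing a request-body size limit, and the fix belongs on the proxy side rather than in the NeoLoad scenario.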
Are you able to reproduce the issue when running a user validation?
If you record the exact same scenario again with NeoLoad, does uploading such a big file work?
I'm wondering whether you have to modify some parameters in the request when the file exceeds a specific size.
Try comparing your new recording with the old one to check for different parameters in the upload request.
The only modification I made was changing the file names to variables so that files are uploaded randomly.
When running a user validation, I get the same error if a large file is uploaded.
There is no error when I record again with a large file, but it fails during validation and runtime.
Is there any specific configuration for load generator memory?
My laptop has 16GB memory.
I do not think it's an issue with the amount of memory allocated to your load generator. If your machine has 16GB and you are using the local LG, it will use 4GB at most.
You said you were able to record a script while uploading a big file, but it's not clear whether you can replay that same script as is, handling only the necessary dynamic parameters.
The goal is not to replace the uploaded file with a variable, but to upload the same file that was used during the recording.
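Before re-introducing the variable, it may also help to check which of your candidate files actually cross the ~20 MB boundary, so you can confirm the 502s correlate with file size rather than with a particular file. A quick sketch in plain Python, run outside NeoLoad — the 20 MB limit here is the observed failure boundary, not a documented value:

```python
import os

LIMIT = 20 * 1024 * 1024  # observed failure boundary, ~20 MB

def files_over_limit(paths, limit=LIMIT):
    """Return the candidate upload files whose size exceeds the limit."""
    return [p for p in paths if os.path.getsize(p) > limit]
```

Point it at the same list of paths your variable draws from; every file it returns should be one that fails during validation.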