When I see that a resource is taking 0.05ms, is that really 1/20th of a millisecond? That seems so small that I’m confused about whether I’m reading it right. How many ms is too many ms? And does that number represent the execution time of all the threads combined, or just the execution time of the “main” thread?
What would be an acceptable level of RAM usage per resource? And finally, what does the “streaming” column represent?
According to the profiler (the server tool), 0.05ms means that the script is prolonging each tick by that amount of time. Resmon itself turns a script red when it goes above 1.00ms, I think (or some similarly large number) - so I guess it’s up to you to decide how much is too much.
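So yes, 0.05ms really is 1/20th of a millisecond (50 microseconds). To put it in perspective, here is the rough arithmetic as a quick sketch - the ~16.7ms frame budget (60 FPS) is just an assumed example, not something resmon reports:

```ts
// Rough budget arithmetic (the 60 FPS frame budget is only an assumed example).
const frameBudgetMs = 1000 / 60;   // ~16.67 ms available per frame at 60 FPS
const resourceCostMs = 0.05;       // the value shown in resmon / the profiler

console.log(`${resourceCostMs} ms = ${resourceCostMs * 1000} µs`);                        // 50 µs
console.log(`~${((resourceCostMs / frameBudgetMs) * 100).toFixed(2)}% of one frame`);     // ~0.30%

// 30 resources at 0.10 ms each already cost ~3 ms, i.e. roughly 18% of that frame.
const totalMs = 30 * 0.10;
console.log(`30 x 0.10 ms = ${totalMs} ms (~${((totalMs / frameBudgetMs) * 100).toFixed(0)}% of the frame)`);
```

The point is that a single 0.05ms resource is negligible, but dozens of “fine-ish” resources add up, which is why the per-tick numbers matter in aggregate.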
I just optimize scripts from the top down (highest usage first) - see the sketch below. IMHO, anything below 0.10ms is fine-ish.
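The usual first optimization is to stop doing heavy work every single tick. Here is a minimal sketch of that idea in TypeScript (FiveM’s JS runtime); `setTick` is the per-frame callback FiveM provides (declared here only so the snippet type-checks on its own), and `checkNearbyShops` is just a placeholder for whatever expensive logic the resource runs:

```ts
// Provided by the FiveM JS runtime; declared here only so the sketch stands alone.
declare function setTick(cb: () => void | Promise<void>): number;

// Placeholder for whatever expensive work the resource does (distance checks, natives, etc.).
function checkNearbyShops(): void {
  // ...
}

// Costly: runs the heavy logic every single frame, which is what pushes resmon up.
setTick(() => {
  checkNearbyShops();
});

// Cheaper: poll on a fixed interval instead; most gameplay checks don't need frame accuracy.
setInterval(() => {
  checkNearbyShops();
}, 500); // every 500 ms instead of every ~16 ms
```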
Also, the RAM doesn’t really matter much in the long run, because it’s usually just a couple of MB per resource. Even if you have 300 resources, that’s still not much. You should of course still keep memory in mind when optimizing your scripts.
The streaming column wasn’t used on any of my 3 servers, so I’m not really sure; I never looked into that one more deeply.