For instance, I connected 30,000 dummy clients to my server, and memory
consumption went up to roughly 50% of my available memory. Then I
disconnected them all, which destroyed a lot of objects and should
therefore have freed a lot of memory. But according to RSS, my application
still claims 50% of my RAM. If I now connect another 30,000 clients, the
RSS changes only marginally. I cannot understand why my application never
releases any of the claimed memory back to the OS.
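Here is a minimal sketch of what I observe (my assumptions: Linux, so RSS can be read from /proc/self/status, and a glibc-style malloc that keeps freed blocks in its own free lists instead of returning the pages to the OS; the 4 KB per "client" is just a placeholder):

```cpp
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Read the resident set size of the current process in kB (Linux).
static long rss_kb() {
    std::ifstream status("/proc/self/status");
    std::string line;
    while (std::getline(status, line))
        if (line.rfind("VmRSS:", 0) == 0)
            return std::stol(line.substr(6));  // skips tabs, ignores trailing " kB"
    return -1;
}

int main() {
    std::cout << "start:        " << rss_kb() << " kB\n";

    // Simulate 30,000 "clients", each owning some per-connection state.
    std::vector<std::vector<char>> clients;
    clients.reserve(30000);
    for (int i = 0; i < 30000; ++i)
        clients.emplace_back(4096, 'x');  // ~120 MB in total
    std::cout << "connected:    " << rss_kb() << " kB\n";

    // "Disconnect" them: every destructor runs and the memory goes back
    // to the allocator -- but, depending on fragmentation and the trim
    // threshold, usually not back to the OS, so RSS barely drops.
    clients.clear();
    clients.shrink_to_fit();
    std::cout << "disconnected: " << rss_kb() << " kB\n";

    // On glibc one could #include <malloc.h> and call malloc_trim(0) here
    // to explicitly ask the allocator to return trimmed pages to the OS.
}
```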
It feels like there is some kind of capacity inside the application (a
map-like container, for instance) that can only grow and never shrink.
Since its capacity never decreases, the claimed memory is never released.
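If that suspicion is right, the effect would look like this minimal sketch (std::vector and the element count are just placeholders for whatever container my server actually uses): clear() destroys the elements but keeps the buffer, so the memory is never even handed back to the allocator, let alone the OS.

```cpp
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v(1000000);
    std::cout << "capacity after fill:          " << v.capacity() << '\n';

    v.clear();  // destroys all elements but keeps the allocated buffer
    std::cout << "capacity after clear():       " << v.capacity() << '\n';  // unchanged

    v.shrink_to_fit();  // non-binding request to actually release the buffer
    std::cout << "capacity after shrink_to_fit: " << v.capacity() << '\n';
}
```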