
TCP half-close seems to lead to a memory leak with a bad chained proxy (with reproducer) #413

Open
Weakie opened this issue Jul 4, 2018 · 2 comments

Comments

Weakie commented Jul 4, 2018

When we send a CONNECT request to a bad upstream chained proxy (one that accepts the TCP connection but never responds), we end up with lots of half-closed TCP connections, which leads to a memory leak.

here is the test code:

    // Imports assume LittleProxy and Netty's HttpRequest; the class name is arbitrary.
    import io.netty.handler.codec.http.HttpRequest;
    import org.littleshoot.proxy.ChainedProxy;
    import org.littleshoot.proxy.ChainedProxyAdapter;
    import org.littleshoot.proxy.ChainedProxyManager;
    import org.littleshoot.proxy.HttpProxyServer;
    import org.littleshoot.proxy.impl.DefaultHttpProxyServer;

    import java.net.InetSocketAddress;
    import java.util.Queue;

    public class HalfClosedRepro {
        public static void main(String[] args) {
            HttpProxyServer server = DefaultHttpProxyServer
                    .bootstrap()
                    .withPort(8080)
                    // Route every request through an upstream proxy at 127.0.0.1:8888.
                    .withChainProxyManager(new ChainedProxyManager() {
                        @Override
                        public void lookupChainedProxies(HttpRequest httpRequest,
                                                         Queue<ChainedProxy> chainedProxies) {
                            ChainedProxy chainedProxy = new ChainedProxyAdapter() {
                                @Override
                                public InetSocketAddress getChainedProxyAddress() {
                                    return new InetSocketAddress("127.0.0.1", 8888);
                                }
                            };
                            chainedProxies.add(chainedProxy);
                        }
                    })
                    .withIdleConnectionTimeout(10)
                    .start();
        }
    }
  1. Run the code above.
  2. In a terminal, simulate a bad chained proxy (accepts connections but never responds): for i in $(seq 1 100); do nc -l 8888; done (a Java stand-in is sketched after the netstat output below).
  3. In another terminal, repeatedly send an HTTPS request: curl -x 127.0.0.1:8080 https://www.xxx.com
  4. Wait about 10 seconds and run netstat -atn | grep 8080: you will see lots of connections stuck in CLOSE_WAIT that are never recycled (LittleProxy never closes those sockets), which leads to a memory leak.

    $ netstat -atn | grep 8080
    tcp4       0      0  127.0.0.1.8080         127.0.0.1.58718        CLOSE_WAIT
    tcp4       0      0  127.0.0.1.8080         127.0.0.1.58716        CLOSE_WAIT
    tcp4       0      0  127.0.0.1.8080         127.0.0.1.58709        CLOSE_WAIT
    tcp4       0      0  127.0.0.1.8080         127.0.0.1.58695        CLOSE_WAIT
    tcp4       0      0  127.0.0.1.8080         127.0.0.1.58685        CLOSE_WAIT
    tcp4       0      0  127.0.0.1.8080         *.*                    LISTEN
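
As an alternative to the nc loop in step 2 (for example on a machine without netcat), the unresponsive upstream can be simulated in plain Java. This is only a sketch; the class name is made up and it is not part of LittleProxy:

    import java.net.ServerSocket;
    import java.net.Socket;
    import java.util.ArrayList;
    import java.util.List;

    // Accepts connections on 8888 and never writes anything back, like nc -l 8888.
    public class SilentUpstream {
        public static void main(String[] args) throws Exception {
            List<Socket> accepted = new ArrayList<>();
            try (ServerSocket listener = new ServerSocket(8888)) {
                while (true) {
                    // Keep each accepted socket open without ever responding or closing.
                    accepted.add(listener.accept());
                }
            }
        }
    }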

I have investigated the code; here is my analysis:
1. ConnectionFlow.start() calls ClientToProxyConnection.serverConnectionFlowStarted(), which disables reading on the client channel (autoRead is set to false).
2. Because the chained proxy never responds, ProxyToServerConnection.timedOut() fires, closes the proxy-to-server connection, and calls ClientToProxyConnection.timedOut(ProxyToServerConnection serverConnection), which sends a GATEWAY_TIMEOUT response to the client.
3. The client then closes its side of the connection, but because autoRead was disabled in step 1, the proxy never reads from the client channel and never sees the close. Meanwhile, the if condition in ClientToProxyConnection.timedOut(), quoted below, always evaluates to false, so the client channel is never closed (a Netty-level sketch of this mechanism follows after the snippet).

    @Override
    protected void timedOut() {
        // idle timeout fired on the client channel. if we aren't waiting on a response from a server, hang up
        if (currentServerConnection == null || this.lastReadTime <= currentServerConnection.lastReadTime) {
            super.timedOut();
        }
    }
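
For illustration, here is a minimal Netty-level sketch of the mechanism described in steps 1-3. It is not LittleProxy code; it only shows that once autoRead is disabled and the channel is neither read from nor closed after the 504 is written, the client's FIN is never observed and the socket stays in CLOSE_WAIT:

    import io.netty.bootstrap.ServerBootstrap;
    import io.netty.buffer.Unpooled;
    import io.netty.channel.ChannelInitializer;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.SocketChannel;
    import io.netty.channel.socket.nio.NioServerSocketChannel;
    import io.netty.util.CharsetUtil;

    public class HalfCloseSketch {
        public static void main(String[] args) throws InterruptedException {
            NioEventLoopGroup group = new NioEventLoopGroup();
            ServerBootstrap bootstrap = new ServerBootstrap()
                    .group(group)
                    .channel(NioServerSocketChannel.class)
                    .childHandler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel ch) {
                            // Step 1: stop reading from the client, like serverConnectionFlowStarted().
                            ch.config().setAutoRead(false);
                            // Step 2: answer with a 504, like the GATEWAY_TIMEOUT sent on timeout.
                            ch.writeAndFlush(Unpooled.copiedBuffer(
                                    "HTTP/1.1 504 Gateway Timeout\r\n\r\n", CharsetUtil.US_ASCII));
                            // Step 3: the client closes its side, but with autoRead off the FIN is
                            // never read and this channel is never closed, so it sits in CLOSE_WAIT.
                        }
                    });
            // Any free port works; this sketch is standalone.
            bootstrap.bind(9090).sync().channel().closeFuture().sync();
        }
    }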

Could you please confirm this analysis? Thanks in advance!

@athulyaraj

We are also facing this issue in our service, which uses LittleProxy.

@arno-pons

Hi,

Same problem here.
The application does not close the connections, and CLOSE_WAIT TCP connections accumulate on the server.

Is there any chance of getting a fix?
I'm not sure how to force a timeout on the ClientToProxyConnection side (a generic sketch of that mechanism is included below).

Thank you in advance for any feedback or ideas.
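
For what it's worth, the generic Netty pattern for force-closing an idle connection looks like the sketch below. This is not a LittleProxy API and the handler name is made up; it only illustrates the kind of unconditional close the client side would need, whereas the ClientToProxyConnection.timedOut() quoted above skips the close while it thinks it is still waiting on a server response:

    import io.netty.channel.ChannelHandlerContext;
    import io.netty.channel.ChannelInboundHandlerAdapter;
    import io.netty.handler.timeout.IdleStateEvent;
    import io.netty.handler.timeout.IdleStateHandler;

    public class ForceCloseOnIdle extends ChannelInboundHandlerAdapter {
        @Override
        public void userEventTriggered(ChannelHandlerContext ctx, Object evt) throws Exception {
            if (evt instanceof IdleStateEvent) {
                // Close unconditionally once the channel has been idle for the configured time.
                ctx.close();
            } else {
                super.userEventTriggered(ctx, evt);
            }
        }
        // Would be installed next to an IdleStateHandler, e.g.:
        //   pipeline.addLast(new IdleStateHandler(0, 0, 10), new ForceCloseOnIdle());
    }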
