This is a follow-on of sorts to bug [#7356](https://github.com/mono/mono/issues/7356). The test is essentially the same, except that the requests go through a WebProxy.
## Steps to Reproduce
1. Use Ubuntu 16.04. Install mono-devel, mono-utils, mono-profiler, and whatever else is necessary to have the compiler and profiler. I doubt this is specific to the Linux version, however. You will also need a proxy server through which to make the web requests. Replace **yourproxyip** and **yourproxyport** with the proper values for your proxy server.
2. The program below simply loops over a set of popular websites and gets their homepages through a proxy server, using HttpWebRequest and HttpWebResponse. Compile it (a sample compile command follows) and run it under the profiler, where heapshot=50gc asks the log profiler to take a heap snapshot every 50 garbage collections, e.g.
```
mono --gc=sgen --profile=log:heapshot=50gc TestMemGrow.exe
```
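For the compile step, assuming the source below is saved as TestMemGrow.cs, Mono's C# compiler will produce TestMemGrow.exe by default:
```
mcs TestMemGrow.cs
```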
3. Let it run for several hours (3-4, or more). After that, to view the profiler output, run:
```
mprof-report output.mlpd > out.txt
grep System.Byte out.txt
grep -F "System.String" out.txt | grep -Fv "System.String[]"
```
You should see lists of allocations of System.Byte[] and System.String that grow steadily larger. You can leave the program running and re-run those commands to watch the continued growth of memory usage as the profiler adds more heap shots to its data file. In my testing the heap shots came about 10 minutes apart.
I left in the GC calls from my prior bug, and this example writes the web page data to a file. It also writes exceptions to a file, exceptions.txt, because it is possible (I really don't know) that the exception handling is what leaks memory, so that information might be useful.
To execute this code you will need a proxy server through which you make the web requests. Fill in your proxy server's IP and port in the appropriate place in the code.
**CODE:**
```C#
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Runtime;

namespace TestMemGrow
{
    class Program
    {
        static void Main()
        {
            List<string> websites = new List<string>() { "http://www.amazon.com", "http://www.google.com", "http://www.yahoo.com", "http://www.ebay.com", "http://www.overstock.com",
                "http://www.marketwatch.com", "http://www.bn.com", "http://www.newegg.com", "http://www.wsj.com", "http://www.arstechnica.com", "http://www.slashdot.org",
                "http://www.mediaite.com", "http://www.disqus.com", "http://www.twitter.com", "http://www.snap.com", "http://www.facebook.com", "http://www.usps.com",
                "http://www.ups.com", "http://www.techcrunch.com", "http://www.oracle.com", "http://www.java.com", "http://www.apple.com", "http://www.microsoft.com",
                "http://www.ibm.com", "http://www.dell.com", "http://www.asus.com", "http://www.gigabyte.com", "http://www.intel.com", "http://www.crucial.com",
                "http://www.westerndigital.com", "http://www.samsung.com", "http://www.sandisk.com", "http://www.brother.com", "http://www.hp.com",
                "http://www.msn.com", "http://www.disney.com", "http://www.nintendo.com", "http://www.twitter.com", "http://www.youtube.com",
                "http://www.instagram.com", "http://www.linkedin.com", "http://www.wordpress.org", "http://www.pinterest.com", "http://www.wikipedia.org",
                "http://www.blogspot.com", "http://www.adobe.com", "http://www.tumblr.com", "http://www.vimeo.com", "http://www.flickr.com", "http://www.godaddy.com",
                "http://www.buydomains.com", "http://www.reddit.com", "http://www.w3.org", "http://www.nytimes.com", "http://www.statcounter.com",
                "http://www.weebly.com", "http://www.blogger.com", "http://www.github.com", "http://www.jimdo.com", "http://www.myspace.com",
                "http://www.mozilla.org", "http://www.gravatar.com", "http://www.theguardian.com", "http://www.bluehost.com", "http://www.cnn.com", "http://www.foxnews.com",
                "http://www.msnbc.com", "http://www.wix.com", "http://www.paypal.com", "http://www.stumbleupon.com", "http://www.digg.com", "http://www.huffingtonpost.com",
                "http://www.feedburner.com", "http://www.imdb.com", "http://www.yelp.com", "http://www.dropbox.com", "http://www.baidu.com", "http://www.washingtonpost.com",
                "http://www.slideshare.net", "http://www.etsy.com", "http://www.telegraph.co.uk", "http://www.about.com", "http://www.bing.com", "http://www.latimes.com",
                "http://www.tripadvisor.com", "http://www.opera.com", "http://www.live.com", "http://www.wired.com", "http://www.bandcamp.com" };

            StreamWriter sw = new StreamWriter("exceptions.txt");
            while (true)
            {
                foreach (string website in websites)
                {
                    try
                    {
                        // Force a full collection (with LOH compaction) before each
                        // request; carried over from the prior bug report.
                        GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
                        GC.Collect();

                        Console.WriteLine(DateTime.Now.ToString("HH:mm:ss") + " Getting " + website);
                        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(website);

                        // Replace with your proxy server's IP and port.
                        WebProxy myProxy = new WebProxy("yourproxyip", yourproxyport);
                        request.Proxy = myProxy;

                        HttpWebResponse response = request.GetResponse() as HttpWebResponse;
                        StreamReader reader = new StreamReader(response.GetResponseStream());
                        File.WriteAllText("florb.txt", reader.ReadToEnd());
                        reader.Close();
                        response.Close();
                    }
                    catch (Exception ex)
                    {
                        // Log the full exception to exceptions.txt; echo only the
                        // first 150 characters to the console.
                        string exString = ex.ToString();
                        exString = exString.Substring(0, Math.Min(150, exString.Length));
                        Console.WriteLine("Caught exception . . . " + exString);
                        sw.WriteLine("EXCEPTION\r\n" + ex.ToString() + "\r\n");
                        sw.Flush();
                    }
                }
            }
        }
    }
}
```
### Current Behavior
Memory allocations of System.Byte[] and System.String increase apparently without limit.
### Expected Behavior
The garbage collector should reclaim these allocations.
## On which platforms did you notice this
[ ] macOS
[X] Linux
[ ] Windows
**Version Used**:
```
Mono JIT compiler version 5.12.0.226 (tarball Thu May 3 09:48:32 UTC 2018)
Copyright (C) 2002-2014 Novell, Inc, Xamarin Inc and Contributors. www.mono-project.com
TLS: __thread
SIGSEGV: altstack
Notifications: epoll
Architecture: amd64
Disabled: none
Misc: softdebug
Interpreter: yes
LLVM: supported, not enabled.
GC: sgen (concurrent by default)
```
## Other Matters
I am not at all certain whether this leak is caused by the use of a WebProxy itself, or by the increased number of exceptions (particularly WebExceptions) that occur when using a WebProxy to make the requests. A variant that might help separate the two is sketched below.
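Purely as a diagnostic idea (a sketch, not something I have verified), the request loop could reuse a single WebProxy and dispose the response and reader deterministically via `using` blocks. If the System.Byte[] and System.String growth persists with this version, the exception-handling path would look less likely to be the cause. The proxy placeholders are the same ones used in the original program:
```C#
// Sketch only: same fetch as the test program, but with one shared
// WebProxy and deterministic disposal. "yourproxyip"/yourproxyport are
// placeholders, exactly as in the original code.
using System;
using System.IO;
using System.Net;

static class ProxyFetchVariant
{
    static readonly WebProxy SharedProxy = new WebProxy("yourproxyip", yourproxyport);

    public static void Fetch(string website)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(website);
        request.Proxy = SharedProxy;

        // using disposes the response and reader even when an exception
        // is thrown, unlike the Close() calls in the original loop.
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            File.WriteAllText("florb.txt", reader.ReadToEnd());
        }
    }
}
```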