
Extremely high memory usage causing UI lagginess and subsequent crash.

description

Using Visual Studio 2010 and the latest dev build of PTVS 2.1, after about 30 mins to an hour of working the UI starts getting really sluggish, and I notice that the memory usage has gone up near the 3GB mark. A few minutes after this starts happening, I will inevitably crash due to a System.OutOfMemoryException.

My setup is slightly unusual, since I'm using PTVS to integrate with Maya 2013. I couldn't for the life of me get it to find the stdlib inside python26.zip while still having access to the site-packages located elsewhere in the Maya directory. Instead, I took a clean install of Python 2.6.4, dropped the special "completion" versions of the maya and pymel modules (these versions have all the Maya Python API DLL functions as stubs with documentation, but they don't actually do anything) and some other Maya Python modules into its Lib/site-packages folder, and pointed the Library Path at this directory.
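For anyone unfamiliar with the approach, a "completion" stub module of the kind described above is just ordinary Python whose functions keep the real signatures and docstrings but have empty bodies, so IntelliSense can surface them. This is a minimal sketch; the function names and parameters here are hypothetical, not Maya's actual API:

```python
# maya_stub_example.py -- hypothetical sketch of a "completion" stub module.
# Each function mirrors a real API call: the signature and docstring are
# kept so IntelliSense can show them, but the body does nothing.

def polyCube(name=None, width=1.0, height=1.0, depth=1.0):
    """Create a polygonal cube.

    Returns the transform and shape node names of the new cube.
    """
    pass  # stub only: the real work happens inside Maya's interpreter


def ls(selection=False, type=None):
    """List objects in the scene, optionally filtered by type."""
    pass  # stub only
```

At edit time the analyzer reads the stubs for completions and docs; at run time the scripts import the real modules inside Maya's interpreter instead.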

With this setup, I do get Intellisense while writing code, and I can debug scripts running inside Maya's integrated interpreter without any problems whatsoever! Other than the extreme memory usage leading to a crash, everything's working perfectly.

I've attached a copy of the diagnostic info output. If you guys think it may be caused by some specific file that PTVS is choking on, next time it starts getting sluggish I'll attach a debugger to devenv.exe and see if I can't figure out exactly what that file may be. Otherwise, if you guys have other ideas I'm all ears! I'm really enjoying using PTVS to edit Maya scripts and such, and I'd like to get the rest of the company set up to be able to do so as well.

Thanks!
  • Dylan

file attachments

comments

Zooba wrote Aug 6 at 9:32 PM

Thanks for the report. We've heard a few reports like this, but haven't been able to track down what the specific problem is.

I've had a look through the diagnostic info and the crashes look similar to having an incorrect search path or too many files in your project (typically this has to be in the tens of thousands), but it doesn't appear you have that problem.

It sounds like a good setup for supporting Maya - unfortunately, we still haven't implemented reading ZIP files of Python code. Another alternative you could try is turning their python26.zip file into a python26.zip directory containing the contents of the file (it requires a bit of a rename dance, but Python doesn't care whether it's a directory or a ZIP file).

While you're using it, would you be able to keep an eye on memory usage, especially how it relates to editing? We've seen some issues that leak memory every time you edit, and they largely come down to "interesting" interactions between separate pieces of code. If memory usage jumps up every time you make an edit - and never seems to come down - this may be the problem. In that case, we'd love to be able to look at your code, or we can provide some more things to try if that isn't possible.

dbarrie wrote Aug 7 at 1:31 AM

Unfortunately, I can't give out my company's Maya scripts (not only are there thousands, but it's all heavily NDAed), so it won't be easy for you to be able to reproduce my development environment locally.

I'm actually somewhat familiar with VS's internals, so if I can get a pdb for the version I have installed (I'd rather not have to build it all from scratch, if possible), and if you have any idea WHERE in the code base the problems likely are, I wouldn't mind running VS in a debugger and taking a look.

Zooba wrote Aug 7 at 2:59 AM

Yeah, I suspected that would be the case. Unfortunately, the problem is not easily debuggable, and you would be better off trying to isolate the problem files.

To give an example, we saw a serious memory issue with the nltk library (when it's on a search path or added directly to the project - site-packages is protected from these somewhat). Eventually, I narrowed the problem down to a mutual recursion relationship between generators and instance variables in two separate modules, but I still haven't been able to figure out exactly why the analyzer doesn't handle it well (though I have been able to mitigate it somewhat by reducing the size of the leak). It's likely that there's some interaction between two or more modules in your code as well if memory only increases as you edit.
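The shape of the nltk problem can be sketched in a single file, with two classes standing in for the two modules - this is purely illustrative of the reference cycle, not nltk's actual code:

```python
# Illustrative sketch: two "modules" (here, two classes) whose generators
# and instance variables reference each other. At run time this is cheap,
# but a static analyzer chasing types around the cycle can keep producing
# new states on every re-analysis.

class ModuleA:
    def __init__(self):
        self.partner = None         # will point at a ModuleB instance

    def walk(self):
        # generator that consumes the partner module's generator
        for x in self.partner.walk():
            yield x * 2


class ModuleB:
    def __init__(self):
        self.partner = None         # will point back at the ModuleA instance
        self.items = [10, 20]

    def walk(self):
        for x in self.items:
            yield x


a = ModuleA()
b = ModuleB()
a.partner = b
b.partner = a                       # the cycle between the two "modules"

print(list(a.walk()))               # -> [20, 40]
```

In the real case the cycle spans two separate files, which is what makes it hard to spot by inspection and hard for the analyzer to converge on.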

If you want an easier way to test this, there's a command-line tool in our project called AnalysisMemoryTester that you should be able to build on its own (if PTVS is installed). It lets you script the analyzer outside of VS and take process dumps, which are a good proxy for memory usage even if you don't open them in VS (VS 2013 Update 3 is especially good for this).

All our symbols should be on the Microsoft symbol server and can be downloaded automatically, but if that's not working I can put them up as a zip file. I think you'll have more luck looking at memory usage, though; there's not much to discover with the debugger.

Happy to help more if you have any questions.

Zooba wrote Aug 7 at 3:14 AM

Here's a short example script for the tester to save you having to reverse engineer how it works:
# specify language version
python 2.6

# path to a completion DB - relative to %LocalAppData%
# this value is correct for your Maya environment
db Python Tools\CompletionDB\10.0\ae7b8b20-37d4-4d13-9af7-a7323c156e1d\2.6

#module <module name> <filename> loads the module into memory
module setup setup.py
module nltk nltk\__init__.py
module nltk.align nltk\align.py
module nltk.util nltk\util.py

# very simple repeat <integer> for looping
repeat 10

print Running analysis
# enqueue <module name> (or * for all known modules) enqueues the module
# into the analyzer. This is roughly equivalent to modifying the file in the editor
# and waiting for analysis to restart
enqueue *

# analyze runs the analysis and waits
analyze

end # of the repeat above

# full garbage collection
gc

# full process dump
dump dump0.dmp

Of course, you can also modify the tester however you like; it's relatively simple.