Crash using PyTools 2 - how to provide more info?

Mar 20, 2013 at 8:54 PM
Just got a copy of PyTools 2 today and quickly noticed that it's crashing. When I first start using VS with PyTools, everything works well. After a short while (anywhere from 1 to ~20 minutes), VS starts hanging frequently. Task Manager shows devenv.exe using one full CPU, and its memory (private working set) keeps climbing. If I wait, it just keeps growing until VS crashes.

I did look at the XML dump that VS generates, and it reported an out-of-memory error.

Any thoughts on how I can provide you more information on this? I'll keep an eye on Task Manager, but I was hoping there was some log I could check or a utility I could use.
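
In the meantime, this is roughly the kind of utility I had in mind - a quick Python sketch (assuming the third-party psutil package is installed; the script and its names are mine, not anything official) that logs devenv.exe's memory use over time so the growth is easy to see:

    import time
    import psutil

    def watch(name="devenv.exe", interval=5):
        # Poll every `interval` seconds and print the resident set size
        # of every matching process, so growth over time is easy to spot.
        while True:
            for proc in psutil.process_iter():
                try:
                    if proc.name().lower() == name:
                        rss_mb = proc.memory_info().rss / (1024.0 * 1024.0)
                        print("%s pid=%d rss=%.0f MB"
                              % (time.strftime("%H:%M:%S"), proc.pid, rss_mb))
                except (psutil.NoSuchProcess, psutil.AccessDenied):
                    pass  # process exited or is inaccessible; skip it
            time.sleep(interval)

    if __name__ == "__main__":
        watch()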


Mar 20, 2013 at 9:00 PM
It sounds like the analysis of your project is getting out of hand. This is a somewhat known/expected issue with the alpha that we'll be fixing for beta.

I suspect that you're either editing files inside the standard library or that you have a large library as a search path - can you confirm this? If not, then we may be seeing an unrelated issue and we'll have to investigate further.

The workaround for alpha is to create an actual project and not include too many files (we should be able to handle a few hundred files without issue, but thousands are pushing it, depending on your hardware). In beta we will automatically degrade the IntelliSense quality to avoid using too much memory.

Mar 21, 2013 at 1:29 AM
Zooba - I'm not modifying the standard library, nor do I have a large library in the search path - at least, not thousands of files. I did choose to generate IntelliSense for the standard library via Tools/Options, Python Tools/Interpreter Options. Could that be the cause?

As a side note, while I don't have many files, I do have a few large .py files in my project, around 6000 lines of code each. Could that affect things?

Mar 21, 2013 at 4:06 PM
Generating the database for the standard library through that dialog will work fine - it runs in a separate process and has slightly different settings to be able to handle very large projects.

It's possible that some of the code constructs in your project aren't handled well by our analyzer. Are you able/willing to share your code with us so we can use our instrumented analyzer to find out what we're not handling? You can email it to me directly if you like.

Things that may lead to excessive memory allocation include classes or functions defined within other functions, especially if they're called from a lot of different places, and list and dictionary literals. Oddly enough, x = list() works out better for us than x = [], though I wouldn't suggest changing your code to suit us - the latter is much more Pythonic.
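
To illustrate (with made-up names, not taken from your code), these are the kinds of constructs I mean:

    # Contrived sketch - hypothetical names, just to illustrate the patterns.
    def make_adder(n):
        # A function defined inside a function: the analyzer tracks this
        # closure separately for calls from different places.
        def add(x):
            return x + n
        return add

    # Many distinct call sites multiply the analysis work for `add`.
    f1 = make_adder(1)
    f2 = make_adder(2)
    f3 = make_adder(3)

    # List and dictionary literals are tracked element by element...
    values = [1, 2, 3]
    lookup = {"a": 1, "b": 2}

    # ...while the constructor form gives the analyzer less to track,
    # even though the literal form is the more Pythonic style.
    values2 = list()
    lookup2 = dict()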