

Line-by-line memory usage of a Python program - vgnet
http://fseoane.net/blog/2012/line-by-line-report-of-memory-usage/

======
lclarkmichalek
Pretty sure forking a process every line is not the most efficient way to do
this :) But nevertheless, very cool

~~~
lrem
There are more efficient ways, but they are platform specific. Using ps is
easier, as it's standard.
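For reference, a minimal sketch of the ps approach (the flags are the standard POSIX ones; the blog's actual code may differ):

```python
import os
import subprocess

def rss_via_ps(pid=None):
    """Resident set size in KiB, obtained by shelling out to ps.
    Portable across most Unixes, but forks a child process per call."""
    if pid is None:
        pid = os.getpid()
    # "ps -o rss= -p PID" prints just the RSS column, no header
    out = subprocess.check_output(["ps", "-o", "rss=", "-p", str(pid)])
    return int(out.decode().strip())

print(rss_via_ps())
```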

~~~
benhoyt
Not on my machine. :-)

    
    
       C:\>ps
       'ps' is not recognized as an internal or external command,
       operable program or batch file.
    

The guy who suggested using psutil is onto something -- it's a 3rd-party
dependency, but it rolls all the platform-specific messiness into a
library. And it's much faster and less fragile than forking a new process.
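A sketch of the psutil route for comparison (psutil is the third-party package mentioned above; it must be installed separately):

```python
import psutil  # third-party: pip install psutil

def rss_via_psutil():
    """Cross-platform RSS in bytes, with no child process forked."""
    # Process() with no argument refers to the current process
    return psutil.Process().memory_info().rss

print(rss_via_psutil())
```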

------
beambot
Cool. I didn't know that you could get a representation of a function's code
through func_name.func_code. That's pretty slick!
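A small illustration of what lives on that attribute -- `func_code` is the Python 2 name; Python 3 renamed it to `__code__`:

```python
def greet(name):
    return "hello, " + name

# The compiled code object behind the function (func.func_code in
# Python 2, func.__code__ in Python 3)
code = greet.__code__

print(code.co_name)      # 'greet'
print(code.co_varnames)  # ('name',)
print(code.co_argcount)  # 1
```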

~~~
icebraining
Yeap. I find it great for prototyping: I code in the IPython shell, and then I
dump the code of the various functions I wrote to a file, using func_code.

    
    
        def makedumper(filen):
            def dumpfunc(func):
                with open(filen, 'a') as f:
                    f.write("\n\n{}".format(func.func_code))
            return dumpfunc

        # then to use it:
        dump = makedumper("sourcefile")
        dump(func1)
        dump(func2)

------
a235
Really neat solution, useful for straightforward things. However, it's quite
far from telling us the full truth. Python won't release memory instantly,
and some freed memory stays reserved for reuse by the interpreter without
ever being returned to the OS (int free lists?)
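One CPython-specific case of memory that is kept around rather than released, sketched below (this is an implementation detail of CPython, not guaranteed by the language):

```python
# CPython preallocates and shares the small integers -5..256, so
# "freeing" one of them never actually releases any memory.
a = 256
b = int("256")
print(a is b)  # True in CPython: both names point at the cached object

# Larger ints constructed at runtime get fresh objects, which are
# freed back to the allocator -- but not necessarily to the OS.
x = int("1000000000")
y = int("1000000000")
print(x is y)
```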

------
andreasvc
Another Python memory profiler: <http://guppy-pe.sourceforge.net/#Heapy>

