Managing Memory and Load Times in R and Python

Once I know I won’t need a file again, it’s gone. (Regular back-ups with Time Machine have saved me from my own excessive zeal at least once.) Similar economy applies to runtime: My primary computing device is my laptop, and I’m often too lazy to fire up a cloud instance unless the job would take more than a day.

Working with GDELT data over the last few weeks, I've had to be a bit less conservative than usual. Habits are hard to break, though, so I found myself looking for a way to

  1. keep all the data on my hard-drive, and
  2. read it into memory quickly in R and/or Python.

The .zip files you can obtain from the GDELT site accomplish (1) but not (2). A .rda binary helps with part of (2) but has the downside of being a binary format that I might not be able to open at some indeterminate point in the future, violating (1). And a memory-hogging CSV that also loads slowly is the worst option of all.

So what satisficing solution did I reach? Saving gzipped files (.gz). Both R and Python can read these files directly (R code shown below; in Python, use gzip.open or the compression option for read_csv in pandas). The files are definitely smaller: the 1979 GDELT historical backfile compresses from 115.3MB to 14.3MB, roughly an eighth of its former size. Reading directly into R from a .gz file has been available since at least version 2.10.
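On the Python side, here is a minimal sketch of both approaches. The file and its contents are made up for illustration; real GDELT extracts are much larger and, like the R calls below, have no header row.

```python
import gzip

import pandas as pd

# Write a tiny gzipped tab-separated file to stand in for a GDELT extract.
with gzip.open("sample.tsv.gz", "wt") as f:
    f.write("19790101\t10\t3.5\n")
    f.write("19790102\t20\t-1.2\n")

# Option 1: pandas decompresses transparently, inferring gzip from the
# .gz extension (the default compression='infer' behavior of read_csv).
df = pd.read_csv("sample.tsv.gz", sep="\t", header=None)

# Option 2: open the stream yourself with gzip.open in text mode and
# hand the file object to pandas.
with gzip.open("sample.tsv.gz", "rt") as f:
    df2 = pd.read_csv(f, sep="\t", header=None)

print(df.shape)  # (2, 3)
```

Either way you get the same DataFrame; the first form is less typing, the second is handy when the file lives behind some other stream (a network response, an archive member, and so on).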

Is it faster? See for yourself:

> system.time(read.csv('1979.csv', sep='\t', header=F, flush=T, as.is=T) )
user system elapsed
48.930 1.126 50.918
> system.time(read.csv('1979.csv.gz', sep='\t', header=F, flush=T, as.is=T))
user system elapsed
23.202 0.849 24.064
> system.time(load('gd1979.rda'))
user system elapsed
5.939 0.182 7.577

Compressing and decompressing .gz files is straightforward, too. In the OS X Terminal, just type gzip filename or gunzip filename, respectively.
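If you'd rather do the same thing from inside Python (say, as part of a download script), the standard library covers it. A sketch, with throwaway filenames, and unlike command-line gzip it leaves the original file in place:

```python
import gzip
import shutil

# Create a small file to compress (illustrative stand-in for 1979.csv).
with open("data.csv", "wb") as f:
    f.write(b"a\tb\n1\t2\n")

# Compress: the programmatic equivalent of `gzip data.csv`.
with open("data.csv", "rb") as src, gzip.open("data.csv.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)

# Decompress: the programmatic equivalent of `gunzip data.csv.gz`.
with gzip.open("data.csv.gz", "rb") as src, open("restored.csv", "wb") as dst:
    shutil.copyfileobj(src, dst)
```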

Reading the gzipped file takes less than half as long as the unzipped version. It’s still nowhere near as fast as loading the rda binary, but I don’t have to worry about file readability for many years to come given the popularity of *nix operating systems. Consider using .gz files for easy memory management and quick loading in R and Python.
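An analogous benchmark is easy to run in Python. The sketch below generates its own throwaway file rather than assuming you have the GDELT backfile on disk, so the absolute times are meaningless; the point is the pattern for comparing plain and gzipped reads:

```python
import gzip
import time

import pandas as pd

# Generate a tab-separated file, plain and gzipped, standing in for
# 1979.csv / 1979.csv.gz from the post.
rows = "".join(f"{i}\t{i * 2}\n" for i in range(100_000))
with open("demo.tsv", "w") as f:
    f.write(rows)
with gzip.open("demo.tsv.gz", "wt") as f:
    f.write(rows)

def timed_read(path):
    """Read a TSV with pandas and return (DataFrame, seconds elapsed)."""
    start = time.perf_counter()
    df = pd.read_csv(path, sep="\t", header=None)
    return df, time.perf_counter() - start

plain, t_plain = timed_read("demo.tsv")
zipped, t_zip = timed_read("demo.tsv.gz")
print(f"plain: {t_plain:.3f}s, gzipped: {t_zip:.3f}s")
```

Both reads produce identical DataFrames; only the decompression cost differs.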
