Asked • 03/18/19

Large, persistent DataFrame in pandas?

I am exploring switching to Python and pandas as a long-time SAS user. However, when running some tests today, I was surprised that Python ran out of memory when trying to `pandas.read_csv()` a 128 MB CSV file. It had about 200,000 rows and 200 columns of mostly numeric data. With SAS, I can import a CSV file into a SAS dataset, and it can be as large as my hard drive. Is there something analogous in `pandas`? I regularly work with large files and do not have access to a distributed computing network.
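One common workaround (a sketch, not from this thread) is to avoid loading the whole file at once: `pandas.read_csv()` accepts a `chunksize` argument that returns an iterator of smaller DataFrames, so only one chunk is in memory at a time. The small in-memory CSV below stands in for a large file on disk:

```python
import io
import pandas as pd

# Small in-memory CSV standing in for a large on-disk file.
csv_data = "a,b\n" + "\n".join(f"{i},{i * 2}" for i in range(10))

# chunksize makes read_csv return an iterator of DataFrames,
# so at most `chunksize` rows are resident in memory at once.
total = 0
for chunk in pd.read_csv(io.StringIO(csv_data), chunksize=3):
    total += int(chunk["b"].sum())

print(total)  # 90: the sum of column b accumulated chunk by chunk
```

Passing an explicit `dtype=` mapping (e.g. `float32` instead of the default `float64` for numeric columns) can also roughly halve memory use when full precision is not needed.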

1 Expert Answer

By:

Xinlong C. answered • 04/04/26

Tutor
New to Wyzant

Yale Data Scientist | Python, SQL & ML Tutor
