Full Version: Retrieving data from a 60G text file
Linuxhelp > Support > Technical Support
I have a large text file, about 60 GB in size. It's an SQL dump created by mysqldump.

I need to restore data from one of the tables.

My question is, how do I get just that segment of data out of the file?

I tried a ruby script which I found from googling around, but when I ran it, it just slowly ate up all of the memory and crashed the server.

I need to get everything between a line that says "CREATE TABLE `table_needed`" and the line that says "CREATE TABLE `next_table`".

I think csplit would work, but I have no idea how to use it.

Any help appreciated.
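On the csplit idea: GNU csplit can split the dump at every `CREATE TABLE` line, one file per table, without loading the whole dump into memory. A minimal sketch (filenames here are made up for illustration; on the real 60 GB dump this writes a full copy of the data to disk as pieces):

```shell
# Build a tiny stand-in for the mysqldump file.
cat > sample.sql <<'EOF'
-- dump header
CREATE TABLE `users` (id INT);
INSERT INTO `users` VALUES (1);
CREATE TABLE `orders` (id INT);
INSERT INTO `orders` VALUES (2);
EOF

# Split before every line matching CREATE TABLE, repeating for all
# matches ('{*}' is GNU csplit). -s = quiet, -z = drop empty pieces.
# Produces xx00 (header), xx01 (users table), xx02 (orders table).
csplit -s -z sample.sql '/CREATE TABLE/' '{*}'

# Find which piece holds the table you need.
grep -l 'CREATE TABLE `users`' xx*
```

Note that this duplicates the entire dump on disk, so awk or sed extracting just the one table is usually the lighter option.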

Try awk
awk '/CREATE TABLE `table_needed`/,/CREATE TABLE `next_table`/' mysql_data_file > output.dat

(Use single quotes around the awk program; inside double quotes the shell would treat the backticks as command substitution.) The range is inclusive, so the output ends with the `CREATE TABLE `next_table`` line; you will need to edit the file to remove that last line.
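One refinement worth mentioning: the awk range pattern keeps reading the remaining tens of gigabytes after the match ends. sed can quit as soon as it prints the end line, which matters on a 60 GB file. A hedged sketch (table names and filenames follow the question; `sample.sql` stands in for the real dump):

```shell
# Tiny stand-in for the 60 GB dump.
cat > sample.sql <<'EOF'
CREATE TABLE `other` (id INT);
INSERT INTO `other` VALUES (0);
CREATE TABLE `table_needed` (id INT);
INSERT INTO `table_needed` VALUES (1);
CREATE TABLE `next_table` (id INT);
INSERT INTO `next_table` VALUES (2);
EOF

# Print from the start pattern to the end pattern, then quit (q)
# instead of scanning the rest of the file.
sed -n '/CREATE TABLE `table_needed`/,/CREATE TABLE `next_table`/{p;/CREATE TABLE `next_table`/q;}' \
    sample.sql > table_needed.sql
```

As with the awk version, the last line of the output is the `CREATE TABLE `next_table`` line, which still has to be deleted by hand.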
Invision Power Board © 2001-2018 Invision Power Services, Inc.