Linux Help
> Retrieving data from a 60G text file
post Feb 9 2008, 09:08 PM
Post #1

What's this Lie-nix Thing?

Group: Members
Posts: 1
Joined: 9-February 08
Member No.: 13,318

I have a large text file, about 60 GB in size. It's an SQL dump created by mysqldump.

I need to restore data from one of the tables.

My question is, how do I get just that segment of data out of the file?

I tried a Ruby script I found by googling around, but when I ran it, it slowly ate up all of the memory and crashed the server.

I need to get everything between a line that says "CREATE TABLE `table_needed`" and "CREATE TABLE `next_table`".

I think csplit would work, but I have no idea how to use it.

Any help appreciated.

post Feb 10 2008, 09:15 AM
Post #2

It's GNU/

Group: Support Specialist
Posts: 1,807
Joined: 23-January 03
Member No.: 360

Try awk:
awk '/CREATE TABLE `table_needed`/,/CREATE TABLE `next_table`/' mysql_data_file > output.dat

Use single quotes around the program, not double quotes, or the shell will treat the backticks as command substitution. Also note that the range pattern prints through the matching `CREATE TABLE `next_table`` line, so you will need to edit the output file to remove the last line.
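A streaming alternative, assuming the same file name and table names from the question: sed can print the same range but quit at the second marker, so the trailing CREATE TABLE line never lands in the output and the rest of the 60 GB file is never read.

```shell
# Print from the first marker line onward; on reaching the second
# marker, quit immediately (with -n, q does not print the line).
# Everything after the wanted table in the dump is never scanned.
sed -n '/CREATE TABLE `table_needed`/,/CREATE TABLE `next_table`/{/CREATE TABLE `next_table`/q;p}' mysql_data_file > output.dat
```

csplit from the question would also work (given those two patterns it splits the dump into xx00, xx01, xx02, with the wanted table in xx01), but it writes the entire 60 GB back out as split files, whereas sed stops as soon as it has what you need.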