Retrieving data from a 60G text file
blackout
Feb 9 2008, 09:08 PM
Post #1



I have a large text file which is about 60 GB in size. It's an SQL dump created by mysqldump.

I need to restore data from one of the tables.

My question is, how do I get just that segment of data out of the file?

I tried a Ruby script that I found by googling around, but when I ran it, it just slowly ate up all of the memory and crashed the server.

I need to get everything between a line that says "CREATE TABLE `table_needed`" and the line that says "CREATE TABLE `next_table`".

I think csplit would work, but I have no idea how to use it.
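
(If csplit is the route you want, a minimal sketch, assuming GNU csplit and a dump named dump.sql: it splits the file at every CREATE TABLE line into pieces named xx00, xx01, and so on, which you can then grep for the table you need. Bear in mind it writes a full second copy of the 60G dump to disk.)

csplit -s -z dump.sql '/^CREATE TABLE/' '{*}'
grep -l 'CREATE TABLE `table_needed`' xx*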

Any help appreciated.

Cheers.
michaelk
Feb 10 2008, 09:15 AM
Post #2



Try awk:
awk '/CREATE TABLE `table_needed`/,/CREATE TABLE `next_table`/' mysql_data_file > output.dat

(Note the single quotes: with double quotes the shell would treat the backticks as command substitution.)

You will need to edit the output file to remove the last line (an awk range pattern prints the closing CREATE TABLE line too).
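
If editing a file that size is a pain, a variant along these lines (a sketch, under the same assumptions about the dump's layout) stops before printing the closing CREATE TABLE line and exits early, so awk doesn't have to scan the rest of the 60G:

awk '/CREATE TABLE `next_table`/{exit} /CREATE TABLE `table_needed`/{p=1} p' mysql_data_file > output.dat

Here p is just a flag that turns printing on at the first match; the exit rule fires on the closing line before it gets printed.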
