Multiple file string replacer
Hello
I have a partially working offline copy of a website. However, I downloaded some of the files by hand, and the HTML files point at the server instead of using relative paths.

What I want to do is find every file containing the string ftp://server.com and replace that string with ../../.. so the links point at the files on my hard drive. Then I want to check that all the links work.
So far I have been partially successful: I found linkchecker, which seems to be the perfect tool for the checking part, and I found scripts to find and replace strings.
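For the checking part, linkchecker can apparently be pointed straight at a local file once the paths are fixed (a minimal sketch, assuming the stock linkchecker command and index.html as the entry page; exact flags may vary by version):

CODE
# hedged sketch: verify all links reachable from the local entry page
linkchecker index.html

Here is one of the find-and-replace scripts I found: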

CODE
cd deploy
for y in *
do
  # write a substituted copy, then keep it only if it differs
  sed 's_OLD.VALUE_NEW.VALUE_' "$y" >temp
  if cmp temp "$y" >/dev/null
  then
    rm temp          # unchanged: throw the copy away
  else
    mv temp "$y"     # changed: replace the original
  fi
done


This did not work (cmp complains: "~/foo is a directory") because the website copy is full of folders with HTML files inside them, and the * glob matches those folders too.
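One way around that, since the replacement has to recurse into the subfolders anyway, would be to let find pick out only the regular files (a sketch along the same compare-and-swap lines; assumes a find with -type f, which even busybox provides):

CODE
# hedged sketch: find feeds only regular files to the loop,
# so cmp never sees a directory and subfolders are covered too
find deploy -type f | while read -r y
do
  sed 's_OLD.VALUE_NEW.VALUE_' "$y" >temp
  if cmp temp "$y" >/dev/null
  then
    rm temp
  else
    mv temp "$y"
  fi
done

Another approach I tried was this one-liner: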

CODE
grep -ilr 'old-word' * | xargs -i@ sed -i 's/old-word/new-word/g' @


This did not work because I use DSL (Damn Small Linux), and the xargs that ships with it apparently does not know the -i option. Neither -i nor -I is recognized as a valid option.
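If xargs has no -i/-I, the pipe can feed a plain while read loop instead, which needs nothing beyond the shell (a sketch; it assumes the sed on DSL supports -i for in-place editing, which a busybox sed may not):

CODE
# hedged sketch: no xargs needed; grep lists the matching files,
# the loop runs sed -i on each one
grep -ilr 'old-word' . | while read -r f
do
  sed -i 's/old-word/new-word/g' "$f"
done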

So, my question is:
Can I put the grep inside the "for ... do" loop?
Will this work?
CODE
for y in grep -ilr 'ftp://server.com'
do
  sed 's_ftp:////server.com_..//..//.._' "$y" >temp
  if cmp temp "$y" >/dev/null
  then
    rm temp
  else
    mv temp "$y"
  fi
done
Hello
Found the (ugly) solution: the grep has to sit inside command substitution, $( ), otherwise the for loop just iterates over the literal words "grep", "-ilr" and the pattern.


CODE
#! /bin/bash
cd directory || exit 1      # stop here if the target directory is missing
# loop over every file that still contains the old server URL
for file in $(grep -ilr 'ftp://foo.net' *)
do
  # rewrite the URL into a relative path, keep the copy only if it differs
  sed 's_ftp://foo.net_../../.._g' "$file" >temp
  if cmp temp "$file" >/dev/null
  then
    rm temp
  else
    mv temp "$file"
  fi
done
exit 0
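One caveat: $(grep ...) word-splits its output, so the loop breaks on file names containing spaces. If the grep at hand supports -Z (GNU grep does) and xargs supports -0, the whole thing collapses into a null-delimited one-liner (a sketch; neither option may exist on DSL):

CODE
# hedged sketch: null-delimited file list, immune to spaces in names
grep -ilrZ 'ftp://foo.net' . | xargs -0 sed -i 's_ftp://foo.net_../../.._g'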