Post #1
Whats this Lie-nix Thing?
Group: Members | Posts: 2 | Joined: 4-December 10 | Member No.: 14,804
I have a partially working offline copy of a website. However, I downloaded some of the files by hand, and the HTML files point to the server instead of using relative paths.
What I want to do is find every file containing the string ftp://server.com and replace it with ../../.. so the links point to the files on my hard drive. Then I want to check that all the links work. So far, I have been partially successful. I found linkchecker, which seems to be the perfect tool for what I want. I also found scripts to find and replace strings.

CODE
cd deploy
for y in *
do
  sed 's_OLD.VALUE_NEW.VALUE_' "$y" >temp
  if cmp temp "$y" >/dev/null
  then
    rm temp
  else
    mv temp "$y"
  fi
done

This did not work (cmp: ~/foo is a directory!) because the website copy is full of folders with HTML files in them.

CODE
grep -ilr 'old-word' * | xargs -i@ sed -i 's/old-word/new-word/g' @

This did not work either, because I use DSL and apparently the xargs shipped with it does not know the -i option. Neither -i nor -I is recognised as a valid option.

So, my question is: can I put grep in the "for do" loop? Will this work?

CODE
for y in grep -ilr 'ftp://server.com'
do
  sed 's_ftp://server.com_../../.._' "$y" >temp
  if cmp temp "$y" >/dev/null
  then
    rm temp
  else
    mv temp "$y"
  fi
done
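A minimal sketch of one way to sidestep both problems (cmp complaining about directories, and the DSL xargs not accepting -i): let grep -rl pick out the matching files and loop over its output with read. The directory deploy and the URL ftp://server.com are taken from the post; the assumption that the stock grep, sed and cmp behave as on GNU or BusyBox systems is mine.

CODE
#!/bin/sh
# Sketch: rewrite absolute ftp:// links to relative paths, recursing
# through subdirectories without xargs.
cd deploy || exit 1
grep -ril 'ftp://server\.com' . | while read -r file
do
  # grep -l should only list regular files, but skip anything else
  [ -f "$file" ] || continue
  sed 's_ftp://server\.com_../../.._g' "$file" > temp
  if cmp -s temp "$file"
  then
    rm temp           # no change, discard the copy
  else
    mv temp "$file"   # replace the original with the edited copy
  fi
done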
Post #2
Whats this Lie-nix Thing?
Group: Members | Posts: 2 | Joined: 4-December 10 | Member No.: 14,804
Found the (ugly) solution:
CODE
#!/bin/bash
cd directory
for file in $(grep -ilr 'ftp://foo.net' *)
do
  sed 's_ftp://foo.net_../../.._g' "$file" >temp
  if cmp temp "$file" >/dev/null
  then
    rm temp
  else
    mv temp "$file"
  fi
done
exit 0
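One caveat with this loop: $(grep ...) splits its output on whitespace, so filenames containing spaces would break it. A minimal alternative sketch that avoids the word splitting, assuming the installed sed supports -i (in-place editing); directory and ftp://foo.net are taken from the script above:

CODE
#!/bin/sh
# Sketch: let find select regular files, so subdirectories and spaces
# in names are handled without a for-loop or xargs.
# Note: this runs sed over every regular file, not only the matching
# ones; the grep -l filter above limits the edit to files that match.
cd directory || exit 1
find . -type f -exec sed -i 's_ftp://foo\.net_../../.._g' {} \;
exit 0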