Linux Help

markjr
Posted on: Apr 8 2015, 05:10 PM


./configure
***

Group: Admin
Posts: 62
Joined: 9-February 06
Member No.: 6,054


Zoneedit is one of the oldest managed DNS providers on the internet and recently came under new management (short version: Zoneedit was acquired by easyDNS).

We've rebuilt the control panel, completely replaced every Zoneedit nameserver (bye bye to those old unpatched Bind8 disasters!) and best of all, brought back free DNS.

With advanced DNS management premium features like failover DNS and host monitoring, it's worth taking a look at the new & improved Zoneedit.

  Forum: Technical Support · Post Preview: #33849 · Replies: 0 · Views: 5,037

markjr
Posted on: Nov 1 2010, 07:12 PM




testing
  Forum: Technical Support · Post Preview: #31370 · Replies: 1 · Views: 3,327

markjr
Posted on: Oct 19 2010, 04:46 PM




As a follow-up: we had a backup of the data, but somehow the backup itself is corrupted.

The crash coincided with a hardware problem on that server, so we're still debugging exactly what happened. More as it comes in.
  Forum: General Discussion · Post Preview: #31369 · Replies: 1 · Views: 10,841

markjr
Posted on: Jul 8 2009, 10:53 AM




We're getting overrun with spambots. Until we can upgrade IPB (expected later today), we're suspending new user signups.

Please check back later and create your account then.
  Forum: General Discussion · Post Preview: #31367 · Replies: 0 · Views: 8,044

markjr
Posted on: Jul 8 2009, 10:21 AM




Sorry Anthony, I think I pruned your account as well in a sweep of a few hundred spambot signups.
  Forum: Technical Support · Post Preview: #31366 · Replies: 1 · Views: 3,379

markjr
Posted on: Jul 8 2009, 10:20 AM




Sorry Edgar, I think I pruned your account by accident; we had a couple hundred spambot signups overnight and it looks like you got caught up in the sweep.
  Forum: Technical Support · Post Preview: #31365 · Replies: 1 · Views: 3,155

markjr
Posted on: Nov 12 2008, 04:01 PM




The server we had linuxhelp on blew a hard drive, and it turned out our backups were hosed (long story: when we moved servers, we didn't follow up with backups at the new location), so our brave admins had to recover the data using Knoppix.
  Forum: General Discussion · Post Preview: #30695 · Replies: 0 · Views: 4,659

markjr
Posted on: Aug 28 2008, 09:06 AM




QUOTE (ondemandemails @ Aug 23 2008, 01:26 AM) *
desperately looking for 0ve replies.


Have you tried searching the mailing list archives for the squid-users mailing list?

http://www.squid-cache.org/Support/mailing-lists.dyn
  Forum: Technical Support · Post Preview: #30397 · Replies: 1 · Views: 2,604

markjr
Posted on: Jul 17 2008, 12:35 PM




QUOTE (dom9360 @ Jul 16 2008, 01:50 PM) *
Try to search for files called 'Spam'.

Next, nullify the file.

This is a spam file on people's email boxes. They don't empty them out. However, they periodically need to check it just in case something good goes over there.

Thanks,D



I would just cp /dev/null onto them to truncate them in place. You probably need -f to force the overwrite.

CODE
find /home/virtual -type f -name "Spam" -exec cp -f /dev/null {} \;
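A quick sanity check of that find invocation against a throwaway directory tree (the paths here are made up just for the demo):

```shell
# Build a tiny test tree with two non-empty "Spam" files
rm -rf vtest && mkdir -p vtest/u1 vtest/u2
echo junk > vtest/u1/Spam
echo junk > vtest/u2/Spam

# Truncate every file named "Spam" under the tree
find vtest -type f -name "Spam" -exec cp -f /dev/null {} \;

wc -c < vtest/u1/Spam   # prints 0: the file still exists but is empty
```

Note that this empties the files rather than deleting them, so mailbox paths stay valid.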
  Forum: Technical Support · Post Preview: #30317 · Replies: 4 · Views: 4,302

markjr
Posted on: May 24 2008, 03:37 PM




QUOTE (pondwaterboy @ May 24 2008, 11:34 AM) *
Hello.

This may seem like a stupid question and I feel stupid for asking it but how do I log onto the root user?

I'm using Ubuntu and I don't seem to be able to do anything with any programs because to do that you have be logged in as root, but when I tried logging in root on the main login page it said "System Administrator is not allowed to log in from this page"...

Yes I am n00b. Help me please!

Am I missing something that's blatantly obvious?!?!


The other thing you can do until you apply Michael's fix (above): when you're logged into a shell as a normal user, just try "su" and become the superuser that way.
  Forum: General Discussion · Post Preview: #30207 · Replies: 11 · Views: 22,200

markjr
Posted on: Mar 30 2008, 09:08 AM




You could rinse out the commas before the data ever hits awk:

sed 's/,//g' | awk .....

Or specify the input field separator in the BEGIN section of the awk script itself; the awk variable for that is FS:

BEGIN { FS=" " }

(or whatever non-comma field separates your columns)

-mark
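A minimal sketch of the sed-then-awk approach, using a hypothetical two-column countries.txt (the filename and data are just for illustration):

```shell
# Hypothetical sample data: country name, then a comma-grouped number
printf 'Canada 123,456\nFrance 1,000\n' > countries.txt

# Strip the commas first, then let awk sum the second column
sed 's/,//g' countries.txt | awk '{tot += $2} END {print "Total = " tot}'
# prints: Total = 124456
```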


QUOTE (Larry @ Mar 29 2008, 05:35 PM) *
I'm trying to add a column of numbers which contain commas in a file called countries.txt; i.e, 123,456. The numbers are in the 2nd column of the file. The general structure of the command is:

awk '(tot+=$2); END {print "Total = " tot}' countries.txt

When I execute this all I get is the sum of the numbers prior to the first comma. I know that I have to either parse the numbers or somehow get rid of the commas.

Can anyone tell me how to do this. Thanks.

Larry
  Forum: Programming in Linux · Post Preview: #30124 · Replies: 1 · Views: 5,270

markjr
Posted on: Mar 20 2008, 08:30 PM




Joey Olson, the person who founded this board back in 1997, has been promoted to Manager of Customer Support at Tucows, where he's been since 2001.

Well done Joey, and good to see you back on the board here.
  Forum: General Discussion · Post Preview: #30104 · Replies: 1 · Views: 6,778

markjr
Posted on: Jan 15 2008, 10:59 PM




From a command line I would do this:

CODE
grep \| <logfile> | cut -d\| -f2


The "cut" command gives you access to a given field (-f) separated by a delimiter (-d). Your original post said "before" the | but in the rest of the thread it seemed like you wanted everything after it, hence the -f2; if you really do want what comes before, just use -f1.

So this wouldn't happen in realtime, but maybe you could redirect to a new file and run it out of crontab or something:


CODE
grep \| <logfile> | cut -d\| -f2 > <newlog>
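A toy run of the same pipeline, with a made-up logfile name and contents:

```shell
# Hypothetical log: one line with a "|" delimiter, one without
printf 'noise | keep this\nplain line, no delimiter\n' > app.log

# Keep only the delimited lines, then take the second "|"-separated field
grep '|' app.log | cut -d'|' -f2
# prints " keep this" (the leading space after the delimiter survives)
```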
  Forum: General Discussion · Post Preview: #29822 · Replies: 3 · Views: 6,827

markjr
Posted on: Dec 18 2007, 09:15 PM




QUOTE (pneumonochrome @ Dec 18 2007, 08:55 PM) *
I have a cron job that I need to run every 10 minutes. The line in the crontab looks like this:

CODE
*/10  *  * * *  root    /path/to/script.pl >> /dev/null 2>&1


Every time it runs, it creates an entry in /var/log/messages that looks like this:

CODE
Dec 18 17:40:01 gentoo-tk-www01 cron[16120]: (root) CMD (/path/to/script.pl >> /dev/null 2>&1)


I'd like, if I can, to tell cron *not* to generate that log entry, as 144 entries in /var/log/messages a day from a script that I'll know when it stops working without having to look at the logs are just a bit much for me. I apologize if this has been covered before, my search of the forums didn't turn anything up. I'd like to leave it logging when it runs the other jobs I have set to run via cron, as they generally don't run any more frequently than once a day, just this one I would like not to log.

Any ideas on how I might approach this?

Thanks in advance.


Add cron.none to the line in /etc/syslog.conf that is logging to /var/log/messages.

I always like to set up a separate cron facility in my syslog.conf anyway, logging to something like /var/log/cron.
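For reference, a hypothetical pair of lines for a sysklogd-style config showing both halves of that suggestion (exact facility syntax varies by syslog implementation):

```
# Route all cron messages to their own log...
cron.*                          /var/log/cron
# ...and keep them out of the main log
*.info;cron.none                /var/log/messages
```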
  Forum: Technical Support · Post Preview: #29737 · Replies: 1 · Views: 2,679

markjr
Posted on: Apr 21 2007, 01:04 PM




Sounds like squid is having problems with SSL? https urls?

Did you upgrade any of your SSL libs lately?

How about a simple restart of squid? That's fixed some problems for me in the past when they just appeared out of nowhere.

QUOTE (sapheroth @ Apr 20 2007, 01:29 AM) *
hi
I'm running RH9 with squid 2.5 stable7, and about 50 users are connected through it. Everything was working fine till yesterday.
Since today, all the users are having problems with Yahoo, Hotmail and other web-based mail servers.
The error I saw in squid's log file was

when a user opens the Yahoo mail page he gets this error
"login.yahoo.com:443" and

though the Hotmail main page is accessible, when a user enters his username and password, the next page he gets is
"http://login.live.com/login.srf?"

Except for these, all the other websites and the messengers are working fine.

When I access these sites directly from the internet, they work fine.

I don't know anything about this strange behaviour. Can anyone please tell me how to get rid of this problem?
  Forum: Technical Support · Post Preview: #28631 · Replies: 1 · Views: 4,729

markjr
Posted on: Jan 30 2007, 10:13 PM




Well, I finally got this working, but I had to install CrossOver to do it.

I never could get it working natively under Linux (Xandros, Debian); I got it to the point where I'd see this:

CODE
arkjr@stuntpope:~/wikipad$ python ./WikidPad.py
Traceback (most recent call last):
  File "./WikidPad.py", line 1, in ?
    import WikidPadStarter
  File "/home/markjr/wikipad/WikidPadStarter.py", line 29, in ?
    from pwiki.PersonalWikiFrame import PersonalWikiFrame
  File "/home/markjr/wikipad/lib/pwiki/PersonalWikiFrame.py", line 12, in ?
    import Configuration
  File "lib/pwiki/Configuration.py", line 7, in ?
    from wxPython.wx import wxPlatformInfo
ImportError: cannot import name wxPlatformInfo


I don't seem to have wxPlatformInfo defined anywhere; not being a Python guy, I suspect this should have been generated, but it hasn't been.

Then I tried running it under wine and ran into some other problems.

So I tried CrossOver, and after some futzing it works. It took me a couple of tries to get it recognizing my old *.wiki files; the trick was to select the sqlite database format. (Be sure to back up your old *.wiki files before you try opening them, because it creates a new empty file if you use the wrong database type and it can't read in the old one.)
  Forum: Life after Windoze · Post Preview: #28518 · Replies: 4 · Views: 14,067

markjr
Posted on: Nov 13 2006, 09:19 PM




QUOTE (DS2K3 @ Oct 12 2006, 02:14 PM) *
Much as I loathe XP, I must admit I dislike Linspire even more.


My laptop came with Linspire pre-installed and I just could not figure out where to go next with it. As near as I could tell, it had no windowing system or GUI installed, just a stripped-down, bare-bones command set with no easy path to install the rest.

So I just wiped it and installed Xandros. Which went a lot better.
  Forum: Life after Windoze · Post Preview: #28384 · Replies: 3 · Views: 9,048

markjr
Posted on: Oct 31 2006, 07:52 PM




I use Xandros, not SuSE, but maybe it's similar: look in the lower-left or lower-right corner for the taskbar minimized to the bottom edge; if there's an arrow pointing inward, click on it.
  Forum: Technical Support · Post Preview: #28304 · Replies: 2 · Views: 3,630

markjr
Posted on: Oct 27 2006, 04:12 PM




QUOTE (mkingiii @ Oct 27 2006, 03:13 PM) *
Ok, I know what I want to do can be done using a script file. However, I don't know how to write one. Can someone please help? This doesn't seem like that difficult a request for someone who knows what they are doing. I have 400,000 files to run the commands on, so I really cannot do this manually; I need to be able to apply the commands on a large scale.


You can use things like the "for" loop in bash, or xargs (man xargs)
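As a sketch of both suggestions, here's a hypothetical batch rename over a throwaway directory (for 400,000 files, streaming names from find into xargs avoids shell argument-length limits that a plain glob can hit):

```shell
# Make a disposable directory with a few .txt files
rm -rf demo && mkdir demo
touch demo/a.txt demo/b.txt demo/c.txt

# bash "for" loop: run a command on each matching file
for f in demo/*.txt; do
    mv "$f" "${f%.txt}.bak"
done

# xargs variant: stream filenames from find, batching them safely
find demo -name '*.bak' -print0 | xargs -0 ls > /dev/null
```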
  Forum: Programming in Linux · Post Preview: #28284 · Replies: 2 · Views: 5,432

markjr
Posted on: Oct 24 2006, 07:21 PM




This is the wrong place to post looking for help in generating splogs and website scrapers, not to mention the trackback and comment spambots you're obviously intent on creating.

Since you recreated your account I deleted yesterday, you're suspended.
  Forum: Programming in Linux · Post Preview: #28263 · Replies: 2 · Views: 4,809

markjr
Posted on: Oct 23 2006, 07:12 PM




When I get into a situation like this (where I can't get an obvious sed filter going), I just break out perl, which is pretty easy to use as a command line filter.

Try this:
CODE
$ perl -lne '/values \(.(.+)\@.+.,/; $x = $1; print $x; s/something\@localhost/$x/; print' < input.txt


Where input.txt is the datafile you posted. If I understood your request correctly, you get this:

CODE
INSERT INTO alias (address, goto) values ('everyone@marblemedia.com', 'everyone')
everyone-admin
INSERT INTO alias (address, goto) values ('everyone-admin@marblemedia.com', 'everyone-admin')
everyone-bounces
INSERT INTO alias (address, goto) values ('everyone-bounces@marblemedia.com', 'everyone-bounces')
everyone-confirm
INSERT INTO alias (address, goto) values ('everyone-confirm@marblemedia.com', 'everyone-confirm')
everyone-join
INSERT INTO alias (address, goto) values ('everyone-join@marblemedia.com', 'everyone-join')


QUOTE (marbleman @ Oct 23 2006, 11:04 AM) *
a snippet of the source file
a snippet of what the output should read follows
Here is the command I have at the moment
CODE
cat testfile.txt|sed 's/yogesh/everyone/g'|sed 's/ at /@/g'|sed 's/lists.mydomain.com/newdomain.com/g'|awk '{print $1}'|sed "s/^/INSERT INTO alias (address, goto) values ('"/|sed "s/$/', 'something@localhost')/"

and here is the output
I need to replace the string 'something@localhost' with the username from the first email address. I don't know how to do this, and I've tried playing with variations on sed, awk and xargs to plug that value in. I can't seem to get awk to work inside of a pair of ` when inside of a sed 's///" statement.

Any ideas on how to get the requested output string without using a text editor (there are thousands of entries in the full file)? This should be easy, but it escapes me.

Thanks in advance for any input you can give me on evaluating regular expressions inside of sed.
-mm
  Forum: Technical Support · Post Preview: #28250 · Replies: 2 · Views: 2,780

markjr
Posted on: Sep 3 2006, 05:35 PM




QUOTE (cabe @ Aug 31 2006, 03:50 PM) *
C'mon guys! 17 views and no replies? Someone's gotta have an idea what's wrong here.


Just a shot in the dark, but do you need to specify a gateway?

That's what happened to me on my laptop; I needed to do a

CODE
route add -net default gw 192.168.2.1


whenever I booted using the wireless interface
  Forum: Technical Support · Post Preview: #27950 · Replies: 2 · Views: 2,951

markjr
Posted on: Jul 30 2006, 10:33 PM




QUOTE (TheGuyGuy @ Jul 30 2006, 10:15 PM) *
In cron, I execute:

*/1 * * * * echo "hello" > $HOME/hello

...and in e-mail I get a permission error:

/bin/sh: line 1: [my home path]/hello: Permission denied

Why?


Off the top of my head, I'm guessing $HOME isn't set in your crontab environment.

Try this:

*/1 * * * * echo "home is:$HOME"; echo "hello" > $HOME/hello

and see what shows up for $HOME in your email error report
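You can see the sparse-environment effect without waiting for cron by launching a shell with a scrubbed environment via env -i, which is roughly what a cron job experiences (the variable name here is hypothetical):

```shell
# A variable exported in your login shell...
export MYVAR=hello

# ...is invisible to a process started with a scrubbed environment,
# just as a crontab line won't see variables set in your shell profile
env -i /bin/sh -c 'echo "myvar is:${MYVAR:-unset}"'
# prints: myvar is:unset
```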
  Forum: Technical Support · Post Preview: #27607 · Replies: 2 · Views: 2,835

markjr
Posted on: Jul 27 2006, 01:03 PM




QUOTE (JoshS @ Jul 27 2006, 09:33 AM) *
The script is meant to initialize some environment variables and start a license server for an app I run often. Unfortunately I made a typo in the script and it craps out on startup. The boot process continues running a few more commands after some error messages related to this script then the screen goes blank and I can't do anything else. My setup was fine just before adding this new script so it seems that if I can somehow delete or edit this bad file all will be well again. Does anyone have any advice on getting to this file or booting in some way that might get me around this hang up long enough to correct the file?


Try booting into single-user mode and start by taking the script out.
  Forum: Technical Support · Post Preview: #27585 · Replies: 4 · Views: 3,385

markjr
Posted on: Jul 27 2006, 01:00 PM




QUOTE (Music @ Jul 27 2006, 11:06 AM) *
I'm running the 2.6.14.5 kernel using Trustix, and I noticed that none of root's cronjobs (like logrotate) were running. I restarted the fcron daemon, but that didn't seem to fix the problem. To verify this, I used the fcrondyn program, which lets you work with cronjobs dynamically. So I ran fcrondyn in debug mode and here is what I saw:


Probably not the answer you are looking for, but every time I set up a box, I do two things to cron:

1. add this to the crontab

CODE
* * * * * touch /tmp/cron.running

and

2. edit syslog.conf and have cron.* log to a separate system log

The first just lets me do a quick visual check to make sure crond is always running, and the second splits out all the cron output to its own logfile so it doesn't get buried in the other system logs.

I'm not that familiar with fcron though, sorry
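A hypothetical one-liner to go with the sentinel trick: check that the file has been touched within the last couple of minutes, so the visual check can become an automated one (the touch here just simulates the cron job firing):

```shell
# Simulate the every-minute cron job firing
touch /tmp/cron.running

# If the sentinel is fresher than 2 minutes, crond is presumably running
find /tmp/cron.running -mmin -2 | grep -q . && echo "crond alive"
# prints: crond alive
```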
  Forum: Technical Support · Post Preview: #27584 · Replies: 1 · Views: 1,990

 

Time is now: 24th November 2017 - 03:31 PM