Recently I was playing with the Linux command line, testing out a script I was interested in.
I ran a wget command to download a file to the server, then extracted it using unzip.
But here's the worst part: after the extraction I ran a mv command to move the files into the site's root directory.
And the command overwrote the site's index.php, .htaccess, and robots.txt.
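To show exactly how this bites, here's a minimal reconstruction in a scratch directory (all paths are made up for the demo, not the real ones from the server):

```shell
# Hypothetical reconstruction of the mistake; every path here is a demo path.
mkdir -p /tmp/demo/site /tmp/demo/extracted
echo "live index" > /tmp/demo/site/index.php
echo "new index"  > /tmp/demo/extracted/index.php

# A plain mv silently clobbers any existing file with the same name:
mv /tmp/demo/extracted/index.php /tmp/demo/site/
cat /tmp/demo/site/index.php    # prints "new index" -- the live file is gone

# Safer: -n refuses to overwrite an existing destination (GNU coreutils and BSD mv)
echo "yet another index" > /tmp/demo/extracted/index2.php
mv -n /tmp/demo/extracted/index2.php /tmp/demo/site/index.php
cat /tmp/demo/site/index.php    # still "new index" -- nothing was overwritten
```

A single `-n` (or `-i` for an interactive prompt) on that mv would have stopped the whole disaster before it started.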
I did not bother to check for a couple of days, until I figured out the site was not working: the main index file was broken, and Google and other search engines had started deindexing the site because the URLs were broken. Worst of all, robots.txt now had a Disallow line blocking everything.
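For reference, this is what a robots.txt that blocks all crawlers from the entire site looks like (I'm showing the canonical form with `Disallow: /`, not the exact contents of my broken file):

```
User-agent: *
Disallow: /
```

With that in place, compliant crawlers stop fetching every URL on the site, which is why deindexing followed so quickly.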
It was a very painful mistake, both for me and for the site.
So the lesson learnt: never try out new things or change files on a live site without prior testing, even if you're confident enough.
Such mistakes can cause irreversible damage to the site and might take days or months to discover (glad I found it in 2 days).
After this, I'm not surprised by the incident where Google recently flagged the whole web as malware.
So having a local, dev, or staging environment, even on the same goddamn server or machine, would make a HUGE difference in calamities like this.
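Even without a full staging setup, one lightweight habit would have caught this: extract the archive into a scratch directory first, and check for filename collisions against the live docroot before moving anything. A sketch (all paths are hypothetical, adjust for your own layout):

```shell
# Hypothetical paths for the demo; substitute your real docroot and scratch dir.
DOCROOT=/tmp/stage-demo/docroot
SCRATCH=/tmp/stage-demo/scratch
mkdir -p "$DOCROOT" "$SCRATCH"
echo live > "$DOCROOT/index.php"
echo new  > "$SCRATCH/index.php"
echo only > "$SCRATCH/extra.txt"

# Report every file a blind mv/cp would clobber, before touching anything:
for f in "$SCRATCH"/*; do
  name=$(basename "$f")
  if [ -e "$DOCROOT/$name" ]; then
    echo "COLLISION: $name"
  fi
done
```

Thirty seconds of reading that collision list beats ten hours of downtime.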
Daily backups were disabled for this site as it's a small one, and the weekly backup was taking ages to extract.
Only the host, Servint, saved us from this disaster: they restored the needed files from a recent backup and we were back on track.
We had about 10 hours of downtime due to this negligence. Unforgivable!