I have a directory on this site which is linked from nowhere, and I know that for an absolute fact. There is no way it has been indexed. I could put its name into robots.txt, but then you could look there and see exactly what I don’t want the search engines to see. For the curious, here is my robots.txt:
User-agent: HenryTheMiragoRobot
Disallow: /

User-Agent: OmniExplorer_Bot
Disallow: /

User-agent: *
Disallow: /images/
Disallow: /catch/
Disallow: /gallery-ink/
Disallow: /nota/
Disallow: /stats/
Of those, /images gives you a 403, /catch no longer exists, /gallery-ink has been renamed, /stats is really old and the /nota directory will ban you. Really it will – it’s there to catch bad bots. Anyway, the point is that the directory I mean is fairly secure. You can’t find it.
Now to my point. I have a lot of information I need to keep, and keep safe. It’s backed up here, but I’m thinking of an online backup too. A wiki. But how to keep that safe, given there may well be a link somewhere – it would take just the one for the bots. So if I have cpanel information and blog logins for people – which I do, and it’s a lot of people (and no, it’s not FreshlyPressed clients) – how can I keep them safe?
I like the .htaccess approach with passwords like g&unbj8[_1-7Xa, but I regularly see people claim that such files are easy to crack or bypass and offer little security. Do they?
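For what it’s worth, the usual advice is that .htaccess protection is only as weak as its surroundings: the password file should live outside the web root, the hashes should be bcrypt rather than the old crypt default, and the site should be served over HTTPS, because Basic Auth otherwise sends the password in (effectively) cleartext on every request. A minimal sketch – the paths, realm name and username here are placeholders, not anything from this site:

```apache
# .htaccess inside the directory to protect
AuthType Basic
AuthName "Private area"
# Password file kept OUTSIDE the document root so it can never be served
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The matching password file can be created with Apache’s htpasswd tool, using -B to force bcrypt hashing: `htpasswd -cB /home/example/.htpasswd someuser` (the -c flag creates the file; omit it when adding further users).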
If I were to store your cpanel / blog logins on a site of mine, what would be your chosen method of protection? It’s got to be easily usable should I need it. If I said “here is the link, and cracking it would get you information”, how would you want it protected? How would you protect it?
I’m forever telling people that once it’s on the net you must assume it’s there for the world to see, and you cannot whinge if that actually happens, so I suppose I want to see if I can have my cake and eat it.
What’s the best way I can protect my information?