A Brief Word to Bloggers and Social Networking Participants Everywhere…Part 2

Here are a few rules that would go a long way to clean things up:

  1. Ask yourself: If I ran across this post without knowing the author, would I care?
  2. Is this adding value to the wealth of information or the organization of the web?
  3. No personal, isolated stories included simply for the sake of telling them.
  4. Did some thought and effort go into the composition and actual text of this entry?
  5. Are any scientific or concrete claims/statistics cited? You may have a great point, but if it’s based on stats that, for all I know, you arbitrarily dreamed up yourself, I don’t care.
  6. Make connections between your article and other related or available info online. Nothing exists in isolation and every story benefits from some context. Even quote related material from other sources.
  7. Copying someone else’s work, however, without a citation is not okay. Period.
  8. Avoid isolated anecdotes about your personal life.
  9. When commenting on someone else’s contributions, do so constructively. Far too often we see a battle of words between conflicting voices, which does nothing for the readership but polarize and interrupt valuable discussion.
  10. Finally, seek out stories from less popular sources. There is so much interesting stuff out there that gets lost in the shuffle. These things deserve a little extra publicity and will break up the monotony of alternating links to Lifehacker and Engadget.

Consider each of your contributions not as a soapbox from which to rant about trivial things in your own life or your selfish goals, but as something offered for the benefit of others around you. Try not to pollute this thing we all depend on and love so much.

A Brief Word to Bloggers and Social Networking Participants Everywhere… Part 1

There are a few classes of pages on the web:

  1. Those from which I can glean some value, as an anonymous visitor, and
  2. Those about which I couldn’t care less, essentially consisting of a glorified journal entry by someone, somewhere, who assumes the rest of us care.

The web is the “wild, wild west” of information; a haven for free speech and expression as an open medium for the exploration and sharing of ideas and knowledge…

…But come on people. Those billions of “articles” from category 2 are hurting the greater good and dragging us all down into the shady realm of shoddy writing.

If you want a centralized place for friends to catch up on your recent activities, that’s fine. Just don’t publish to directories, and remove yourself from Google. Believe it or not, most of us don’t actually care about your fender-bender this afternoon or the lousy guy you just spent the evening with.

We need an internet filter designed to cut through the crap and intercept the masses of junk floating around on the web before it can distract us from what we’re actually seeking.

The team working on StupidFilter has a good start on this problem in the context of comments:

…an open-source filter software that can detect rampant stupidity in written English. This will be accomplished with weighted Bayesian or similar analysis and some rules-based processing, similar to spam detection engines.
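The weighted Bayesian approach they describe is the same idea behind spam filters: score each word by how often it appears in good versus junk text, then pick the more likely label. A minimal sketch in Python (the training snippets are invented for illustration, not StupidFilter’s actual corpus):

```python
import math
from collections import Counter

def train(docs):
    """Count word frequencies and document totals per label from (text, label) pairs."""
    counts = {"ok": Counter(), "junk": Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label with the higher log posterior, using add-one smoothing."""
    vocab = set(counts["ok"]) | set(counts["junk"])
    best_label, best_score = None, float("-inf")
    for label in counts:
        # log prior from document counts
        score = math.log(totals[label] / sum(totals.values()))
        denom = sum(counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

docs = [
    ("omg lol this is so random", "junk"),
    ("lol omg u r so dumb lol", "junk"),
    ("the article cites its statistics carefully", "ok"),
    ("this analysis adds context and cites sources", "ok"),
]
counts, totals = train(docs)
print(classify("lol omg so random", counts, totals))          # → junk
print(classify("the article cites sources", counts, totals))  # → ok
```

Real systems add rules-based processing and far larger corpora on top of this, but the core scoring is no more mysterious than that.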

Drop a note in the comments with any ideas on how to clean this all up.

GoDaddy Applications

I’ve been hosting this website on GoDaddy for five months now and I’ve been incredibly happy with the experience. Server uptime has been reliable, the administrative interface is smooth and easy to use, and at $4.95/month, the price is right.

Something I hadn’t realized until this morning, though, is the suite of applications they offer. Open source apps and frameworks such as WordPress, Moodle, Drupal, and Joomla, plus a collection of content management systems, wikis, and eCommerce solutions, are all available from GoDaddy with every hosting account.

Once logged in to your account, you need only enter a few bits of information, such as an admin login/password and the location within the domain where each app should reside. GoDaddy takes care of the rest and lets you know when the process is complete. Of course, each of these apps is something you could install and configure yourself with some work. But where else are you going to find a system that gets a WordPress blog or a Moodle server set up correctly and running in a sweet five minutes?

GoDaddy provides an all-around great web hosting resource for everyone, from basic users making family websites, to advanced users responsible for large servers and powerful applications.

A Few Web Development Tips to Secure Your Web App

This is somewhat old news (the original post was written in November 2006) but it’s a really good collection of simple things we can all keep in mind to avoid gaping security holes as we develop new web applications. Included are reminders about the vulnerabilities in browsable directories, plain-text variables, and freely visible web stats from tools such as Webalizer (noted security vulnerabilities).

A few things I would add to the list include warnings about cross-site scripting vulnerabilities involved in careless variable passing (explanation here), SQL injection in forms, and predictability in the locations you place important files (system, admin, or install files). All of these items open doors for malicious outsiders to learn how your system is set up and find ways to exploit it.
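On the SQL injection point, the fix is to never build queries by pasting user input into the SQL string; use your driver’s parameter placeholders instead. A small illustration in Python with sqlite3 (the table and credentials are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # DANGEROUS: user input is pasted straight into the SQL string
    query = ("SELECT * FROM users WHERE name = '%s' AND password = '%s'"
             % (name, password))
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Placeholders make the driver treat input as data, never as SQL
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# A classic injection string bypasses the unsafe check...
evil = "' OR '1'='1"
print(len(login_unsafe("alice", evil)))  # → 1 (logged in with no password)
print(len(login_safe("alice", evil)))    # → 0 (rejected)
```

The same placeholder mechanism exists in essentially every database driver, including ColdFusion’s cfqueryparam.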

You may think you’re safe from hackers, spam or any sort of malicious activity, but this stuff happens all the time to all sorts of websites. All it takes is a small hole and a script-kiddie with too much time on his or her hands to turn your hard work into a nightmare.

Michael Sutton’s Blog: Top 10 Signs You Have an Insecure Web App

ReCAPTCHA Putting Website Users to Work

ReCAPTCHA puts a new spin on the typical ‘Completely Automated Public Turing test to tell Computers and Humans Apart’ (CAPTCHA).

CAPTCHA describes a technique for keeping spammers and bots out of a website by means of an image of scrambled letters that a human user, but not a machine, should be able to recognize. ReCAPTCHA combines a solution to this common problem with a way to help scan and make digitally available the vast wealth of print media in circulation.

How it works: users are shown a background image with two words printed over it. One word is already known to the computer; the other is one it was unable to recognize when scanned off the page. The user types in both words and is granted access if the known word is correct. The project benefits because each time someone verifies him or herself this way, another previously unrecognized word is added to its digitized version of that text.
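The key is that only the known word is actually graded; the other answer is harvested as a vote toward deciphering the scanned word. A toy sketch of that logic in Python (the words and the vote store are placeholders, not ReCAPTCHA’s real implementation):

```python
import random

# One word the system already knows; answers to OCR-failed words are collected.
known_words = {"overlooks": "overlooks"}  # challenge word -> accepted answer
unknown_votes = {}  # OCR-failed word -> list of human transcriptions

def make_challenge():
    known = random.choice(list(known_words))
    unknown = "morpheus"  # stand-in for a word the OCR could not read
    return known, unknown

def verify(known, known_answer, unknown, unknown_answer):
    """Grade only the known word; harvest the answer for the unknown one."""
    if known_answer.strip().lower() != known_words[known]:
        return False
    unknown_votes.setdefault(unknown, []).append(unknown_answer.strip().lower())
    return True

known, unknown = make_challenge()
assert verify(known, "overlooks", unknown, "morpheus")
print(unknown_votes)  # once enough votes agree, the transcription is trusted
```

A real deployment would require many independent agreeing answers before trusting a transcription, but the division of labor is exactly this.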

Interesting Security Exploits

Talking with my good friend Chris Mueller this afternoon, we stumbled across an article about a cross-site request forgery vulnerability (often lumped in with cross-site scripting) that’s pretty widespread on the internet.

The general background is that many, many dynamic websites, probably including this one, use forms or variables in the URL of the page to communicate information from one page to the next. This includes things like login information, page choice, and virtually any link that changes over time. (Mousing over the “Most Recent Entries” links at the right gives ….?entry=entry839402874 and such.)

This is exploitable because, though my password might be unguessable, once I’m logged in to the admin or user-privileged portion of a site, an attacker can get my browser to do their work for them. They do this by embedding an image tag on a page whose source points at one of the site’s privileged URLs. Because I’m already logged in on that site, when the page loads the URL inside the tag the request goes through: changing database entries, deleting pages, or even adding users with administrative privileges.

If only the vulnerability stopped there….

Unfortunately, it’s also possible to create a form with default values that automatically submits itself to your password-protected page when the hosting page loads. Just as the URL-based hack works, this tricks the site into thinking it’s the authenticated user making the changes and gives an attacker essentially free rein if they know what they’re doing.

Now, this isn’t something that will just randomly happen to anyone: it relies on the attacker deliberately planting these image or form tags on a page and getting a logged-in user to visit it. Nevertheless, it’s striking how many seemingly secure sites are open to this type of attack.

What can you do? SESSION variables cannot be spoofed the way these POST and GET variables can, so using them protects you against this kind of issue. Additionally, a variety of frameworks, such as Xaraya, and other tools generate authentication codes to verify that a submitted form really came from a safe location.
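One common form of those authentication codes is a per-session CSRF token: the server stores a random value in the session, embeds it in every form it serves, and rejects any submission that doesn’t echo it back. A hostile page can forge the form fields but can’t read the token. A framework-free sketch in Python (the session store and URLs are simplified for illustration):

```python
import hmac
import secrets

sessions = {}  # session id -> session data (stand-in for real session storage)

def start_session():
    sid = secrets.token_hex(16)
    # The token is random and bound to this session; an attacker's page
    # cannot read it across origins, so it cannot forge a valid submission.
    sessions[sid] = {"csrf_token": secrets.token_hex(16)}
    return sid

def render_form(sid):
    token = sessions[sid]["csrf_token"]
    return ('<form method="post" action="/delete">'
            f'<input type="hidden" name="csrf_token" value="{token}">'
            '</form>')

def handle_post(sid, submitted_token):
    expected = sessions[sid]["csrf_token"]
    # compare_digest avoids leaking the token through timing differences
    if not hmac.compare_digest(expected, submitted_token):
        return "403 Forbidden"
    return "200 OK"

sid = start_session()
print(handle_post(sid, sessions[sid]["csrf_token"]))  # → 200 OK
print(handle_post(sid, "guessed"))                    # → 403 Forbidden
```

The same pattern is easy to implement in ColdFusion by stashing the token in the SESSION scope and comparing it against a hidden form field.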

It’s a crazy crazy world out there…

SQL Server Migration

So I just want to say, for the record: if you’re ever moving SQL databases, use a migration and synchronization tool.

I tried shutting down the servers and copying the data over. I tried using SQL queries to do it. But no matter how careful you are, something gets corrupted, or some constraint or dependency gets lost. I ended up using a free, time-limited but fully featured version of Red Gate SQL Professional Tools, which made the process incredibly smooth and safe.
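Whatever tool you use, it’s worth a cheap sanity check afterward, such as comparing row counts per table between the old and new databases. A sketch in Python, with sqlite3 standing in for the real servers (table names and data invented for the demo):

```python
import sqlite3

def table_counts(conn):
    """Row count per user table: a cheap post-migration sanity check."""
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    # f-string is fine here because names come from the catalog, not user input
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

old = sqlite3.connect(":memory:")
new = sqlite3.connect(":memory:")
for db in (old, new):
    db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
old.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(100)])
new.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(99)])

print(table_counts(old) == table_counts(new))  # → False: a row went missing
```

Row counts won’t catch corrupted values or dropped constraints, which is exactly why a purpose-built synchronization tool beats hand-rolled copies.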

Oh, the things I wish I’d known before I began this endeavor.

AJAX Permissions Error

So you’re going along nicely with a small AJAX application and you run into this:

Error: uncaught exception: Permission denied to call method XMLHttpRequest.open

Browsers restrict XMLHttpRequest calls to the same origin as the page, to avoid security issues that would let JavaScript make calls to hostile servers. This can bite you even within your own server if you’ve hard-wired a path to the query page and the domain was entered differently (http://www.mypage.com vs. http://mypage.com). Use relative paths; they make everything more pleasant.
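The same-origin check compares scheme, host, and port exactly, so www.mypage.com and mypage.com count as different origins even though they serve the same site. A quick Python illustration of why the hard-wired URL fails while a relative path always matches (URLs are made up for the demo):

```python
from urllib.parse import urljoin, urlsplit

def origin(url):
    """Scheme, host, and port: the triple the browser compares."""
    parts = urlsplit(url)
    return (parts.scheme, parts.hostname, parts.port)

page = "http://mypage.com/app/index.html"
hardwired = "http://www.mypage.com/app/query.cfm"

print(origin(page) == origin(hardwired))  # → False: the request is blocked

# A relative path is resolved against the page's own URL, so it always matches
relative = urljoin(page, "query.cfm")
print(relative)                           # → http://mypage.com/app/query.cfm
print(origin(page) == origin(relative))   # → True
```

That is why switching the hard-wired URL to a relative path makes the permission error disappear.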

Manually Set a 404 Header

So this is one of those posts where I blog something to remember it for that inevitable day when the same problem will plague me years down the road.

The issue is that Google doesn’t like custom error pages and keeps them in its index without ever establishing that they are, in fact, bad pages that drag down the overall quality of a website. For a dynamic, database-driven site, the information inevitably changes over time. Generating pages that forward to the new version, or give an informative error message, helps visitors stay on track without getting too terribly lost. The following line of code tells Google that, whatever happened to the page it’s looking for, it’s not there now.

<cfheader statuscode="404" statustext="File Not Found">

As long as you’re at it, you might as well have the page send you a message about the error so you can hunt it down and fix it for future users.

<cfmail to="sample@email.com" subject="Error: Outdated or Invalid link" from="sample@sample.com" type="html">
Someone just encountered the following error:

#arguments.errorText#

Browser: #cgi.HTTP_USER_AGENT#

IP: #cgi.REMOTE_ADDR#

Offending Script: #cgi.SCRIPT_NAME#

Query Info: #cgi.QUERY_STRING#

Referring Page: #cgi.HTTP_REFERER#
</cfmail>

ColdFusion Hosting Frustrations

I am one of the first to tout the exceptionally efficient coding shortcuts of ColdFusion; in most cases it makes the job of a webmaster much easier. All of that is true when the web server is running as intended. Lately, however, I have come up against a suite of problems keeping the server going without hitting all sorts of JRun errors.

I host a personal ColdFusion photo gallery with Host Department, an online hosting company, and several times a week one of the other users on the same shared server crashes it, taking it down for everyone. I’ve also been working on another ColdFusion-driven website for a non-profit organization. In this case it dies about once a week without explanation or any note in the logs. If the server simply restarted, that’d be one thing, but it stalls and can’t do anything to fix itself automatically.

Now, admittedly, each of these problems can be fixed by whoever is in charge of the server, and it’s part of my job to set things up so they don’t break in these ways. That said, with the number of posts on this issue floating around the web, you’d think a more robust implementation would be possible.

ColdFusion is supposed to be simple and easy while retaining its power and flexibility. Dealing with server issues like these strikes me as neither simple nor powerful, and it sends me looking for more stable alternatives.