TechBlog

Cisco.com's lost a letter...

Posted on 25 September, 2008 by maximinus in Web design, Web development
And by a letter, I don't mean a single character - I mean all instances of the lower-case letter "t" in the HTML source of the site.  This includes all HTML tags and their attributes (names and values) - leaving the site without styles and JavaScript:

Here's the top of the page:

Update: Cisco seem to have fixed the issue - whatever it was.

No comments have been posted on this entry. Click here to post a comment.

Slightly Stricter Scripter

Posted on 6 August, 2008 by maximinus in JavaScript, Web development
A while back, I started writing a web app, which still hasn't quite made it to public availability - but which I use myself, even in its incomplete state.  I'd known since the first public beta of Firefox 3 that the app didn't seem to work at all in that browser - but I was never terribly bothered about why.

Tonight, I decided that it was finally time (since I now use Firefox 3 on my laptop, desktop PC and work PC) to investigate the cause of the problem.  For some reason, nothing was rendering at all once I logged in to the app - I was left with a white page.  There's some static HTML content which should have been displaying, even if there had been JavaScript errors - which Firebug reported none of.

I started using Firebug to poke around, and I noticed that the static HTML content was indeed being returned in the HTTP response - but for some reason was not being rendered.  I looked a little more closely, and noticed that I'd been lazy with one of my script tags, and closed it as a one-sided tag (<script src="..." type="text/javascript" />) - which is technically invalid, but which Firefox 1.5 and 2, Opera and even IE6 (I haven't actually tried it in IE7) didn't seem to have a problem with.  Firefox 3, on the other hand, handles this error in a rather odd (and annoying) manner: it eats the rest of the page content.

What I mean by this is that the page content totally disappears from the DOM - it appears that it first treats everything which follows the malformed script tag as the tag's body - but then downloads the external .js file and replaces the tag's body with its contents, thus losing the rest of the page.  If, like in my case, the malformed script tag is in the <head> section of the page, you'll simply get a white screen, since the entirety of the <body> section is lost in this way - along with any onload events defined in the opening <body> tag (hence the lack of JS errors).
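For anyone hitting the same white screen: the fix is simply to give the script element an explicit closing tag, since HTML does not allow <script> to be self-closing (the filename here is a stand-in, not from my app):

```html
<!-- Broken: in Firefox 3, the self-closed tag swallows everything after it -->
<script src="app.js" type="text/javascript" />

<!-- Fixed: script elements always need an explicit closing tag -->
<script src="app.js" type="text/javascript"></script>
```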
Currently listening to: Coldplay - White Shadows
No comments have been posted on this entry. Click here to post a comment.

smtp2web

Posted on 9 June, 2008 by maximinus in Web development
After introducing me to Google App Engine upon its release, Arachnid has just released a nifty tool designed with App Engine in mind - smtp2web, an SMTP to HTTP gateway.  If you're looking for a quick and easy way to allow your web application to receive emails, look no further - you can either set up a single @smtp2web.com email address, or point an entire domain's MX at smtp2web, and it will POST the messages it receives to your site for processing.  You don't have to use App Engine to use smtp2web, but it's very simple to use smtp2web with App Engine apps...
One comment has been posted on this entry. Click here to view.

CAPTCHA'd

Posted on 23 May, 2008 by maximinus in Web design, Rant, Web development, Interface design
I'm sure you're all familiar with CAPTCHAs - those annoying things which require you to type in a bunch of letters and numbers from an image which tries to make it difficult for computers to read said characters.  I've seen several different varieties - ranging from fairly basic text (which doesn't do much to hamper OCR efforts), through ones which use really curly fonts, lines through the text, shapes intermingled with the characters, ones which have a picture of a cat or a dog on each letter, asking you to enter all those with a cat or all those with a dog.

The original idea behind CAPTCHA systems was good; I agree with it in principle.  However, spammers have found ways around many of them, including "data entry business opportunities" - which they send spam about, luring people to fill in CAPTCHAs for them, for use in submitting spam to websites - or, alternatively, simply getting them to post the spam.  Unfortunately, the trend seems to be to make the CAPTCHAs harder to read, which doesn't cut down on this as much as it would if the spammers were simply using OCR to attempt to decipher them.  All it does is make it more of a nuisance for legitimate users of the system.

My latest struggle with such a system involved an attempt to sign up for a forum account - which took me three goes.  Each attempt involved trying to decipher a CAPTCHA image, which was so poorly done that 5 and S were utterly indistinguishable (perhaps they would have been distinguishable had I seen both at once; however, I only ever saw one - seemingly the same one - in each of the three instances, and I can't remember which it turned out to be).  Not only that, but I had to enter my desired password twice on each attempt, and answer a (fairly straightforward, although sometimes slightly ambiguous) question, which, like the CAPTCHA, changed each time.  I was so frustrated with this that I would have completely given up on registering if it had failed me one more time.

The biggest problem with this was not even that the CAPTCHA was unclear - it was the fact that I had to attempt not only a new CAPTCHA, but a new human verification question each time (despite having passed the first and second), and re-enter my password a further two times per attempt.  Some systems offer an audio alternative to the image; this option was also missing from this particular system.  Without it, even if I could decipher all the other characters in a given CAPTCHA, if it had an S or a 5 in it, I had a 50% chance of failing.
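To put a number on that 50%: assuming each image contains exactly one ambiguous S/5 and the attempts are independent (both assumptions mine, purely for illustration), the arithmetic looks like this:

```javascript
// Odds of passing a CAPTCHA whose only hard part is one ambiguous S/5
// character: effectively a coin flip per attempt.
const pPass = 0.5;                                   // one attempt
const attempts = 3;
const pFailAllThree = Math.pow(1 - pPass, attempts); // 0.125
const pPassEventually = 1 - pFailAllThree;           // 0.875
```

So even with three goes, one coin-flip character fails one registration in eight - and that's before the changing questions and re-typed passwords.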

I wonder just how many people give up on posting a comment, registering an account or performing some other action on a website, simply because they can't decipher a CAPTCHA image?

EDIT:
Oh, one more thing - Sam Ruby raises a good point, which is highly related to my recent experience - when you've verified that somebody's a human, remember it!  Sure, expiry is probably a good thing - re-check periodically.  But presenting three different questions as well as three different CAPTCHAs and requiring me to type (and thus send via unencrypted HTTP) my password six times in order to register, simply because one of the two forms of human verification is poorly designed?  That's just overkill.

UPDATE FOR TAGGED.COM USERS:
Several people have posted comments stating that they are having problems with a "captcha fail limit exceeded" error on tagged.com.  I have removed all these comments as nobody was getting anywhere.
Tagged.com's help section states that the problem has been fixed - click here for more details.
If you are still experiencing this problem, you'll have to try contacting Tagged's support - click here.  I cannot provide any further help or information, as I am not a user of Tagged.com or in any way associated with it.
No comments have been posted on this entry. Click here to post a comment.

POSTing cfajaxproxy

Posted on 1 April, 2008 by maximinus in JavaScript, Web development
On a similar tack to the URL length limit mentioned in my post about the moonwalking kiwi, today I discovered a minor issue with cfajaxproxy, along with the (very simple) solution.

By default, cfajaxproxy will use HTTP GET requests to interface with the server, which is all fine and dandy if you're just passing in an ID or two to the function, and expecting data back from the server.  However, if you're using the proxy to send data back to the server, you'll probably want to set the proxy to use POST rather than GET - otherwise you run into length limits imposed not only by the browser (and its XMLHttpRequest object), but by the web server.  IIS in particular seems to have a lovely habit of returning cryptic 500 errors complaining about bad verbs.
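A rough, self-contained illustration of why large payloads break GET (the 2,083-character figure is IE's well-known URL limit; the CFC path and parameter names here are made up):

```javascript
// With GET, every argument has to fit in the URL itself.
function toQueryString(params) {
  return Object.keys(params)
    .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
    .join('&');
}

// e.g. a form with a long free-text field
const bigPayload = { id: 42, notes: 'x'.repeat(3000) };
const url = '/cfcs/bar.cfc?method=save&' + toQueryString(bigPayload);

const IE_URL_LIMIT = 2083;                    // IE's documented maximum URL length
const tooLongForGET = url.length > IE_URL_LIMIT; // true - this request would fail
```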

As I said, once you've found out how to do it, it's very simple to make a proxy use POST.  Once you've created the proxy class using the <cfajaxproxy> tag, you instantiate it as per usual, and then use its setHTTPMethod function:
<cfajaxproxy cfc="yourapp.cfcs.bar" jsclassname="Bar">
<script type="text/javascript">
var foo=new Bar();
foo.setHTTPMethod('POST');
</script>
You can then proceed to use the object ('foo' in the example above) as before, without having to worry about exceeding the URL length limit(s).
No comments have been posted on this entry. Click here to post a comment.

Invitation to Spam

Posted on 12 March, 2008 by maximinus in Rant, Web development
Running ShrinkThisLink, a free link shrinker, isn't quite as easy as you might expect.  As I mentioned recently, I commissioned a new spam detection system in order to try to pick up on links being used in spam without needing anyone to report the spam.

So far, this seems to be working quite well.  One way I can tell that it's working is that the number of people emailing me in relation to spammed links has increased.  You may be wondering right now how such an increase can be a good sign.  If you are, you evidently haven't run a website which spammers attempt to (ab)use.

I endeavour to reply to every single legitimate email sent to ShrinkThisLink - by hand, not with any kind of automated system.  Unfortunately, some people not only fail to recognise the email as spam and discard it, but proceed to click the links contained within.  When faced with a page informing them that "the link you have attempted to view has been blocked due to spamming or other abuse" and providing them with an email address to contact "if you believe this is in error," some of these people then proceed to contact that address and ask for further details on the spammed offer.  Surely replying to the email would make more sense?  I've even had some who forward the spam on, which, I will admit, is a step up from the people who complain about a link being blocked, but don't actually mention what the link in question is.


Between the reports of spam and the spam that's been forwarded in requests for further information on the "fantastic offer," it has become evident that spammers have realised that they really don't need to set up a mail server (or compromise one) in order to send their spam.  A disturbing trend is to use Yahoo! Groups invitations as a medium for spamming.

Yahoo! Groups invitations can be sent to any email address, and can contain text (including links) specified by the person sending them.  Spammers are taking advantage of these two facts to point people at websites completely unrelated to Yahoo! Groups en masse.  Yahoo! don't seem to be willing to do anything at all about this problem - I have personally reported several sets of Groups invitation spam, and have seen no evidence of them taking any action whatsoever.

My suggestion to Yahoo! - and indeed to anyone who currently offers an "invitation" system which allows the user to enter email addresses and arbitrary message content - is quite simple: change your approach.  I understand that the concept of inviting people to the website can be useful; however providing a form which accepts whatever the user provides, slaps it in an email and sends it to whatever addresses that same user provides, is the wrong way to go about it.  If you want to provide an invitation system, give the user a system which generates invitation links - either time-limited or single-use links.  Make the user send the emails themselves.  If they're genuinely trying to invite people (who they know) to the site, they'll be happy to send the links themselves (either via email or another form of communication such as posting the link on a blog or website, or sending it via instant message).  Spammers won't be so interested in the invitation system, though, since it won't actually benefit them in any way.

If, for some reason, you really think you need to keep the email-sending system, do not allow URLs in the message content (or, alternatively, don't even let the user edit the message).  If it's an invitation to a website, the website sending the email should automatically add the invitation link - and that should be the only link necessary (except perhaps a "don't send me these annoying invitation emails in future" link).  Please stop inviting spammers to send as much spam as they like through your site for free.  This means you, Yahoo!.
No comments have been posted on this entry. Click here to post a comment.

Moonwalk

Posted on 19 February, 2008 by maximinus in JavaScript, Web design, Web development
It's not all fun and games in web development.  Sometimes, clients request odd things, such as redirecting the domain name for a defunct Japanese site of theirs to another Japanese website.  We were given the URL to redirect to; upon clicking through to the site, we discovered that it was an equally dated site; not only are some of the links broken on certain pages, but there's an animated GIF of a kiwi in the corner of the site.

One of my colleagues pointed out that it looked as if the kiwi was moonwalking on the spot.  This gave me an idea to liven up the site - make the kiwi actually moonwalk across the page header.

Of course, since it's not one of our sites, we don't have access to the files, so I can't make it happen permanently.  What I can do, though, is use a javascript: address in the address bar to play around with the page once it's loaded.  622B 577B 477B 494B of JavaScript later, I have a moonwalking kiwi:
  1. Visit the site, and wait for it to finish loading.
  2. Grab the moonwalk JavaScript, paste it into your address bar (making sure it's all on one line) and hit enter.
  3. Enjoy.
You may have noticed that there are three crossed-out sizes; I decided to do a bit of optimisation and refactoring, bringing it down to 577B, then thought that while I was at it, I may as well tackle the IE issue.  This is where I discovered something odd:
Internet Explorer 6 will only allow 501 characters in the address bar before it fails.  It fails silently if you exceed this limit.  As a result, I started shrinking the code as much as possible, which involved a bit of code golf with a few guys in an IRC channel - we decided to call this "JavaScript Hack Minigolf."  The end result was a lean 477 characters.

I also worked out what was happening in Opera - the version I've got installed (9.02) appears to, at least by default on Windows, stop the script running after a set period and return things to their initial state.  It's apparently fine in 9.24 on Linux, though.

I've now discovered that IE wasn't aligning the inserted div properly, and the fix has brought the code back up to 494 characters - thankfully still within IE6's limit.  I've also found that IE7 doesn't have the 501-character limit (I haven't checked whether there's a higher limit or no limit at all).
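The size constraint is easy to check mechanically; here's a sketch of the kind of check we were golfing against (the bookmarklet body below is a stand-in, not the actual moonwalk code, and whether IE6's 501 characters include the javascript: prefix is my assumption):

```javascript
// IE6's silent address-bar limit turns bookmarklet size into a hard budget.
const IE6_LIMIT = 501; // characters IE6 accepts in the address bar

function fitsInIE6(code) {
  const bookmarklet = 'javascript:' + code; // assuming the prefix counts
  return bookmarklet.length <= IE6_LIMIT;
}

const tiny = 'void(document.title="kiwi")';
const fitsTiny = fitsInIE6(tiny);            // comfortably under budget
const fitsHuge = fitsInIE6('x'.repeat(600)); // silently fails in IE6
```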

Thanks to Arachnid for playing a round of JSH Minigolf with me, and to Pic, silentcoder, langly and warfreak2 for their moral support.
Currently feeling: Devious
Currently listening to: Roads - Roadrunner United
No comments have been posted on this entry. Click here to post a comment.

Googlebomb?

Posted on 15 January, 2008 by maximinus in Rant, Web development
After 40,000+ pageviews and several inbound links, my posting here a couple of days ago entitled "Has Google gone MAD?" managed to plant itself in some very high positions in Google search results, including the #1 spot for the search term "Google gone mad" - which I thought was quite neat:


This evening, however, I went to show somebody that I was, in fact, top of the search results for "Google gone mad" - only to find that I wasn't.  I hunted through the first couple of pages... no sign of my blog... I went through the entire top hundred - and still nothing.  Then I searched for the entry directly by URL - and sure enough, it's not there.

I can only surmise that it's been removed from the Google index because it's been detected by Googlebomb prevention systems - gaining several inbound links and shooting to the #1 spot within a few short hours does seem somewhat suspicious, I guess.

This does, however, seem like something which could all too easily be abused; if people could previously perform googlebombs such as the infamous George W. Bush "miserable failure" bomb, surely it wouldn't be hard for people with malicious intent to similarly bomb a third party's website, thus getting it completely removed from Google's index.  The impact is not as high as it could be, since it only seems to affect the single URL, not the entire domain; however, I think it quite likely that there's some kind of threshold at which the entire site would be blocked, i.e. after a certain number of bombs on the one site, it would be assumed that the entire site is up to no good.

Update (16 January):
This evening, I checked again, and found that my top position has been reinstated.  This leaves me wondering quite what has been happening...

Update (21 January):
For the past few days, I seem to have lost the top spot again (for "google gone mad") - I can't find this site anywhere in the first few pages of results.  However, it does seem to be top for "google gone mad techblog" - so it's back in the index, but has completely lost its ranking for "google gone mad" - which seems to indicate that this effect could still be used to kill specific keywords / key phrases for any given website.

Update:
My blog seems to have settled again in top spot for the original phrase "Google gone mad" - so perhaps the detection systems respond to the new link rate dropping by slowly returning the page's rankings.  Sustained new link generation, on the other hand, could possibly keep a page out of the rankings on a more long-term basis; however if the rate is calculated based on number of unique websites (rather than webpages) which link to the site, it may prove difficult to sustain the required rate.
No comments have been posted on this entry. Click here to post a comment.

ColdFusion 8 AJAX

Posted on 22 October, 2007 by maximinus in Web design, Web development, Interface design
I've lately been playing around a bit with the new ColdFusion 8 AJAX and related stuff.  There are a few different things I'd like to touch on here:
cfajaxproxy
This one's a really great idea - basically, with one tag, CF8 will automatically create a JavaScript class which mirrors the CFC you name.  It acts as an interface for the CFC - you create an instance of it in your JavaScript, and then call the (public, remote-access) methods from the CFC as if from ColdFusion.  It returns whatever the CFC's method would return.

I'm using this to AJAX-enable a static calendar (based on Randy Drisgill's Simple ColdFusion Calendar).  I fixed the year bug (my way - see my comment on Randy's blog post), converted it to a method in a CFC, changed it to use DIVs instead of tables and generally tidied things up, rewriting some chunks of it.  I then made it so that it doesn't require reloading the page, but instead uses JavaScript to swap out the calendar for a whole new one.  I've yet to make it also display information in the day DIVs, but that will be coming soon.

More at CF8 Livedocs.


cfwindow
This one's also kind of neat; it allows you to easily create a draggable, closable 'window' on the page.  You can either have it show on page load, or use JavaScript to trigger it to pop up whenever and however you want.  You can either specify content within the cfwindow tags, or specify a source file - if you specify a source file, it'll load that page using AJAX and shove its contents into the 'window' - no, not using an iframe, but actually using AJAX.

I do have a couple of gripes with it, though:
  • It's not the easiest thing in the world to style, and the default styles are a bit crappy - especially with some colours behind it, which can make some of the outer lines 'disappear' and thus make the box look a bit odd; and
  • It doesn't have a 'minimise' button.  It's based on Ext JS - which does have a 'minimise' button, which shrinks the 'window' to its titlebar.
I'm really not sure why there is no way to add the minimise button in the cfwindow tag; it'd be rather handy.  I'll have to have a play and see if I can add it by gaining access to the underlying Ext JS window object.

More at CF8 Livedocs.


cflayout / cflayoutarea
Okay, so I've only actually used the tab layout so far.  But even so, I've discovered a few cool things and a few annoying things.  First off, it's really easy to use these tags; just nest a new cflayoutarea within the cflayout tag to add a new tab.  Just like cfwindow, you can either specify the contents of each tab between the cflayoutarea tags, or use the 'source' attribute to specify a file which will be loaded in using AJAX.  With both cfwindow and tabs, you can also use refreshOnActivate (refreshOnShow for cfwindow) to tell it to fetch a new copy of the contents (when using 'source') when opening the window / showing the tab.
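For reference, the basic shape of a tabbed layout looks something like this (a sketch only - the tab titles and source files here are made up):

```cfml
<!--- Two tabs, each loading its contents via AJAX from a 'source' page;
      refreshOnActivate re-fetches the tab's contents each time it's shown --->
<cflayout type="tab">
    <cflayoutarea title="Calendar" source="calendar.cfm" refreshOnActivate="true" />
    <cflayoutarea title="Settings" source="settings.cfm" />
</cflayout>
```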

Issues I have with the tabbed layout are:
  • The tabs are rather poorly styled by default; the top left corner of each tab in particular looks odd / broken.  Thankfully, Firebug showed me that it's relatively easy to restyle the tabs - I've made mine quite plain and simple at the moment;
  • The 'align' attribute of the cflayout tag relates to the tab content, not the tabs themselves - surely if I wanted to change the alignment of the content, I'd do it at the content level - or perhaps there could be an 'align' attribute on the cflayoutarea tag?  Tabs are stuck at the left hand side, unless you manually shift them with CSS as I have done; and
  • There appears to be some weirdness with regards to the height of the content area when it contains absolutely positioned elements.  I'll need to investigate this a bit further.
More at CF8 Livedocs - cflayout and cflayoutarea


There may be more later, but right now I can't think of anything to add.
No comments have been posted on this entry. Click here to post a comment.

Xtraordinary Complexity

Posted on 23 August, 2007 by maximinus in Web design, Rant, Web development, Interface design
Last weekend, Xtra performed an email system "upgrade" which resulted in their customers' mail being inaccessible for 24 hours.  This in itself is not necessarily bad - they did give warning that it was going to happen, and it's understandable that such things need to be done from time to time.  The problems begin with the length of the outage - a full 24 hours, with some customers having no access for longer than this.

The main problems, however, relate to the consequences of this "upgrade" - and begin with those who use a mail client (i.e. not webmail) to access their email.  As part of the upgrade, Xtra changed their SMTP server address - and added mandatory SSL.  They do inform you (oddly enough, when you're trying to get to the "upgraded" webmail - but also over the phone if you lie and tell them that it's not to do with the mail server "upgrade" and thus can get through to a person rather than pre-recorded messages) that the address has changed - but never make mention of SSL.  They list the new port (465) and address (send.xtra.co.nz) but completely fail to mention that SSL is now mandatory.

This is a problem for those users - like my aunt, who has been unable to send email for several days until I was able to visit and sort it out - who do not know a lot about computers, the Internet or email, who perhaps, like my aunt, have had their computer set up by family or friends and know how to use it but not how to configure it.  In the case of my aunt, she managed to work out where to change the settings - with some help - and changed the address and port.  However, since there had been no mention of it at all, she did not enable SSL - and so still could not send email.  Although I know a bit about mail servers etc, and in fact run my own mail server, I didn't immediately recognise port 465 as being the standard SMTP over SSL port.  How anyone else is supposed to work it out, I don't know - a bit of deft googling managed to turn up this article on the Xtra site which eventually mentions that you need to enable SSL.


The next problem, which is probably even worse than that one, relates to the web mail system.  It used to be a relatively simple process to get to and use their webmail system - but no longer.  Especially if you haven't used the new system yet.

First off, you need to use a modern browser.  If your browser isn't supported, it doesn't tell you - it just sticks you in a loop of signing in, clicking through to continue a couple of times, and then being returned to the login page.  Once you find a browser in which it works, you have to go through several steps of pointless nonsense, including downloading and installing a few bits and pieces relating to their new "bubbles" - this took a few minutes on my aunt's ADSL connection; I shudder to think how long that would take on dialup.

Once you've finally managed to register for the new system, you log in and end up on an overcomplicated, customisable start page.  When you eventually locate the "Mail" link, and you move your mouse over it, a new box "slides" out from under it to reveal a summary listing new messages - just how good this is, I'm not sure, as my aunt had no new messages, so there was a large box with a small amount of text swimming in it to that effect.  Clicking on the Mail link took us to the new webmail interface - which I didn't have a good look at, but didn't look terribly easy to use or particularly good.  I think it might be using the current Yahoo! mail system, but, not having a Yahoo! account myself, I can't verify this.


Then there's the entire concept of a social networking site.  I would imagine that their users would fall into two broad categories:
  • Those who, like my aunt, are not at all interested in this crap; and
  • Those who are interested in a social networking site, and, as a result, are already signed up to at least one of the plethora of other free social networking sites out there
Also - I haven't investigated, so don't know if this is entirely accurate - surely using this system would be somewhat pointless, as I'd assume that only Xtra customers can get a "bubble page" or whatever it is they're calling them.  Even if other people can sign up for them, will anybody who's not an Xtra customer do so?  I would suggest that the answer is almost certainly "no" - at best, a few people might sign up out of morbid curiosity.  This means that you're essentially restricted to networking with other Xtra users - whereas if you use any of the other social networking sites out there, you can network with anybody with access to the Internet.


I'll admit that their old webmail system was old and was begging to be upgraded or replaced; but this is not the way they should have done it.  What they've done is alienate a lot of users, confuse many more and just brass off the rest.  That's just those of their customers who actually use their Xtra mail, of course - the rest of their customers won't even care in the slightest.

I think I heard that Xtra were saying that "Bubble" was going to provide "an exciting range of new services that will change the way you use the internet" - I'd say that the only way it has changed the way that some people use the Internet is which provider they use it through.
No comments have been posted on this entry. Click here to post a comment.

New ShrinkThisLink Site Live!

Posted on 15 July, 2007 by maximinus in PHP, JavaScript, Web design, Web development
Finally, after months of on-and-off work, the new ShrinkThisLink site has gone live.  It features a new design, AJAX shrinking of links, instant link conversion using Javascript, an improved My ShrunkLinks feature (including link deletion) and more.  Hope you like it - your feedback is appreciated, either via comments on this blog post or by email (use the Contact link on ShrinkThisLink).
Currently feeling: Relieved
No comments have been posted on this entry. Click here to post a comment.

Squatters Galore

Posted on 27 January, 2007 by maximinus in Rant, Web development
Domain squatters, that is.  Squatters, campers - whatever you want to call them.

There's a site I'd like to develop, but I need a name for it.  A short, snappy, memorable, relevant name - that has an appropriate domain name available.  Every single idea that I or anybody else who's tried has come up with has been taken by domain squatters.  Some of them have the cheek to ask ludicrous amounts of money for these domains - such as one boasting a whopping 1 hit per month, which wanted USD$1650, with a minimum/reserve price of USD$1000.  Others are spammy search portals, and still others simply leave it to languish with no accessible website on them.

Why is this allowed to happen?  Why are these bastards allowed to register these domain names, and then not do anything of any use with them - especially those who are trying to sell them?  This is completely ludicrous, as it's really no different to buying a sandwich and then offering it to people for a hundred times the price - other than the fact that you can easily buy another sandwich which will do the exact same job, at the same price that the other person paid for theirs.

I believe that there should be some kind of regulation, perhaps administered by a body such as IANA, whereby the registration is revoked on any domain name which is registered and then offered for resale without a site first being established.  The same should also apply to any domain name which is used solely for spammy search portals, and perhaps also to domains which are not pointed anywhere (and not used for mail, etc.) for a certain period.  It would probably be far too labour-intensive to have this body check all domains, so it would be best run on a reporting basis: if you find a domain that you'd like to register, and it's taken by a squatter, you report it and they investigate; the registrant then has a chance to defend their right to the registration, and if they can't prove that they have legitimate cause to hold the domain name, it is revoked and the reporter may register it.

For crying out loud, even domainsquatters.com is taken - by a domain squatter - as are cybersquatter.net and cybersquatters.net.
No comments have been posted on this entry. Click here to post a comment.

Rank This!

Posted on 7 August, 2006 by maximinus in Rant, Web development
As if I didn't already know, I've just seen absolutely irrefutable proof that website ranking systems like Alexa are extremely inaccurate.  Because they only have access to data from people using their toolbar, their stats can be extremely skewed.

Today, for some reason, ShrinkThisLink (the free link shrinker which I created and run) has spiked to an Alexa "Daily Reach" of 10 per million users.  This makes absolutely no sense: according to Google Analytics and AWStats - two stats tracking systems which I use, and which both collect actual data (AWStats from log files, Google Analytics from JavaScript embedded in pages) - traffic on Saturday (which is the day the Alexa information is apparently for) was, if anything, BELOW that of previous days.  The only explanation I can come up with is that, for whatever reason, a higher proportion of those visitors were using Internet Explorer (which my stats do seem to confirm) with the Alexa toolbar installed.  It evidently doesn't take much to cause a reasonable amount of change in the Alexa statistics: average out the higher percentage of IE users against the lower number of overall users and you should end up with roughly equal numbers - demonstrating just how skewed these systems are.
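Some toy numbers (entirely made up for illustration) show how this can happen - panel-based "reach" only counts visitors who can run the toolbar, so a shift in browser mix moves the metric even when real traffic falls:

```javascript
// A panel metric only samples visitors who could have the toolbar installed:
// visitors * IE share * toolbar install rate among IE users.
function panelReach(totalVisitors, ieShare, toolbarRateAmongIE) {
  return totalVisitors * ieShare * toolbarRateAmongIE;
}

const friday   = panelReach(1000, 0.60, 0.05); // about 30 toolbar users
const saturday = panelReach(900,  0.80, 0.05); // about 36 toolbar users

// Real traffic dropped 10%, yet the panel-based figure went UP.
const measuredTrafficRose = saturday > friday;
```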
One comment has been posted on this entry. Click here to view.

ShrinkThisLink.com

Posted on 27 October, 2005 by maximinus in Web development
Well, I decided to get the domain name ShrinkThisLink.com and set up a free link shrinking service - no ads, either.

If there's a web page you want to share with people, but the URL is long and convoluted, simply paste it into ShrinkThisLink.com and hit the button, and it'll give you a nice short URL to give to friends etc - making it easier for those people to visit the page. It's especially handy if you want to put the URL in your MSN Personal Message, as there's no way to copy from the MSN PM to the clipboard, and links in it do not become clickable - so the URL has to be typed out to get to it.
No comments have been posted on this entry. Click here to post a comment.