Thursday, December 18, 2008

The End of P2P

Ok, that was a little bit of an exaggeration, but two years ago P2P dominated internet traffic, and now, while it still dominates upstream networks, P2P is starting to drop relative to real-time traffic. I can see a time, in the relatively near future, when P2P is a niche market in comparison to real-time traffic. This is not to say that P2P usage will drop in absolute terms, only that gaming, VoIP, online file storage, movies, and real-time news and sports are going to increase tremendously over the next few years.

What will this do to the Net Neutrality arguments? Real-time communications can be buffered for only so long; a web page can be delayed by 500 ms without too much complaint, and an email can be delayed by thousands of milliseconds without anyone noticing.

Thursday, December 11, 2008

The Importance of Design

How important is design? My designer friends say it’s the most important thing. When I see how they take my wireframes and HTML mock-ups and turn them into pixel-perfect designs, I see the difference immediately. It changes what was a good idea into something "PROFESSIONAL."

But look at some of the most important, most influential, most profitable sites and you see terrible visuals and an irreverent approach to pixel-perfection: IMDB is an industry standard, but it's ugly as hell. Craigslist is ugly; and how about Google, eBay, and Amazon? The beauty of those sites is in their functionality.

Sunday, November 30, 2008

Great Australian Firewall

In an attempt to combat child pornography the Australian Government is proposing a mandatory filter of all internet traffic! There can be no doubt that sexual slavery needs to be attacked but this proposal is ridiculous. At first I thought this might have been made up but unfortunately it's very real.

For further reading:

Can Labor Implement Clean Feed
Filtering Pilot and ACMA Blacklist
Great Firewall of Australia

WIPA.org.au
GetUp.Org
NoCleanFeed.com
IIA.net

Sunday, November 23, 2008

Text Msgs are surpassing Voice

Nielsen Research bears out what many people think - that text messaging is soon going to supplant voice as the primary means of communication.

During the second quarter of 2008, a typical U.S. mobile subscriber placed or received 204 phone calls each month. In comparison, the average mobile customer sent or received 357 text messages per month — a 450% increase over the number of text messages circulated monthly during the same period in 2006.
In U.S., SMS Text Messaging Tops Mobile Phone Calling

At this point text has not supplanted voice, since one can't make a one-to-one comparison between text messages and voice calls. Still, one needn't be a statistical genius to see that if growth continues at the present rate, the volume of communication carried by text will soon vastly surpass voice.

The Nielsen data shows that phone calls have leveled off at approximately 200 per month, with texts up to 350 per month. Soon text and email messages will be the primary means of communication and voice messages will become a vestige of the past. Yay!! No more "call me" messages.

It must be pointed out that the Nielsen data pertains only to cell phone users, not to landlines. Is there still a use for landline phones? Maybe for some; maybe rotary phones are still useful for emergencies. But for day-to-day use, landlines are going the way of answering machines, VCRs, and cassette decks.

What does this mean for you, the business customer? Stop putting out alpha-mnemonic phone numbers based on the rotary/push-button keypad. Soon no one is going to be able to use them, let alone remember that 2 is ABC and 3 is DEF, etc.

Friday, October 24, 2008

The Birth of Computers

If you're interested in the history of technology then this video is for you. If you're a programmer and you're interested in the history of technology you will LOVE this.

Wednesday, October 8, 2008

CSS Tables

I don't usually make posts simply linking to someone else's work, but Smashing Magazine has an excellent article on CSS and Table Designs.

I'm glad to see that not everyone is a table hater. There is, after all, a place for tables.
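To be clear about what that place is: tables for genuinely tabular data, with CSS handling the presentation rather than layout hacks. A minimal sketch (the class name and data are my own, not from the article):

<table class="pricing">
  <caption>Hosting plans</caption>
  <thead>
    <tr><th>Plan</th><th>Storage</th><th>Price per month</th></tr>
  </thead>
  <tbody>
    <tr><td>Basic</td><td>5 GB</td><td>$5</td></tr>
    <tr><td>Plus</td><td>50 GB</td><td>$15</td></tr>
  </tbody>
</table>

<style type="text/css">
  .pricing { border-collapse: collapse; }
  .pricing th, .pricing td { border: 1px solid #ccc; padding: 4px 8px; text-align: left; }
  .pricing caption { font-weight: bold; }
</style>

Use a table for data like this; use CSS, not nested tables, for the page layout around it.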

Thursday, September 25, 2008

Opening up New Windows

According to Nielsen, the only thing the average user can be counted on knowing is that the back arrow cycles back a page. That's it. Therefore, when linking to another site don't open up a new window.

In general I agree with this rule of thumb. However, I disagree completely when it comes to external links from a blog. First and foremost, blog readers are already a self-selected user group. It's true that there are always new users to blogs who are unaware of the rules of blogging etiquette, but I think that few habitual blog readers are newbie computer/internet users. Now, that is a HORRIBLE claim for an IA person to throw out without corroborating data. I must admit I don't have any, so if anyone can point me to some, I would be very happy.

My hypothesis, still to be backed up by empirical data, is that the user experience is superior when links to external sites open in a new window. The reasoning is that it allows the user to experience and explore the new site without losing contact with the original source material. If the external source is not interesting, it is simple to close the window. If the new site is interesting, one can easily click through 10, 20, or more times, and returning to the original page would then be a more difficult task than simply closing the new window.
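For the record, the markup in question is nothing exotic; it's just the standard target attribute on the external link (the URL here is a placeholder):

<a href="http://www.example.com/some-article" target="_blank">an interesting external article</a>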

Tuesday, September 23, 2008

IE6 and IE6 Blocker

To what extent should we – as designers and site owners – stop supporting IE6? Perusing CSS-Tricks I came across the following software, IE 6 Blocker, which tells users that a particular site doesn't support IE6 and that they must upgrade to a newer browser. Now, I understand the frustration, but this is ridiculous. (See a previous post regarding the problems with IE 6.)

The first priority of all site owners and IAs is to make the content available to as many people as possible. If people want to use Lynx – well, let them. It is a basic violation of web conventions to tell users what browser to use; it is as bad as an accessibility or usability violation. I remember when IE first came out, Microsoft was thought of as a monopolistic entity about to consume the world, and many sites put up "Netscape Only" splash screens. It was a bad policy then and it is a bad policy today.

Additionally, IE6 still constitutes about 20% of the market. What sense does it make to cut 20% of your users off from your site? IE6 will be a force until corporations and government agencies upgrade to a newer platform. Only when large sites that cater to corporate clients, like CNN, stop testing against IE6 can I see smaller sites following suit.

The only way I can see dropping support for IE6 making any sense is if your web app requires scripts not available in old browsers. However this is a business decision that should not be made lightly.

A better idea, should you want to drop IE6 support, is to develop to web standards, then check IE6 for any major breakdowns that prevent people from using your site. After the major issues have been cleared, do not allocate any extra development time for minor items. At this point it would make sense to flag IE6 users: put a small banner at the top of the page saying that your site no longer fully supports IE6 and ask people to upgrade to IE7, as the site may not function perfectly with older versions of IE.
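One low-cost way to do that flagging is an IE conditional comment, which only IE (version 6 and below, in this sketch) will render; the class name and wording are just examples:

<!--[if lte IE 6]>
  <div class="ie6-notice">
    This site is no longer fully supported in Internet Explorer 6.
    Please upgrade to IE7, as some pages may not display correctly in older versions of IE.
  </div>
<![endif]-->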

Saturday, September 20, 2008

Usability Testing: Does a Warm or Cold Drink Change the Results?

There is an interesting program on BBC Horizon which explores how seemingly irrelevant externalities affect our decision making process. You can see the video here: How to Make Better Decisions

One of the experiments shown on the program indicated that you will feel more positive about a newly met person if you're holding a warm beverage. Conversely, if you're holding a cold beverage, you will feel more negative about that person. (Does the same apply to a website?) This resonated with me, as I just went through a hiring process and have been very dissatisfied with the person selected. If we had been drinking Red Bull instead of coffee, would we have come to a different decision?

Thursday, August 28, 2008

IP Anonymity as regards P2P and Criminal Activity

It appears as if systems are far less robust, with regard to tracking users, than is perceived. A now "ancient" Cambridge study (from 2005), Anonymity and Traceability in Cyberspace by Richard Clayton, concludes that the data stored by ISPs, while useful for business purposes, are less conclusive for criminal proceedings.

One should not be surprised that systems maintained by ISPs to provide traceability to ISP accounts become less precise once one is no longer using them for ISP purposes and starts trying to trace back to actual people.

The author stresses that not only would the traceability evidence have to be accurate enough to stand up in court, but the prosecution would also have to account for numerous blocking and diversionary scenarios put in place by the "real" criminals and not necessarily by the individual being charged with the crime.

If you're at all interested in the subject of internet privacy this paper is a worthwhile read.

Wednesday, August 27, 2008

A Brave New World - Is it Time To Freak Out

The following article was in TechCrunch earlier this month.

Is it officially time to freak out in Apple’s general direction?

Not only does Apple remove applications from the App Store—I Am Rich was taken down yesterday, for example—but it’s now emerged that the company can remotely disable applications from individual iPhones.

Obviously, Apple should be able to protect itself and its users from malware - but should it have gotten rid of "I Am Rich"? From a public relations perspective the answer is "hell yes," and from a general business perspective the answer is "of course." Any business, from a small mom-and-pop to Walmart, can determine what is and is not sold in its store. It may make a mistake in allowing a product in; surely it has the right to correct that mistake.

There may be times when a company bows to public pressure but is that necessarily a bad thing?

Tuesday, July 29, 2008

A Brave New World

One of the beautiful things about increased computer power and bandwidth is the development of new apps and the ease with which one can add new technology for fun, life, and business. One of the bad things is that others (hackers, companies, and governments) can know what you are doing. Your applications can be monitored without your knowledge or permission, with others knowing where you are and what you are doing in real time.

In time, will purchased applications be removed or edited without your permission? Will newspapers, books, and video be edited without your knowledge? The keeper of this knowledge (the company from which you download the information – and the government which has oversight) can alter this data at will. Without real care we can create a 1984-style dystopian society.

I am very positive about the future of technology, but we must take care to keep our privacy and to ensure that the 4th Amendment concept of people being "secure in their persons, houses, papers, and effects, against unreasonable searches and seizures" applies to online data as well. Computer companies (Microsoft, Apple) are now having their devices call "home" to detect piracy and malware. While I completely approve of this, we must keep in mind the potential problems that may come in the not-too-distant future.

Tuesday, July 1, 2008

Should External Links Open into a New Browser?

I’ve been having a debate with a client who wants all external links to open in a new window. The reasons were twofold. First, it was a personal preference - he liked to have each website in its own browser window. (Why?) And second, he didn’t want his site visitors confusing his site with someone else’s. (Huh?)

Here’s the interesting aspect of this conversation – the client is an intelligent, educated man, reasonably technologically adept who has been using computers since the 1980s and has been on the internet for over 10 years. And yet – he thinks this way?

Here’s what Jakob Nielsen has to say:
Opening up new browser windows is like a vacuum cleaner sales person who starts a visit by emptying an ash tray on the customer's carpet. Don't pollute my screen with any more windows, thanks (particularly since current operating systems have miserable window management).

Designers open new browser windows on the theory that it keeps users on their site. But even disregarding the user-hostile message implied in taking over the user's machine, the strategy is self-defeating since it disables the Back button which is the normal way users return to previous sites. Users often don't notice that a new window has opened, especially if they are using a small monitor where the windows are maximized to fill up the screen. So a user who tries to return to the origin will be confused by a grayed out Back button.

Links that don't behave as expected undermine users' understanding of their own system. A link should be a simple hypertext reference that replaces the current page with new content. Users hate unwarranted pop-up windows. When they want the destination to appear in a new page, they can use their browser's "open in new window" command — assuming, of course, that the link is not a piece of code that interferes with the browser’s standard behavior.

The only exception to this would be when linking to .pdfs or .xls files or some sort of non-html file.

Designers and site owners should not think that it is their business to decide when users should get a new window. It's the user's computer, and the user's decision to make. Furthermore, the whole point of hyperlinking is the seamless linking from one point to the next. Opening another window breaks the connection between sites.

Another reason, less important only because opening new windows is already a bad idea, but very important in the coming few years, is the increasing use of cellphones with small screens. Mobile devices don't always support multiple windows, and even when they do, the experience of keeping track of open windows is quite limited.

Thursday, June 12, 2008

Coupon Codes, Should we use them?

I've had a series of discussions with a client today regarding the use of coupon codes. He loves them, saying that they are a major marketing tool. I agreed that they are quite useful in building customer loyalty, but took issue with two of the points raised.

First, I felt that the link from the email should contain the coupon code; that way the customer wouldn’t have to go back to his email and enter the code. There was virtually no extra coding work, so we could ease the customer experience at no cost to the client. The argument raised against this was the same one raised by many retail operators – that retailers count on customers not mailing in rebate coupons, or forgetting to mention the coupon to the cashier. The customers come to the store for the sale and end up purchasing the item at full price.

My second point was that customers who saw the coupon code, and the sale they had missed, would be alienated. It would be better for the coupon code to be invisible to new customers and instead to display notices pointing out that if they signed up they could participate in future sales.

Unfortunately my points were overruled and a new clunky customer code system was developed.
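For reference, the first suggestion amounts to nothing more than carrying the code in the email's link and pre-filling the coupon field from it; the URL, parameter name, and code below are made up for illustration:

<a href="http://www.example.com/checkout?coupon=FALL08">Shop the sale</a>

<input type="text" name="coupon" value="FALL08" />

The first line is the link in the promotional email; the second is the coupon field as the checkout page would render it for that visitor.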

Meta Tags and Title Tags

There is no magic bullet for getting your site ranked higher. It is doubtful that either the keyword or the description tag helps with any major search engine. The tags were abused by spammers to such a point that, as far as page ranking is concerned, they are ignored. There is still some debate as to the value of title tags in page ranking; that tag, like the meta tags, is also being abused.

The value of these tags comes, if at all, from higher clickthrough on the search results pages.

Content Management Systems cannot write useful META and TITLE tags without input from the user. Google and the other search engines start to ignore the META and TITLE descriptions when too many pages display the same information. Your CMS can be modified to help in the creation of these descriptions based upon directory information and other information given to the system. But even the most well-thought out system requires someone to tweak the final result.
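As a sketch of what page-specific tags look like in practice (the site and wording below are invented), compare a boilerplate title repeated across a site with tags written for the page itself:

<title>Acme Widgets - Home</title>
<meta name="description" content="Welcome to our website." />

<title>Red Left-Handed Widgets | Acme Widgets</title>
<meta name="description" content="Hand-finished left-handed widgets in three sizes, shipped within 48 hours." />

The first pair, repeated everywhere, gets ignored; the second offers no ranking magic either, but it makes a far better snippet on the results page.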

Wednesday, June 4, 2008

Facebook and Third-Party Apps

I have my problems with Facebook, but the one thing they do very well, and for me the most appealing aspect of the site, is their incorporation of third-party applications. There are many options to choose from and it is constantly expanding: quizzes, travel maps, feeds and much, much more.

This willingness to add more functionality, especially third-party apps, gives me confidence that they will not go the way of Friendster. Looking at the popularity of Facebook's travel apps, their willingness to accept third-party apps, the fact that Google's PicasaWeb is superior to anything Facebook has yet done, and the way Google is becoming an app superstore, it's not unthinkable that the day will come when Facebook and Google combine forces. Or one will swallow the other.



Thursday, May 29, 2008

Twitter and your business

Whatever you may think of Twitter, it is here to stay. The time is now to start figuring out how to incorporate Twitter into your marketing strategy. If you wait much longer the opportunity will have passed and you will be forced to play catch-up. Twitter's business has doubled in the last six months to over one million users per month and the trend has just started.

Spend a little time on Twitter's site: open an account and get a feel for what exists.

You can search for a word or phrase at Summize.com and Terraminds.com.

You can look for people at Whoshouldifollow.com and Twitdir.com

You can also download programs and follow Twitter from your desktop. Take a look at:
Madtwitter
Twitteroo
Twitterlicious

Wednesday, May 28, 2008

What separates Blogs from other Social Media?

The taxonomy of social media is complex. Social media usually refers to Facebook, LinkedIn, MySpace, Twitter, and other such sites. But in its widest sense social media can be any site that networks people together, including newsgroups, listservs, and sites such as YouTube. YouTube is still primarily a file-sharing site, but with the addition of comments it is now more a social media site than a file-sharing one.

Blogs are also part of online social networks but differ from the other social media sites in two ways. The first is that one person may have several blogs, each focusing on a different topic, whereas it rarely makes sense to have multiple Facebook or LinkedIn accounts. The second is the concept of time. On social media sites such as Facebook and Twitter, it is the NOW that matters. Businesses may care to keep a record for legal purposes, but most users rarely care about what was communicated 24 hours ago (if not 24 minutes ago). With blogs, by contrast, the continuity of posts matters; they are not focused on the here and now.

All strategies used in attracting clients and consumers must focus first and foremost upon these two aspects, especially the concept of time.

Where's Dilbert when you need him?



What not to do.

Sunday, May 18, 2008

Search Engines and Dynamic URLs: Part II

Adding to the previous post, there are two additional potential problems when using dynamic URLs.

Search engines have problems indexing URLs that contain session IDs. I would only pass session IDs in the URL in areas of the site that are not to be indexed -- such as password-protected areas or shopping cart pages.

Session IDs cause problems for the search engine bots. The session variables are different each time the bot lands on the "page," giving the impression that the page has a new URL every time it is visited. This appearance of duplicate content causes numerous problems; simply put, session IDs must not be visible to search engines.

A second problem with dynamic URLs comes with parameter ordering. The coders must be careful to order the parameters the same way each time, or else the search engines will have to juggle which "URL" to use to reach the same content.

All in all, dynamic URLs are fine as long as no session variables are used on indexed pages and the coders are consistent with their parameter ordering.
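To make both problems concrete, here are invented example URLs, first the patterns to avoid and then the one to aim for:

A session ID in the URL, so the bot sees a "new" page on every visit:
http://www.example.com/product.asp?id=42&sessionid=8F3A91C2

The same content reached with parameters in two different orders:
http://www.example.com/product.asp?id=42&color=red
http://www.example.com/product.asp?color=red&id=42

What the bot should see instead, one stable and consistently ordered URL:
http://www.example.com/product.asp?id=42&color=red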

Saturday, May 17, 2008

Search Engines and Dynamic URLs

Too many people still think that search engines have trouble indexing dynamic URLs. For the most part this isn't true. Search engines do still have problems indexing URLs with more than three parameters; with so many combinations, the bot can get stuck on the site and has to abort. This problem will lessen as computing power increases. In general, URLs with one or two parameters pose no problems; they are spidered and indexed just fine.
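As a made-up example, a URL like the first one below is spidered without fuss, while the second is the sort that can trap a bot:

http://www.example.com/store.asp?cat=shoes&page=2
http://www.example.com/store.asp?cat=shoes&page=2&sort=price&dir=asc&view=grid&ref=home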

Monday, May 12, 2008

Malicious Javascript and SQL injection attacks

From The H:

According to an analysis by Websense, the malware tries to exploit a total of eight security holes to pass malicious code to the visitors of the pages unnoticed, for example via the VML hole already patched in January 2007. F-Secure has monitored attackers who tried to break into .asp and .aspx web pages by submitting the page parameters in an encrypted SQL query:

DECLARE%20@S%20NVARCHAR(4000);SET%20@S=CAST(0x4400450043004C0041005200450020004[...]

Once decrypted, this is an SQL query designed to find all the text fields in the database behind the web page and inject JavaScript code into them:

DECLARE @T varchar(255)'@C varchar(255) DECLARE Table_Cursor CURSOR FOR select a.name'b.name from sysobjects a'syscolumns b where a.id=b.id and a.xtype='u' and (b.xtype=99 or b.xtype=35 or b[...]

Both code snippets are only the beginning of the request. Administrators of servers delivering .asp or .aspx pages (like Microsoft's IIS) are advised to check their log files for similar entries and if necessary search their databases for injected links.


Ah, the ingenuity of mankind. It is impressive isn't it?

Tuesday, April 22, 2008

Mosaic's 15th Birthday

I suppose it's official: I'm an old-timer now. I was infatuated with the possibility of the internet when there were still only Gopher sites, and Archie and Veronica were the cool search tools of the day. As for other "pre-Mosaic" techies, this day, 15 years ago, was when the world changed: Mosaic, the first widely used graphical browser, was "officially" released.
I learned HTML that summer, and my life changed.


Friday, April 18, 2008

Launching New Windows: Part II

Almost a year ago I wrote that I thought users would soon be comfortable with sites that launch new browser windows. When a link takes the user away from the current site to completely new information, there are times it makes sense to open a new window. An example would be an intranet where a user is tasked with learning a new product line. It is expected that the user may click numerous times while exploring the new product line. After finishing with the product, how will the user get back? The Back button does not always work, as there may be numerous places from which the user came. Will the user click the Back button 20 or 30 or 40 times? Will we expect this user to go into his history and find where he was? That's not necessary if the other window is still open. Opening new windows is analogous to someone opening yet another book on his desk and flipping through this second or third book while still having the first book open.

Is it possible to come up with another method aside from opening a new window? Yes: we could pass a parameter identifying the "original" page and add it to the navigation. I struggle to see how this is a better method; it would not be. Even expert users would get lost in an ever-shifting navigational schema. Following the "open book" metaphor, opening a new window, made smaller than the standard window so that it stands apart, is a clear and simple means of presenting distinct data to the user.
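To spell out the alternative being dismissed (the parameter name is invented): every link into the product pages would have to carry something like a return parameter, and every link within them would have to pass it along.

http://intranet.example.com/products/widget-x.asp?return=/training/overview.asp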

And yet, a year later, I still see intelligent, highly educated, but unsophisticated computer users getting lost with tabs and failing to use them to organize their browsing. As of now, Jakob Nielsen's warning regarding the use of new browser windows stands.

Friday, March 14, 2008

The Coiner of the Phrase: "80/20 Rule" Died

Joseph Juran, 103, Pioneer in Quality Control, Dies

It's funny how often I've said, or heard, the phrase "80/20," and only now do I realize that the term "Pareto Principle" was coined and popularized by Joseph M. Juran. Pareto, I've just learned, was a classical liberal economist of the 19th century who observed that twenty percent of the population owned eighty percent of the property, and that this truism held across time and culture.

  • 80% of a company's profits come from 20% of its customers
  • 80% of its complaints come from 20% of its customers
  • 80% of crimes are committed by 20% of the criminals

and

  • 80% of users use only 20% of the features
  • 20% of software bugs create 80% of the errors

The 80/20 rule may or may not be another "Golden Ratio" myth, but it has certainly entered the lexicon.

Monday, March 10, 2008

The Privacy Paradox

Forbes has an interesting article regarding privacy.

"Consumers express a lot of concern about their privacy online in surveys. At the same time, very few engage in privacy-protecting activities," says Leslie Harris, executive director of the privacy advocacy group Center for Democracy and Technology. "There's a real inconsistency."

...

So, do users value the convenience of having a search bar beside their e-mail more than protecting their sensitive information? "There's a growing market pressure to do right by privacy," insists Harris. "But do I think the market alone will fix this? No. At the end of the day, we still need legal protection." Part of the solution, she says, may be a national privacy law.
The Privacy Paradox

I agree - part of the solution may be privacy laws. Facebook ought to have no more say over your data than does the journal in which you jot down notes.

Thursday, February 28, 2008

Another Way to Save Articles

I just found out about a new web application, Instapaper, that seems wonderful. Marco Arment, the lead developer, describes the project in the following way:

You come across substantial news or blog articles that you want to read, but don’t have time at the moment.

You need something to read while sitting on a bus, waiting in a line, or bored in front of a computer.
Tech Crunch

This is perfect for someone like me. My bookmarks are completely out of hand; I often save articles I want to read, and when I most want to read them I am away from my computer.

My only fear is quickly bumping up against storage limits (both in space and time span). Here's hoping this site takes off.

Tuesday, February 26, 2008

Which Meta Tags Will Get My Site Rated Higher?

Fortunately or unfortunately, there are no META tags that will help your site rank higher. Anybody who says so is trying to sell you something.

Several years ago there were two META tags whose express purpose was to help you describe your site.

<meta name="description" content="Place Description Here" />
<meta name="keywords" content="Place Keywords Here" />

Developers and SEOs (Search Engine Optimizers) quickly started to take advantage of the tags. One of the more egregious abuses was putting highly ranked search terms in the tags even though those terms had nothing whatsoever to do with the site in question.

The logic behind this was that people would find and see the site even though it had nothing to do with what they were searching for. This may or may not have been effective for the sites that were abusing the system, but it annoyed the people using the search engines. It also annoyed the decision makers at Google and the other search engines. The result was that these two meta tags became deprecated: search engines no longer valued the information in them.

Once again, anybody who says there are meta tags that will help your site be rated highly is -- at best -- many years out of date.

Is there anything you can do to get your site ranked higher? Yes, there are many things that can be done, but none of them involve META tags.

Wednesday, February 6, 2008

If You are Into Geeky Humor:

This guy has some great comics: xkcd. It's worth going through the archives.