Wednesday, December 19, 2007

SERP: user selection of results

Q: How will the users choose among the search results?

Jakob Nielsen: From the user perspective … the top one [result] is going to be the one that … they … will judge to be the best and that’s what people will tend to click first, and then the second one and so on. That behavior will stay the same, and the appearance will be the same, but the sorting might be different. That I think is actually very likely to happen.

Interview with Jakob Nielsen: Future of the SERP

A shame, but true. Very few people go through the search engine results scanning for other titles. Look at users browsing through music stores or book stores: they are quite willing to keep looking past the first few titles. There is not the same engagement in web searching. What is the reason for that? Is it the tactile sensation of picking up a CD or book? And how is that going to change when all music is downloaded from the web and books go the way of vinyl?

How do we bring that tactile response, that love of accumulating and touching, to the web screen? I wonder how much using a mouse instead of a touch screen changes things. And to what extent are we impeded by low-resolution screens? Higher-resolution screens would allow us to put a lot more secondary information on a screen, perhaps enticing users to keep searching – because the search would be much more interesting than simply reading a few characters of text.

SERP: more relevant display

Q: What changes will there be in search results pages over the next 3 years?

Jakob Nielsen: The big thing that has happened in the last 10 years was a change from an information retrieval oriented relevance ranking to being more of a popularity relevance ranking. And I think we can see a change maybe being a more of a usefulness relevance ranking. I think there is a tendency now for a lot of not very useful results to be dredged up that happen to be very popular, like Wikipedia and various blogs. They’re not going to be very useful or substantial to people who are trying to solve problems. So I think that with counting links and all of that, there may be a change and we may go into a more behavioral judgment as to which sites actually solve people’s problems, and they will tend to be more highly ranked.

Interview with Jakob Nielsen: Future of the SERP

This is one place where search engines need to make drastic changes. I don't see any indication of things happening quickly, but this is one of the current technology's weak spots.

As mentioned in the article, columnar presentation of search results may improve the situation. Users will continue to scan results, with the majority of users giving higher relevance to top-ranked returns, BUT if paid results and content-aggregating sites such as Wikipedia can be sorted out from other returns, it would be a useful incremental change.

Will Google make such a change? Only if they see an ROI. Until a competitor forces them to do so, I doubt we will see much of a change in the short run.

Monday, December 3, 2007

Inspirational Sites

There are two sites I have to share this holiday season. One is a fashion blog: The Sartorialist. Its author worked in the fashion industry for 15 years and now walks through city streets (NYC, Paris, Rome) taking pictures of ordinary people with extraordinary style. Even if you're not into fashion this site will pique your interest.

Another wonderful site is The Cool Hunter, which runs the gamut from industrial design to architecture and more.

Both sites are great when the right side of your brain needs a pick-me-up.

Wednesday, November 28, 2007

You think you have it bad?

Having a bad day at the office? Think you have a chatterbox next to you? Take a look at this:

Sunday, November 18, 2007

Disabling the Back Button

For a long time the quest to prevent use of the back button was seen as one of the more reprehensible actions taken by web developers, second only to the onunload pop-up. I was one of many people who argued that interfering with basic browser functions was 'evil.'

Nonetheless I, like many other usability professionals, have changed my mind. There are legitimate reasons for wanting to prevent the user from using the back button, and for the most part there are alternative methods of solving the problem. An example is a banking page. After one logs in, transfers funds, then logs out, no one should be able to hit the back button a few times and see what you were doing. Thankfully one can solve this problem without interfering with the back button – though one does interfere with the user’s history.

However, there are many online applications where users fill out form information, and should the user exit the application – usually because they've been interrupted – they would lose all the information already entered into the forms.

Can one prevent the user from leaving the current page? No, and this is a good thing; otherwise you might be trapped on a web page without an option of leaving. Nonetheless one can force the browser to display a warning message and give the user a chance to change their mind – preventing users from accidentally losing their work.

What’s important is conveying to the user the importance of not leaving and making certain that the directions are clear.
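For those who want to see the mechanics, here is a minimal sketch of that warning using the standard beforeunload event; the "dirty" flag and the form selector are illustrative, not from any particular framework:

    // Warn the user before they leave a page with unsaved form data.
    // "dirty" flips to true once the user starts typing.
    var dirty = false;

    document.addEventListener("input", function () {
      dirty = true;
    });

    window.addEventListener("beforeunload", function (event) {
      if (!dirty) return;       // nothing entered yet; let them leave quietly
      event.preventDefault();   // asks the browser to show its generic "leave site?" prompt
      event.returnValue = "";   // older browsers require returnValue to be set
    });

    // Clear the flag on a successful submit so the warning doesn't fire needlessly:
    // document.querySelector("form").addEventListener("submit", function () { dirty = false; });

Note that modern browsers display their own generic prompt and ignore any custom message text, which is exactly the point: the user gets a chance to stay, but the page cannot trap them.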

Wednesday, September 19, 2007

Links: How clearly laid out need they be?

To what extent should links be declared and clearly visible to the end user? Certainly links need to be differentiated within text, but do links have to be clearly laid out everywhere? Can we assume the user will suspect that links exist even when they are not explicitly indicated?

Most sites today, even sites made for the general reader, have numerous linking methods. Is this a sign that users have passed the Jakob Nielsen threshold? Many have, but I’ve seen many users who never click on links. They look at the page and never even think of passing their cursor over text to see if a graphic or a piece of text is a link.

We assume that EVERYONE knows that an undifferentiated column of text would be understood as a series of links. Not so. Clearly younger users seem to have an “intuitive” grasp of the possibility of linkage. But it’s not intuition; it’s curiosity, and they lack the older users’ fear of “doing something wrong.”

Sunday, September 16, 2007

Yelp - What Are They Going to Do?

There's been a lot of disturbing news about Yelp recently. Yelp, if you're not familiar with it, is a website which allows people to review stores, restaurants and other companies. The problem is that some store owners post bogus positive reviews and, so some allege, post negative reviews about their competitors.

If you’re interested in game theory, the Yelp example is bound to be fascinating as this scenario plays out. Adding to the problem is the allegation that Yelp removes bad reviews should a store advertise on Yelp – or so some say. As someone who is affiliated with stores on Yelp, I have to say that I haven’t experienced that.

This poses an interesting IA problem: how should Yelp, and by extension other review-oriented sites, deal with this? You can never stop people from creating multiple accounts, but you can give added weight to community participation and to accounts which show more and more evidence of legitimacy.

There are businesses I’ve reviewed that have dozens of reviews, each of them glowing, and each from an account with only one review. This makes the reviews more than a little suspicious. One way to stop the abuse is for sites to display no more than a few reviews from single-review accounts. The second way was alluded to earlier – give added weight to accounts which are considered more legitimate, as measured by the number of reviews, friendings of other users, and general community involvement (at its simplest, by noting the number of times the account logs in, page views, and time spent in the community).
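A back-of-the-envelope sketch of that weighting idea; every signal and coefficient here is invented for illustration, not anything Yelp actually does:

    // Hypothetical legitimacy score for a reviewer account.
    function legitimacyWeight(account) {
      var score = 0;
      score += Math.min(account.reviewCount, 20);    // more reviews, more trust (capped)
      score += Math.min(account.friendCount, 10);    // connections within the community
      score += Math.min(account.loginDays, 30) / 3;  // sustained participation over time
      return score;
    }

    // A review could then be ranked (or discounted) by its author's score;
    // a single-review, never-seen-again account would carry almost no weight.
    legitimacyWeight({ reviewCount: 1, friendCount: 0, loginDays: 1 });     // ~1.3
    legitimacyWeight({ reviewCount: 45, friendCount: 12, loginDays: 200 }); // 40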

Monday, July 30, 2007

SEO Tutorial: Your First Step

I'm often asked about SEO tips and tricks – namely, how does one begin? Does one need to know HTML? Does one need to have a marketing background? What does one need to know to promote one's website?

Ultimately it helps to know some HTML and to have some background in marketing, but it is most important to understand what Google – and the other search engine companies – are trying to do and how they are doing it. To begin with, let's look at the ideal scenario. Milliseconds after a webpage is uploaded, the search engine finds and evaluates the content of the page and correctly displays the page in order of relevance to the user searching for the information.

Currently this "ideal" is only partially met, and mostly with heavily indexed sites such as CNN and other news sites, where new files are indexed and evaluated remarkably quickly. And yet two main points need to be kept in mind:

1. Speed and quality of result are at odds;
2. The relevance of the result is not / cannot be perfectly graded for every person and every query.

Third, search engines are still not good at determining originality (though they shortly will be). By originality I don't mean it in a creative-writing sense, but in the sense of the search engine "knowing" the originator of the content. This is gaining in importance as a result of increased site scraping.

Once you know what an SE is trying to do, you can start thinking about what you can do to place higher. Knowing that news sites are crawled multiple times a day suggests that you need to consistently add more files to your site. Knowing that SEs are quite fallible in determining the relevancy of your pages means that you must do your best to aid them through the use of keywords, titles, URLs, and many other big and little things such as getting inbound links and the proper use of heading tags.
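To make the on-page half of that concrete, here is what those basics look like in markup; the store, keyword, and URL are all made up:

    <!-- Hypothetical page at http://www.example.com/handmade-leather-wallets/ -->
    <!-- The key phrase appears in the URL, title, description and headings. -->
    <html>
    <head>
      <title>Handmade Leather Wallets | Example Store</title>
      <meta name="description" content="Hand-stitched leather wallets, made to order." />
    </head>
    <body>
      <h1>Handmade Leather Wallets</h1>
      <h2>Why hand stitching outlasts machine stitching</h2>
      <p>Body copy that uses the phrase naturally, not stuffed...</p>
      <!-- Inbound links pointing here with descriptive anchor text help as well. -->
    </body>
    </html>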

These tips and tricks can be easily picked up over time but nothing counts as much as consistently adding valuable information to your site and being frequently indexed.


Thursday, June 28, 2007

What is XHTML?

There are times, when discussing HTML, that XHTML enters the discussion. A lot of clients have expressed confused ideas about what XHTML actually is.

There are two reasons for the development of XHTML. The first is to clean up, and remove, vestigial code. As HTML developed, a lot of standards and code snippets were proposed; some never got off the launch pad, some were developed and died on the vine, and others remain in existence today. XHTML is a new standard that will better organize the standards and remove unwanted code.

The second reason, and the one that excites developers, is the ability of each developer and each company to extend the language as needed. In effect XHTML promises two seemingly contradictory goals: first, that it would remove the dead weight of past mistakes, and second, that it would allow each developer and each organization to make its own code – and hence their own mistakes, their own dead weight of no-longer-needed code. Of course, that also means that each developer and each organization can keep their code libraries in order and up to date.
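For the curious, the surface differences are modest. A minimal, valid XHTML 1.0 document looks like this – lowercase tags, quoted attributes, and every element closed:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
      "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head><title>A Minimal XHTML Page</title></head>
      <body>
        <p>Unlike loose HTML, empty elements must self-close.<br />
        <img src="photo.jpg" alt="and the alt attribute is required" /></p>
      </body>
    </html>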

What does it mean to you, the business user? Nothing at all. If XHTML is adopted as a standard, your organization’s existing code will still work. It would mean that the next time you do a site redesign the developers would start incorporating XHTML elements. This is a coding issue and should not concern the business user any more than any other coding issue. It will be years before XHTML becomes the standard, and more years after that before your site becomes "unusable." For all we know XHTML will never be adopted.

Monday, June 18, 2007

Launching New Windows

I’ve not been as opposed to opening new windows as Jakob Nielsen. If you have a site that is populated overwhelmingly by expert users then there are many very good reasons to open new browser windows and little-to-no downside.

To the extent that Nielsen is talking about ads or gratuitous use of “new technology” for its own sake, I agree with him. There are also some sites which open new windows for every link – these are not professionally developed sites, and for that reason I don’t consider them part of this discussion.

To the extent that you have a site that is populated with average users, I also agree with him. One of Nielsen’s missions is to constantly remind computer professionals – developers and designers, as well as usability professionals – that the average user gets lost a lot more often than we do AND gets confused, bewildered and frustrated when lost. And that’s important: angry, frustrated customers don’t come back; confused and bewildered staff waste time and are resentful that they’re forced to figure something out.

Still, the mantra that one should NEVER create new windows is, I predict, soon going to be obsolete. Now that IE 7 has come out with tabbed browsing, tabs will soon become part of virtually every web user’s repertoire.

The questions that will then need to be answered over the next few years are: how many people take advantage of tabs? And does the use of tabs make opening new windows more acceptable, as predicted above, or does the average user continue to be thrown by new browser windows?
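For reference, the mechanic being debated is a single attribute on the link:

    <!-- Opens in a new window - or, in a tabbed browser such as IE 7, typically a new tab -->
    <a href="http://www.example.com/report.pdf" target="_blank">Annual report (PDF)</a>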

Saturday, March 31, 2007

Privacy, Social Media, Censorship and more

I just read several papers presented last June at the 6th Workshop on Privacy Enhancing Technologies in Cambridge, England. They were fascinating reads and are highly recommended.

Imagined Communities: Awareness, Information Sharing, and Privacy on the Facebook focuses upon the quantity of data that is unwittingly shared on Facebook and other social networking sites.

Some [users] manage their privacy concerns by trusting their ability to control the information they provide and the external access to it. However, we find significant misconceptions among some members about the online community’s reach and the visibility of their profiles.

Ignoring the Great Firewall of China examines how governments, not only China's, prevent access to websites. Sometimes there is a bona fide law-enforcement aspect to it, such as blocking access to child pornography or aiding counter-terrorism. Regardless of the legitimacy of a particular action, anyone interested in privacy issues ought to be aware of the activity.

The so-called "Great Firewall of China" operates, in part, by inspecting TCP packets for keywords that are to be blocked. If the keyword is present, TCP reset packets (viz: with the RST tag set) are sent to both endpoints of the connection, which then close. However, because the original packets are passed through the Firewall unscathed, if the endpoints completely ignore the Firewall's resets, then the connection will proceed unhindered. Once one connection has been blocked, the Firewall makes further easy-to-evade attempts to block further connections from the same machine. This latter behaviour can be leveraged into a denial-of-service attack on third-party machines.

Friday, March 16, 2007

Trackbacks, Blogger and Spam

I'm a big fan of trackbacks, at least in theory. A trackback, for those who may not be familiar with them, is a widget which lets one reference a blog or webpage and automatically lets the owner of the other website know that you have referenced their page.

Trackbacks, as with links, help readers find blogs and websites of like mind and interests. Trackbacks help authors know which of their posts have generated the most interest.
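Under the hood a trackback is nothing more than an HTTP POST to a ping URL advertised by the receiving blog. Here is a sketch of the ping in the common Movable Type-style protocol; the URLs and titles are made up, and I've used modern JavaScript for brevity:

    // Announce to another blog that we have referenced one of its posts.
    var params = new URLSearchParams({
      title: "My post title",
      excerpt: "The first few sentences of my post...",
      url: "http://myblog.example.com/2007/03/my-post.html",
      blog_name: "My Blog"
    });

    fetch("http://otherblog.example.com/trackback/1234", {  // the advertised ping URL
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: params.toString()
    })
      .then(function (response) { return response.text(); })
      .then(function (xml) {
        // The receiver answers with a tiny XML document; <error>0</error> means success.
        console.log(xml);
      });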

My one fear is that trackbacks will soon be under attack from spammers and that this wonderful widget will be rendered useless.

Wednesday, March 7, 2007

How to Create a Safe Password

It's easy to come up with hack-resistant passwords that are easy to remember. The one thing you must keep in mind is that companies sometimes limit the characters you can use in a password. This is very silly on their part, as the longer the password and the more options you have (upper and lower case, numbers, special characters), the more secure your password is.

Too often I've come across sites which limit my password to 8 or fewer characters, which don't distinguish between upper and lower case, and which don't allow special characters. I recommend coming up with a simple alternative password algorithm for these sites.

The following is an example for an 8-character password. I would recommend longer passwords – at least 12 characters for those sites you are most concerned about.

1. Select a phrase — "It was the best of times"
2. Take the first letter of each word or number — IWTBOT
3. Change some letters to numbers — 1WTBOT
4. Add special characters — 1WTB@T

Now make each site's password unique. You can customize it by adding the first two letters of the site in lower case:

5. Customize by adding a prefix or suffix for each site you register with. For example, your Blogger password would become 1WTB@Tbl. It makes remembering very simple: "It was the best of times, blogger."
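The whole recipe is small enough to write down. A sketch, using exactly the substitutions from the example above (I becomes 1, O becomes @):

    // Derive a site-specific password from a memorable phrase.
    function sitePassword(phrase, siteName) {
      var initials = phrase
        .split(" ")
        .map(function (word) { return word.charAt(0).toUpperCase(); })
        .join("");                               // "It was the best of times" -> "IWTBOT"
      var substituted = initials
        .replace(/I/g, "1")                      // -> "1WTBOT"
        .replace(/O/g, "@");                     // -> "1WTB@T"
      return substituted + siteName.slice(0, 2).toLowerCase();  // add the site suffix
    }

    sitePassword("It was the best of times", "Blogger");  // "1WTB@Tbl"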

Friday, February 16, 2007

Why Smart Executives Fail

I just finished Why Smart Executives Fail by Sydney Finkelstein. I think it is as important a read as The E-Myth. I loved the section on Webvan. (I remember reading about it as it was gearing up and thinking it was a great idea.) Webvan was to take on the supermarkets and drive those dinosaurs out of business. Customers would order their items online and have them delivered the same or next day, and instead of a 4-to-6-hour delivery window there would be a 30-minute window. Billions were spent; Bechtel was hired to build distribution centers; competent, experienced directors were hired; and yet Webvan was out of business barely 2 years after opening shop.

Finkelstein, in this example, as in many others, showed the difference between "great expectations" and outcome.

"Even today, writing about Webvan, it's easy to get excited by the vision. But the business model was fatally flawed from the start.

The supermarket business is notoriously low-margin to start with, so where was the money goint to come from. ... Throw in free home delivery, and you've got to become the Superman of productivity to make a profit. ... When you add up the costs of building the Webvan infrastructure - easily $1 billion plus -- it's hard to see how the numbers can add up. Now imagine that there are literally tens of thousands of competitors - call them supermarkets - that can easily add home delivery, that do not need to spend millions (let alone billions) to do so, that already have customers and market presence, that offer products of essentailly equal quality to your own, and are - surprise! - not really dinosaurs after all, and you have an idea that is getting less good with each passing minute.

Monday, January 22, 2007

Information Architecture Models Information - Not Relationships

The problem is that IA models information, not relationships. Many of the artifacts that IAs create: site maps, navigation systems, taxonomies, are information models built on the assumption that a single way to organize things can suit all users…one IA to rule them all, so to speak.

Thoughts on the Impending Death of Information Architecture

I could not agree more with the first sentence. Not enough thought is given to the user's wants, and we understand why this is so. We, the Information Architects, are surrounded by the business users; they are the ones who provide us with work and the ones who provide the goals, and we, for better or worse, work off their understanding of what the customer needs.

Too often the business users have distinct wants that clash with IA goals. Ultimately the business users, who are writing the checks, make the final decision. An example would be users who want to go to your site for a quick piece of information and then leave. The business users, on the other hand, want to make their site "sticky," or they want more page views, so an unnecessary landing page is put in. These impediments help the business user meet his goals but often hinder the user.

I have to disagree sharply with the claim that "information models [are] built on the assumption that a single way to organize things can suit all users." No Information Architect that I have ever met thinks this way.

The problem stems not from IA in the abstract, but from the fact that IA is part-and-parcel of the business world. It's not that business is opposed to IA goals, only that IA is part of a whole, and therefore there are times, for good reasons or not, when other rationales trump usability.

Sunday, January 7, 2007

When Do You Use a Table versus Divs?

When do you use a table versus divs? Some developers hate tables so much that they waste time and effort creating divs when tables would do. It's simple – if you need to match up cells in a row across the columns, that is, if the content is genuinely tabular, then you need a table. If not, then divs will probably do.
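A quick illustration of that rule of thumb:

    <!-- Tabular data: cells must line up across rows and columns, so use a table. -->
    <table>
      <tr><th>Quarter</th><th>Revenue</th></tr>
      <tr><td>Q1</td><td>$10,000</td></tr>
      <tr><td>Q2</td><td>$12,500</td></tr>
    </table>

    <!-- Page layout: nothing needs to align cell-for-cell, so divs styled with CSS will do. -->
    <div id="sidebar">...navigation...</div>
    <div id="content">...article text...</div>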

Friday, January 5, 2007

Browser - Backward Compatibility

In designing a site, which browsers should one design for? At what point does a website owner stop sinking funds into developing and testing for out-of-date browsers? I don't have a clear-cut answer. Is it solely market share? No. A browser may have a tiny market share but be web-standards compliant, which means it takes relatively little effort (read: money) to make certain that nothing is broken and the site renders well.

It used to be that browsers ignored standards, trying instead to create proprietary ones. Those days are gone forever, but the browsers remain and some, such as IE6, are still going strong. This allows us to rephrase the question: at what point should site owners and developers stop validating against non-standards-compliant browsers?

It is simply a matter of math. How much are you willing to spend validating a browser for 0.1%, 1%, or 10% of your market? I would say that once a browser has declined to under 10% of your site usage, it makes sense to ignore minor visual inconsistencies and focus resources only on issues that prevent visitors from using your site.
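Put as code, the policy I'm suggesting is just a threshold check; the 10% cutoff comes from the paragraph above, and the rest is illustrative:

    // Decide how much effort a browser deserves, given its share of YOUR traffic.
    function supportLevel(sharePercent) {
      if (sharePercent >= 10) return "full";  // polish visual details too
      return "functional-only";               // fix only what blocks visitors
    }

    supportLevel(25);  // "full" - worth chasing pixel-level inconsistencies
    supportLevel(4);   // "functional-only" - ignore cosmetic glitches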

EDIT 9/7/2010: